BEGIN:VCALENDAR
VERSION:2.0
PRODID:researchseminars.org
CALSCALE:GREGORIAN
X-WR-CALNAME:researchseminars.org
BEGIN:VEVENT
SUMMARY:Petar Veličković (DeepMind and University of Cambridge)
DTSTART:20220929T160000Z
DTEND:20220929T170000Z
DTSTAMP:20260423T003242Z
UID:MPML/78
DESCRIPTION:Title: <a href="https://researchseminars.org/talk/MPML/78/">Ge
 ometric Deep Learning: Grids\, Graphs\, Groups\, Geodesics and Gauges</a>\
 nby Petar Veličković (DeepMind and University of Cambridge) as part of M
 athematics\, Physics and Machine Learning (IST\, Lisbon)\n\n\nAbstract\nTh
 e last decade has witnessed an experimental revolution in data science and
  machine learning\, epitomised by deep learning methods. Indeed\, many hig
 h-dimensional learning tasks previously thought to be beyond reach – such
  as computer vision\, playing Go\, or protein folding – are in fact feas
 ible with appropriate computational scale. Remarkably\, the essence of dee
 p learning is built from two simple algorithmic principles: first\, the no
 tion of representation or feature learning\, whereby adapted\, often hiera
 rchical\, features capture the appropriate notion of regularity for each t
 ask\, and second\, learning by local gradient-descent type methods\, typic
 ally implemented as backpropagation.\n\nWhile learning generic functions i
 n high dimensions is a cursed estimation problem\, most tasks of interest 
 are not generic\, and come with essential pre-defined regularities arising
  from the underlying low-dimensionality and structure of the physical worl
 d. This talk is concerned with exposing these regularities through unified
  geometric principles that can be applied throughout a wide spectrum of ap
 plications.\n\nSuch a 'geometric unification' endeavour in the spirit of F
 elix Klein's Erlangen Program serves a dual purpose: on one hand\, it prov
 ides a common mathematical framework to study the most successful neural n
 etwork architectures\, such as CNNs\, RNNs\, GNNs\, and Transformers. On t
 he other hand\, it gives a constructive procedure to incorporate prior phy
 sical knowledge into neural architectures and provides a principled way t
 o build future architectures yet to be invented.\n
LOCATION:https://researchseminars.org/talk/MPML/78/
END:VEVENT
END:VCALENDAR
