BEGIN:VCALENDAR
VERSION:2.0
PRODID:researchseminars.org
CALSCALE:GREGORIAN
X-WR-CALNAME:researchseminars.org
BEGIN:VEVENT
SUMMARY:Stephan Wojtowytsch (Princeton University)
DTSTART:20210223T180000Z
DTEND:20210223T190000Z
DTSTAMP:20260423T035026Z
UID:OSGA/49
DESCRIPTION:Title: <a href="https://researchseminars.org/talk/OSGA/49/">Op
 timal transport for non-convex optimization in machine learning</a>\nby St
 ephan Wojtowytsch (Princeton University) as part of Online Seminar "Geomet
 ric Analysis"\n\n\nAbstract\nFunction approximation is a classical task in
  both numerical analysis and machine learning. Elements of the r
 ecently popular class of neural networks depend nonlinearly on a finite se
 t of parameters. This nonlinearity gives the function class immense approx
 imation power\, but causes parameter optimization problems to be non-conve
 x. In fact\, generically the set of global minimizers is a (curved) manifo
 ld of positive dimension. Despite this non-convexity\, gradient descent ba
 sed algorithms empirically find good minimizers in many applications. We d
 iscuss this surprising success of simple optimization algorithms from the 
 perspective of Wasserstein gradient flows in the case of shallow neural ne
 tworks in the infinite parameter limit.\n
LOCATION:https://researchseminars.org/talk/OSGA/49/
END:VEVENT
END:VCALENDAR
