BEGIN:VCALENDAR
VERSION:2.0
PRODID:researchseminars.org
CALSCALE:GREGORIAN
X-WR-CALNAME:researchseminars.org
BEGIN:VEVENT
SUMMARY:Francis Bach (Inria\, FR)
DTSTART:20200518T130000Z
DTEND:20200518T134500Z
DTSTAMP:20260423T024538Z
UID:OWMADS/9
DESCRIPTION:Title: On the convergence of gradient descent for wide two-l
 ayer neural networks\nhttps://researchseminars.org/talk/OWMADS/9/\nby Fr
 ancis Bach (Inria\, FR) as part of One World seminar: Mathematica
 l Methods for Arbitrary Data Sources (MADS)\n\nAbstract\nMany supervised
  learning methods are naturally cast as optimization problems. For predict
 ion models which are linear in their parameters\, this often leads to conv
 ex problems for which many guarantees exist. Models which are non-linear
  in their parameters\, such as neural networks\, lead to non-convex
  optimization
  problems for which guarantees are harder to obtain. In this talk\, I will
  consider two-layer neural networks with homogeneous activation functions 
 where the number of hidden neurons tends to infinity\, and show how qualit
 ative convergence guarantees may be derived. I will also highlight open pr
 oblems related to the quantitative behavior of gradient descent for such m
 odels. (Based on joint work with Lénaïc Chizat\, https://arxiv.org/abs/1
 805.09545\, https://arxiv.org/abs/2002.04486)\n\nPlease note that this is 
 a joint talk with the One World Optimization Seminar.\n
LOCATION:https://researchseminars.org/talk/OWMADS/9/
END:VEVENT
END:VCALENDAR
