BEGIN:VCALENDAR
VERSION:2.0
PRODID:researchseminars.org
CALSCALE:GREGORIAN
X-WR-CALNAME:researchseminars.org
BEGIN:VEVENT
SUMMARY:Michael Unser (École polytechnique fédérale de Lausanne\, CH)
DTSTART:20200608T120000Z
DTEND:20200608T124500Z
DTSTAMP:20260423T005745Z
UID:OWMADS/6
DESCRIPTION:Title: <a href="https://researchseminars.org/talk/OWMADS/6/">R
 epresenter theorems for machine learning and inverse problems</a>\nby Mich
 ael Unser (École polytechnique fédérale de Lausanne\, CH) as part of On
 e World seminar: Mathematical Methods for Arbitrary Data Sources (MADS)\n\
 n\nAbstract\nRegularization addresses the ill-posedness of the training pr
 oblem in machine learning or the reconstruction of a signal from a limited
  number of measurements. The standard strategy consists in augmenting the 
 original cost functional by an energy that penalizes solutions with undesi
 rable behaviour. In this presentation\, I will present a general represent
 er theorem that characterizes the solutions of a remarkably broad class of
  optimization problems in Banach spaces and helps us understand the effect
  of regularization. I will then use the theorem to retrieve some classical
 characterizations such as the celebrated representer theorem of machine l
 earning for RKHS\, Tikhonov regularization\, representer theorems for spars
 ity promoting functionals\, as well as a few new ones\, including a result
  for deep neural networks.\n
LOCATION:https://researchseminars.org/talk/OWMADS/6/
END:VEVENT
END:VCALENDAR
