BEGIN:VCALENDAR
VERSION:2.0
PRODID:researchseminars.org
CALSCALE:GREGORIAN
X-WR-CALNAME:researchseminars.org
BEGIN:VEVENT
SUMMARY:Yunwen Lei (University of Kaiserslautern)
DTSTART:20200818T120000Z
DTEND:20200818T130000Z
DTSTAMP:20260423T034448Z
UID:DSCSS/7
DESCRIPTION:Title: <a href="https://researchseminars.org/talk/DSCSS/7/">St
 atistical Learning by Stochastic Gradient Descent</a>\nby Yunwen Lei (Univ
 ersity of Kaiserslautern) as part of Data Science and Computational Statis
 tics Seminar\n\n\nAbstract\nStochastic gradient descent (SGD) has become t
 he workhorse behind many machine learning problems. Optimization and estim
 ation errors are two contradictory factors responsible for the prediction 
 behavior of SGD. In this talk\, we report our generalization analysis of 
 SGD by considering simultaneously the optimization and estimation errors. 
 We remove some restrictive assumptions in the literature and significantly
  improve the existing generalization bounds. Our results help to understan
 d how to stop SGD early to get the best generalization performance.\n
LOCATION:https://researchseminars.org/talk/DSCSS/7/
END:VEVENT
END:VCALENDAR
