Statistical Learning by Stochastic Gradient Descent
Yunwen Lei (University of Kaiserslautern)
18-Aug-2020, 12:00-13:00
Abstract: Stochastic gradient descent (SGD) has become the workhorse behind many machine learning problems. Optimization and estimation errors are two competing factors responsible for the prediction behavior of SGD. In this talk, we report our generalization analysis of SGD, considering the optimization and estimation errors simultaneously. We remove some restrictive assumptions in the literature and significantly improve the existing generalization bounds. Our results help to understand how to stop SGD early to get the best generalization performance.
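The abstract's closing point, that stopping SGD early can trade optimization error against estimation error for better generalization, can be illustrated with a minimal sketch. The setup below (a synthetic least-squares problem, a held-out validation set, a 1/sqrt(t) step size, and a patience counter) is an illustrative assumption, not the specific analysis presented in the talk.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic linear regression: y = X @ w_true + noise (illustrative data).
n, d = 200, 5
w_true = rng.normal(size=d)
X = rng.normal(size=(n, d))
y = X @ w_true + 0.5 * rng.normal(size=n)

# Hold out a validation set to monitor generalization.
X_tr, y_tr = X[:150], y[:150]
X_val, y_val = X[150:], y[150:]

def val_loss(w):
    """Mean squared error on the held-out validation set."""
    return np.mean((X_val @ w - y_val) ** 2)

w = np.zeros(d)
best_w, best_loss = w.copy(), val_loss(w)
patience, bad_steps = 200, 0  # stop after 200 steps with no improvement

for t in range(1, 5001):
    i = rng.integers(len(y_tr))                 # sample one training example
    grad = (X_tr[i] @ w - y_tr[i]) * X_tr[i]    # stochastic gradient of squared loss
    w -= (1.0 / np.sqrt(t)) * grad              # decaying step size
    loss = val_loss(w)
    if loss < best_loss:
        best_w, best_loss, bad_steps = w.copy(), loss, 0
    else:
        bad_steps += 1
    if bad_steps >= patience:                   # early stopping
        break

print(best_loss)
```

Running longer keeps shrinking the optimization error, but once the validation loss stops improving, further steps mainly fit noise; the patience rule returns the iterate with the best observed validation loss.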
Computer science, Mathematics, Physics
Audience: researchers in the topic
Data Science and Computational Statistics Seminar
Organizers: Hong Duong* (contact for this listing), Jinming Duan, Jinglai Li, Xiaocheng Shang
