Asymptotic error analysis of stochastic optimization schemes
Charles-Edouard Bréhier (Université de Pau et des Pays de l'Adour)
Abstract: Stochastic optimization algorithms are nowadays widely used, especially in the machine learning community. In this talk, we study a class of stochastic optimization schemes which are perturbations of gradient descent algorithms. We perform a rigorous convergence analysis, proving error bounds with respect to the time-step size, in the large-time regime, for strongly convex objective functions. The error bounds follow from an interpretation of the schemes in terms of deterministic and stochastic modified equations, and from tools of weak error analysis for numerical methods for stochastic differential equations.
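To illustrate the class of schemes described in the abstract, here is a minimal sketch, assuming a simple stochastic-gradient-type iteration viewed as a perturbation of gradient descent on a strongly convex quadratic objective; this is a hypothetical example, not the speaker's exact scheme.

```python
import numpy as np

rng = np.random.default_rng(0)

def grad(x):
    # Gradient of the strongly convex objective f(x) = 0.5 * ||x||^2.
    return x

h = 0.1          # time-step size (the quantity the error bounds are stated in)
sigma = 0.01     # scale of the stochastic perturbation (assumed for illustration)
x = np.ones(2)   # initial point; the minimizer is x* = 0

for n in range(1000):
    # Gradient descent step plus a zero-mean stochastic perturbation.
    x = x - h * grad(x) + sigma * np.sqrt(h) * rng.standard_normal(2)

# In the large-time regime the iterates fluctuate near the minimizer,
# with an error controlled by the time-step size h.
print(np.linalg.norm(x))
```

The `sqrt(h)` noise scaling is what makes the iteration resemble a discretization of a stochastic differential equation, which is the viewpoint behind the modified-equation analysis mentioned in the abstract.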
Keywords: numerical analysis, optimization and control, probability
Audience: researchers in the topic
Series comments: Online streaming via Zoom in exceptional cases, if requested. Please contact the organizers by Monday 11:45 at the latest.
Organizers: David Cohen*, Annika Lang*
(*contact for this listing)
