How to Trap a Gradient Flow

Sébastien Bubeck (Microsoft Research)

24-Apr-2020, 15:00-16:00

Abstract: In 1993, Stephen A. Vavasis proved that in any finite dimension, there exists a method faster than gradient descent for finding stationary points of smooth non-convex functions. In dimension 2 he proved that 1/eps gradient queries are enough, and that 1/sqrt(eps) queries are necessary. We close this gap by providing an algorithm based on a new local-to-global phenomenon for smooth non-convex functions. Some higher-dimensional results will also be discussed. I will also present an extension of the 1/sqrt(eps) lower bound to randomized algorithms, mainly as an excuse to discuss some beautiful topics such as Aldous’ 1983 paper on local minimization on the cube, and Benjamini-Pemantle-Peres’ 1998 construction of unpredictable walks.

Joint work with Dan Mikulincer
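To make the abstract's setting concrete: the complexity measure is the number of gradient queries needed to reach a point where the gradient norm is at most eps. Below is a minimal sketch of the baseline the talk compares against, plain gradient descent with the standard 1/L step size, on a toy smooth non-convex function in dimension 2. The function, the function name, and the query budget are illustrative assumptions; this is not the algorithm from the talk.

```python
import numpy as np

def gradient_descent_stationary(grad, x0, L, eps, max_queries=10000):
    """Plain gradient descent with step size 1/L, stopping once
    ||grad(x)|| <= eps. Returns the point found and the number of
    gradient queries spent -- the complexity measure in the abstract."""
    x = np.asarray(x0, dtype=float)
    for queries in range(1, max_queries + 1):
        g = grad(x)                 # one gradient query
        if np.linalg.norm(g) <= eps:
            return x, queries
        x = x - g / L               # standard 1/L step for an L-smooth f
    return x, max_queries

# Toy smooth non-convex function in dimension 2: f(x) = cos(x1) + cos(x2),
# whose gradient is -sin applied coordinate-wise (smoothness constant L = 1).
grad = lambda x: -np.sin(x)
x_star, n = gradient_descent_stationary(grad, x0=[1.0, 2.0], L=1.0, eps=1e-3)
print(n, np.linalg.norm(grad(x_star)))
```

On well-behaved instances this terminates quickly, but in the worst case gradient descent needs on the order of 1/eps^2 queries; the talk concerns how much better one can do in fixed dimension.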

statistics theory

Audience: researchers in the topic


Stochastics and Statistics Seminar Series

Series comments: MIT seminar on statistics, data science and related topics

Organizers: Philippe Rigollet*, Sasha Rakhlin
*contact for this listing
