Optimization with Momentum: Dynamical, Control-Theoretic, and Symplectic Perspectives

Michael Muehlebach (UC Berkeley)

01-Jul-2020, 16:50-17:15

Abstract: My talk will focus on the analysis of accelerated first-order optimization algorithms. I will show how the continuous dependence of the iterates with respect to their initial condition can be exploited to characterize the convergence rate. The result establishes criteria for accelerated convergence that are easily verifiable and applicable to a large class of first-order optimization algorithms. The analysis is not restricted to the convex setting and unifies discrete-time and continuous-time models. It also rigorously explains why structure-preserving discretization schemes are important for momentum-based algorithms.
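The abstract's point about structure-preserving discretization can be illustrated with a toy example (a sketch of the general idea, not code from the talk): discretizing the undamped momentum ODE x' = v, v' = -grad f(x) for the quadratic f(x) = x²/2 with explicit Euler steadily pumps energy into the system, while the symplectic (semi-implicit) Euler scheme keeps the trajectory bounded.

```python
import math

def grad_f(x):
    return x  # gradient of the toy objective f(x) = 0.5 * x**2

def explicit_euler(x, v, h):
    # Both updates use the old state; energy grows by (1 + h^2) per step.
    return x + h * v, v - h * grad_f(x)

def symplectic_euler(x, v, h):
    # Update v first, then use the new v to update x
    # (a structure-preserving, semi-implicit scheme).
    v_new = v - h * grad_f(x)
    return x + h * v_new, v_new

def run(step, h=0.1, n=1000):
    x, v = 1.0, 0.0
    for _ in range(n):
        x, v = step(x, v, h)
    return math.hypot(x, v)  # distance from the minimizer in phase space

print(run(explicit_euler))    # blows up: roughly (1 + h^2)^(n/2)
print(run(symplectic_euler))  # stays bounded near the initial energy level
```

The same phenomenon carries over to momentum-based optimization algorithms, where the discretization scheme determines whether the discrete iterates inherit the stability of the underlying continuous-time dynamics.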

machine learning, dynamical systems, applied physics

Audience: researchers in the topic


Workshop on Scientific-Driven Deep Learning (SciDL)

Series comments: When: 8:00-14:30 (PST) on Wednesday, July 1, 2020. Where: berkeley.zoom.us/j/95609096856. Details: scidl.netlify.app/

Organizers: N. Benjamin Erichson*, Michael Mahoney, Steven Brunton, Nathan Kutz
*contact for this listing
