Accelerated gradient methods on Riemannian manifolds

Suvrit Sra (MIT)

03-Mar-2021, 20:45-21:45

Abstract: This talk lies at the interface of geometry and optimization. I'll talk about geodesically convex optimization problems, a rich class of non-convex problems that nonetheless admit tractable global optimization. I'll provide some background on this class along with some motivating examples. Beyond a general introduction to the topic area, I will dive deeper into a recent discovery of a long-sought result: an accelerated gradient method for Riemannian manifolds. Toward developing this method, we will revisit Nesterov's (Euclidean) estimate sequence technique and present a conceptually simpler alternative. We will then generalize this alternative to the Riemannian setting. Combining it with a new geometric inequality, we then obtain the first (global) accelerated Riemannian gradient method. I'll also comment on some very recent updates on this topic.
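For readers unfamiliar with the Euclidean starting point the talk revisits, the following is a minimal sketch of Nesterov's accelerated gradient method (the constant-momentum variant for L-smooth, mu-strongly-convex objectives). The function names and the quadratic test problem are illustrative only and are not taken from the talk.

    import numpy as np

    def nesterov_agd(grad, x0, L, mu, n_iters=100):
        """Constant-momentum Nesterov acceleration for an L-smooth,
        mu-strongly-convex objective with gradient oracle `grad`.
        (Illustrative sketch; not the speaker's Riemannian algorithm.)"""
        x = y = np.asarray(x0, dtype=float)
        kappa = L / mu                                      # condition number
        beta = (np.sqrt(kappa) - 1) / (np.sqrt(kappa) + 1)  # momentum weight
        for _ in range(n_iters):
            x_next = y - grad(y) / L          # gradient step at the extrapolated point
            y = x_next + beta * (x_next - x)  # momentum extrapolation
            x = x_next
        return x

    # Illustrative use: minimize 0.5*x^T A x - b^T x for a simple quadratic,
    # where L and mu are the largest and smallest eigenvalues of A.
    A = np.diag([1.0, 10.0])
    b = np.ones(2)
    x_opt = nesterov_agd(lambda x: A @ x - b, np.zeros(2), L=10.0, mu=1.0)

In the Riemannian setting the talk addresses, the straight-line updates above are replaced by steps along geodesics (via the exponential map), and controlling the resulting momentum coupling is where the new geometric inequality comes in.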

algebraic geometry, differential geometry, quantum algebra, symplectic geometry

Audience: researchers in the topic


Boston University Geometry/Physics Seminar

Series comments: Please email Yu-Shen Lin (yslin0221@gmail.com) for the password or to be added to the email list.

Organizer: Yu-Shen Lin*
*contact for this listing
