BEGIN:VCALENDAR
VERSION:2.0
PRODID:researchseminars.org
CALSCALE:GREGORIAN
X-WR-CALNAME:researchseminars.org
BEGIN:VEVENT
SUMMARY:René Vidal (Mathematical Institute for Data Science\, Johns Hopki
 ns University)
DTSTART:20201216T180000Z
DTEND:20201216T190000Z
DTSTAMP:20260423T003241Z
UID:MPML/23
DESCRIPTION:Title: From Optimization Algorithms to Dynamical Systems an
 d Back (https://researchseminars.org/talk/MPML/23/)\nby René Vidal (Ma
 thematical Institute for Data Science\, Johns Hopkins University) as p
 art of Mathematics\, Physics and Machine Learning (IST\, Lisbon)\n\n\nAb
 stract\nRecent work has shown that tools from dynamical systems can be u
 sed to analyze accelerated optimization algorithms. For example\, it ha
 s been shown that the continuous limit of Nesterov’s accelerated gradi
 ent (NAG) gives an ODE whose convergence rate matches that of NAG for c
 onvex\, unconstrained\, and smooth problems. Conversely\, it has been s
 hown that NAG can be obtained as the discretization of an ODE. However
 \, since different discretizations lead to different algorithms\, the c
 hoice of discretization becomes important. The first part of this talk w
 ill extend this type of analysis to convex\, constrained\, and non-smo
 oth problems by using Lyapunov stability theory to analyze continuous l
 imits of the Alternating Direction Method of Multipliers (ADMM). The s
 econd part of this talk will show that many existing and new optimizat
 ion algorithms can be obtained by suitably discretizing a dissipative H
 amiltonian. As an example\, we will present a new method called Relati
 vistic Gradient Descent (RGD)\, which empirically outperforms momentum
 \, RMSprop\, Adam\, and AdaGrad on several non-convex problems.\n\nThis i
 s joint work with Guilherme Franca\, Daniel Robinson and Jeremias Sula
 m.\n
LOCATION:https://researchseminars.org/talk/MPML/23/
END:VEVENT
END:VCALENDAR
