BEGIN:VCALENDAR
VERSION:2.0
PRODID:researchseminars.org
CALSCALE:GREGORIAN
X-WR-CALNAME:researchseminars.org
BEGIN:VEVENT
SUMMARY:Xin Tong (National University of Singapore)
DTSTART:20200804T130000Z
DTEND:20200804T140000Z
DTSTAMP:20260423T034447Z
UID:DSCSS/5
DESCRIPTION:Title: <a href="https://researchseminars.org/talk/DSCSS/5/">Ca
 n algorithms collaborate? The replica exchange method</a>\nby Xin Tong (Na
 tional University of Singapore) as part of Data Science and Computational 
 Statistics Seminar\n\n\nAbstract\nGradient descent (GD) is known to conver
 ge quickly for convex objective functions\, but it can be trapped at local
  minima. On the other hand\, Langevin dynamics (LD) can explore the state 
 space and find global minima\, but in order to give accurate estimates\, L
 D needs to run with a small discretization step size and weak stochastic f
 orce\, which in general slow down its convergence. This paper shows that t
 hese two algorithms can "collaborate" through a simple exchange mechani
 sm\, in which they swap their current positions if LD yields a lower objec
 tive function. This idea can be seen as the singular limit of the replica-
 exchange technique from the sampling literature. We show that this new alg
 orithm converges to the global minimum linearly with high probability\, as
 suming the objective function is strongly convex in a neighborhood of the 
 unique global minimum. By replacing gradients with stochastic gradients\, 
 and adding a proper threshold to the exchange mechanism\, our algorithm ca
 n also be used in online settings. We further verify our theoretical resul
 ts through some numerical experiments\, and observe superior performance o
 f the proposed algorithm over running GD or LD alone.\n
LOCATION:https://researchseminars.org/talk/DSCSS/5/
END:VEVENT
END:VCALENDAR
