BEGIN:VCALENDAR
VERSION:2.0
PRODID:researchseminars.org
CALSCALE:GREGORIAN
X-WR-CALNAME:researchseminars.org
BEGIN:VEVENT
SUMMARY:Adrien Corenflos (University of Warwick)
DTSTART:20240529T111500Z
DTEND:20240529T120000Z
DTSTAMP:20260422T155153Z
UID:gbgstats/57
DESCRIPTION:Title: <a href="https://researchseminars.org/talk/gbgstats/57/
 ">Particle-MALA and Particle-mGrad: Gradient-based MCMC methods for high-d
 imensional state-space models</a>\nby Adrien Corenflos (University of Warw
 ick) as part of Gothenburg statistics seminar\n\nLecture held in MVL14.\n\
 nAbstract\nState-of-the-art methods for Bayesian inference in state-space 
 models are (a) conditional sequential Monte Carlo (CSMC) algorithms\; (b) 
 sophisticated 'classical' MCMC algorithms like MALA\, or mGRAD from Titsia
 s and Papaspiliopoulos (2018). The former propose N particles at each time
  step to exploit the model's 'decorrelation-over-time' property and thus s
 cale favourably with the time horizon\, T\, but break down if the dimensio
 n of the latent states\, D\, is large. The latter leverage gradient/prior-
 informed local proposals to scale favourably with D but exhibit sub-optima
 l scalability with T due to a lack of model-structure exploitation. We int
 roduce methods which combine the strengths of both approaches. The first\,
  Particle-MALA\, spreads N particles locally around the current state usin
 g gradient information\, thus extending MALA to T>1 time steps and N>1 pro
 posals. The second\, Particle-mGRAD\, additionally incorporates (condition
 ally) Gaussian prior dynamics into the proposal\, thus extending the mGRAD
  algorithm. We prove that Particle-mGRAD interpolates between CSMC and Par
 ticle-MALA\, resolving the 'tuning problem' of choosing between CSMC (supe
 rior for highly informative prior dynamics) and Particle-MALA (superior fo
 r weakly informative prior dynamics). We similarly extend other 'classical
 ' MCMC approaches like auxiliary MALA\, aGRAD\, and preconditioned Crank-N
 icolson-Langevin (PCNL). In experiments\, our methods substantially improv
 e upon both CSMC and sophisticated 'classical' MCMC approaches for both hi
 ghly and weakly informative prior dynamics.\n\nTL\;DR: We aim to solve the
 curse of dimensionality in state-space model inference by combining the 
 nice property (in time) of conditional particle filtering methods with the
 nice property (in space) of MALA and other gradient-based algorithms.\n
LOCATION:https://researchseminars.org/talk/gbgstats/57/
END:VEVENT
END:VCALENDAR
