BEGIN:VCALENDAR
VERSION:2.0
PRODID:researchseminars.org
CALSCALE:GREGORIAN
X-WR-CALNAME:researchseminars.org
BEGIN:VEVENT
SUMMARY:Umberto Picchini (Chalmers University of Technology & University o
 f Gothenburg)
DTSTART:20240515T111500Z
DTEND:20240515T120000Z
DTSTAMP:20260422T155052Z
UID:gbgstats/54
DESCRIPTION:Title: <a href="https://researchseminars.org/talk/gbgstats/54/
 ">Fast\, lightweight and semi-amortised simulation-based inference</a>\nby
  Umberto Picchini (Chalmers University of Technology & University of Gothe
 nburg) as part of the Gothenburg statistics seminar\n\nLecture held in MVL
 14.\n\nAbstract\nBayesian inference for complex models with an intractable
  likelihood can be tackled using algorithms performing many calls to comp
 uter simulators. These approaches are collectively known as "simulation-b
 ased inference" (SBI). Recent SBI methods use neural conditional estimati
 on\, that is\, neural networks are employed to provide approximations to t
 he likelihood function or the posterior distribution of model parameters.
  While neural-based posterior and likelihood estimation methods have prod
 uced exceptionally flexible inference strategies\, these can be computati
 onally intensive to run and have a non-negligible impact on energy expend
 iture and memory requirements. In this work\, rather than using neural ne
 tworks\, we propose more "frugal" strategies that display state-of-the-
 art inference quality\, while being able to run with limited resources\,
  being much faster to train and exhibiting a much smaller computational f
 ootprint. We investigate structured mixtures of probability distributions
  and design a new SBI method named Sequential Mixture Posterior and Likel
 ihood Estimation (SeMPLE). SeMPLE learns closed-form approximations for
  both the posterior $p(θ|y)$ and the likelihood $p(y|θ)$ from the same t
 raining data\, using Gaussian mixture models that can be efficiently lear
 ned.\nWe show favorable results for a variety of stochastic models (inclu
 ding SDEs and Markov jump processes)\, also in the presence of multimodal
  posteriors.\n\nThe talk will be approachable for an uninitiated audience
 \, while the novel results will be of interest to the experienced audienc
 e.\n\nJoint work with Henrik Häggström\, Pedro L. C. Rodrigues\, Geoffro
 y Oudoumanessah and Florence Forbes\, https://arxiv.org/abs/2403.07454\n
LOCATION:https://researchseminars.org/talk/gbgstats/54/
END:VEVENT
END:VCALENDAR
