BEGIN:VCALENDAR
VERSION:2.0
PRODID:researchseminars.org
CALSCALE:GREGORIAN
X-WR-CALNAME:researchseminars.org
BEGIN:VEVENT
SUMMARY:Nicolas Papadakis (University of Bordeaux)
DTSTART:20210608T101500Z
DTEND:20210608T114500Z
DTSTAMP:20260423T022603Z
UID:MathDeep/8
DESCRIPTION:Title: <a href="https://researchseminars.org/talk/MathDeep/8/"
 >On the learning of Wasserstein generative models</a>\nby Nicolas Papadaki
 s (University of Bordeaux) as part of Mathematics of Deep Learning\n\n\nAb
 stract\nThe problem of WGAN (Wasserstein Generative Adversarial Network) l
 earning is an instance of optimization problems where one wishes to find\,
  among a parametric class of distributions\, the one which is closest to a
  target distribution in terms of an optimal transport (OT) distance. Apply
 ing a gradient-based algorithm to this problem requires expressing the gra
 dient of the OT distance with respect to one of its arguments\, which can b
 e related to the solutions of the dual problem (Kantorovich potentials). T
 he first part of this talk aims at finding conditions that ensure the exis
 tence of such a gradient. After discussing regularity issues that may appe
 ar with discrete target measures\, we will show that regularity problems a
 re avoided when using entropy-regularized OT and/or considering the semi-d
 iscrete formulation of OT. Then\, we will see how these gradients can be e
 xploited in a stable way to address some imaging problems where the targe
 t discrete measure is reasonably large. Using OT distances between multi-
 scale patch distributions\, this makes it possible to estimate a generati
 ve convolutional network that can synthesize an exemplar texture in a fai
 thful and efficient way.\nThis is joint work with Antoine Houdard\, Arthu
 r Leclaire and Julien Rabin.\n
LOCATION:https://researchseminars.org/talk/MathDeep/8/
END:VEVENT
END:VCALENDAR
