On the learning of Wasserstein generative models

Nicolas Papadakis (University of Bordeaux)

08-Jun-2021, 10:15-11:45

Abstract: The problem of WGAN (Wasserstein Generative Adversarial Network) learning is an instance of optimization problems where one wishes to find, among a parametric class of distributions, the one which is closest to a target distribution in terms of an optimal transport (OT) distance. Applying a gradient-based algorithm to this problem requires expressing the gradient of the OT distance with respect to one of its arguments, which can be related to the solutions of the dual problem (Kantorovich potentials). The first part of this talk aims at finding conditions that ensure the existence of such a gradient. After discussing regularity issues that may appear with discrete target measures, we will show that these regularity problems are avoided when using entropy-regularized OT and/or considering the semi-discrete formulation of OT. Then, we will see how these gradients can be exploited in a stable way to address imaging problems where the discrete target measure is reasonably large. Using OT distances between multi-scale patch distributions, this allows one to estimate a generative convolutional network that can synthesize an exemplar texture in a faithful and efficient way. This is joint work with Antoine Houdard, Arthur Leclaire and Julien Rabin.
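As a rough illustration of the connection between entropy-regularized OT and the Kantorovich potentials mentioned in the abstract, the sketch below runs plain Sinkhorn iterations between two discrete measures and recovers the dual potentials; for entropic OT, the gradient of the regularized cost with respect to the weights of the first measure is given by the first potential (up to an additive constant). This is a minimal, generic sketch in NumPy, not code from the talk; all function and variable names are illustrative.

```python
import numpy as np

def sinkhorn_potentials(a, b, C, eps, n_iters=500):
    """Sinkhorn iterations for entropy-regularized OT between
    discrete measures with weights a, b and cost matrix C.

    Returns dual (Kantorovich) potentials f, g; the gradient of the
    entropic OT cost with respect to a is f, up to a constant.
    """
    K = np.exp(-C / eps)              # Gibbs kernel
    u = np.ones_like(a)
    v = np.ones_like(b)
    for _ in range(n_iters):
        u = a / (K @ v)               # match first marginal
        v = b / (K.T @ u)             # match second marginal
    f = eps * np.log(u)               # dual potential for a
    g = eps * np.log(v)               # dual potential for b
    return f, g

# Toy example: two uniform discrete measures on the line
x = np.linspace(0.0, 1.0, 5)
y = np.linspace(0.0, 1.0, 7)
a = np.full(5, 1 / 5)
b = np.full(7, 1 / 7)
C = (x[:, None] - y[None, :]) ** 2    # squared-distance cost
f, g = sinkhorn_potentials(a, b, C, eps=0.1)
```

In a WGAN-style setting, one would backpropagate `f` through the parametrization of the first measure; the regularization `eps` is what guarantees the smoothness discussed in the talk.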

Topics: machine learning, numerical analysis, optimization and control

Audience: researchers in the topic


Mathematics of Deep Learning

Series comments: Please fill out the following form for registering for our email list, where talk announcements and zoom details are distributed: docs.google.com/forms/d/e/1FAIpQLSeWAzBXsXRqpJhHDKODywySl_BWZN-Cbrik_4bEun2fGwhOKg/viewform?usp=sf_link

Slides: drive.google.com/drive/folders/1w9lNCGWZyzGFxxuVvhJOcjlc92X2toJg?usp=sharing

Videos: www.fau.tv/course/id/878

Organizers: Leon Bungert*, Daniel Tenbrinck
Curator: Martin Burger
*contact for this listing
