Globally injective deep neural networks

Maarten de Hoop (Rice University)

09-Jul-2020, 16:00-17:00

Abstract: We present an analysis of injective deep ReLU neural networks. We establish sharp conditions for the injectivity of ReLU layers and networks, both fully connected and convolutional. Through a layer-wise analysis we show that an expansivity factor of two is necessary for injectivity; we also show sufficiency by constructing weight matrices that guarantee injectivity. Further, we show that global injectivity with i.i.d. Gaussian matrices, a commonly used tractable model, requires considerably larger expansivity. We then derive the inverse Lipschitz constant and study the approximation-theoretic properties of injective neural networks. Using arguments from differential topology, we prove that, under mild technical conditions, any Lipschitz map can be approximated by an injective neural network. This justifies the use of injective neural networks in problems that a priori do not require injectivity.
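To illustrate the sufficiency direction, here is a minimal NumPy sketch of one standard construction achieving expansivity two: stacking an injective matrix B with its negation, W = [B; -B], so that ReLU(Wx) retains both the positive and negative parts of Bx and the layer is explicitly invertible. The construction and the function names below are illustrative and not taken from the talk itself.

    import numpy as np

    rng = np.random.default_rng(0)

    n = 5                               # input dimension
    B = rng.standard_normal((n, n))     # generic full-rank (hence injective) matrix
    W = np.vstack([B, -B])              # expansivity factor 2: maps R^n -> R^{2n}

    def relu(z):
        return np.maximum(z, 0.0)

    def layer(x):
        # One ReLU layer with the stacked weight matrix W = [B; -B].
        return relu(W @ x)

    def invert(y):
        # ReLU(Bx) - ReLU(-Bx) = Bx coordinate-wise, so x is recovered by solving Bx = top - bottom.
        top, bottom = y[:n], y[n:]
        return np.linalg.solve(B, top - bottom)

    x = rng.standard_normal(n)
    x_rec = invert(layer(x))
    print(np.allclose(x, x_rec))        # True: the layer is injective with an explicit inverse

The identity max(t, 0) - max(-t, 0) = t is what makes the inversion exact; any injective B works, so expansivity two suffices for a single layer.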

Joint work with M. Puthawala, K. Kothari, M. Lassas and I. Dokmanić.

Mathematics

Audience: researchers in the topic


International Zoom Inverse Problems Seminar, UC Irvine

Organizers: Katya Krupchyk*, Knut Solna
*contact for this listing
