BEGIN:VCALENDAR
VERSION:2.0
PRODID:researchseminars.org
CALSCALE:GREGORIAN
X-WR-CALNAME:researchseminars.org
BEGIN:VEVENT
SUMMARY:Maarten de Hoop (Rice University)
DTSTART:20200709T160000Z
DTEND:20200709T170000Z
DTSTAMP:20260423T021148Z
UID:Inverse/11
DESCRIPTION:Title: <a href="https://researchseminars.org/talk/Inverse/11/"
 >Globally injective deep neural networks</a>\nby Maarten de Hoop (Rice Uni
 versity) as part of International Zoom Inverse Problems Seminar\, UC Irvin
 e\n\n\nAbstract\nWe present an analysis of injective\, ReLU\, deep neural 
 networks. We establish sharp conditions for injectivity of ReLU layers and
  networks\, both fully connected and convolutional. We show through a laye
 r-wise analysis that an expansivity factor of two is necessary for injecti
 vity\; we also show sufficiency by constructing weight matrices which guar
 antee injectivity. Further\, we show that global injectivity with iid Gaus
 sian matrices\, a commonly used tractable model\, requires considerably la
 rger expansivity. We then derive the inverse Lipschitz constant and study 
 the approximation-theoretic properties of injective neural networks. Using
  arguments from differential topology we prove that\, under mild technical
  conditions\, any Lipschitz map can be approximated by an injective neural
  network. This justifies the use of injective neural networks in problems 
 which a priori do not require injectivity.\n\nJoint work with M. Puthawala
 \, K. Kothari\, M. Lassas and I. Dokmanić.\n
LOCATION:https://researchseminars.org/talk/Inverse/11/
END:VEVENT
END:VCALENDAR
