BEGIN:VCALENDAR
VERSION:2.0
PRODID:researchseminars.org
CALSCALE:GREGORIAN
X-WR-CALNAME:researchseminars.org
BEGIN:VEVENT
SUMMARY:Christoph Hertrich (LSE)
DTSTART:20230419T150000Z
DTEND:20230419T160000Z
DTSTAMP:20260423T035419Z
UID:CompAlg/12
DESCRIPTION:Title: <a href="https://researchseminars.org/talk/CompAlg/12/"
 >Understanding Neural Network Expressivity via Polyhedral Geometry</a>\nb
 y Christoph Hertrich (LSE) as part of the Machine Learning Seminar\n\n\nA
 bstract\nNeural networks with rectified linear unit (ReLU) activations ar
 e one of the standard models in modern machine learning. Despite their pr
 actical importance\, fundamental theoretical questions concerning ReLU ne
 tworks remain open to this day. For instance\, what is the precise set of
  (piecewise linear) functions exactly representable by ReLU networks of a
  given depth? Even the special case of determining the number of layers n
 eeded to compute a function as simple as $\\max\\{0\, x_1\, x_2\, x_3\, x
 _4\\}$ remains unsolved. In this talk we will explore the relevant backgr
 ound to this question and report on recent progress using tropical and po
 lyhedral geometry as well as a computer-aided approach based on mixed-int
 eger programming. This is based on joint work with Amitabh Basu\, Marco D
 i Summa\, and Martin Skutella (NeurIPS 2021)\, as well as Christian Haase
  and Georg Loho (ICLR 2023).\n
LOCATION:https://researchseminars.org/talk/CompAlg/12/
END:VEVENT
END:VCALENDAR
