BEGIN:VCALENDAR
VERSION:2.0
PRODID:researchseminars.org
CALSCALE:GREGORIAN
X-WR-CALNAME:researchseminars.org
BEGIN:VEVENT
SUMMARY:Thiago Serra (Bucknell University)
DTSTART:20210526T171500Z
DTEND:20210526T174500Z
DTSTAMP:20260414T235903Z
UID:MIP2021/16
DESCRIPTION:Title: <a href="https://researchseminars.org/talk/MIP2021/16/"
 >Scaling Up Exact Neural Network Compression by ReLU Stability</a>\nby Thi
 ago Serra (Bucknell University) as part of Mixed Integer Programming Works
 hop 2021\n\n\nAbstract\nWe can compress a neural network while exactly pre
 serving its underlying functionality with respect to a given input domain 
 if some of its neurons are stable. However\, current approaches to determi
 ne the stability of neurons require solving or finding a good approximatio
 n to multiple discrete optimization problems. In this talk\, we present an
  algorithm based on solving a single optimization problem to identify all 
 stable neurons. Our approach is a median of 21 times faster than the st
 ate-of-the-art method\, which allows us to explore exact compression on
  deeper (5 x
  100) and wider (2 x 800) networks within minutes. For classifiers trained
  under an amount of L1 regularization that does not worsen accuracy\, we c
 an remove up to 40% of the connections.\n\nThis talk is based on joint wo
 rk with Abhinav Kumar (Michigan State University) and Srikumar Ramalingam 
 (Google Research).\n
LOCATION:https://researchseminars.org/talk/MIP2021/16/
END:VEVENT
END:VCALENDAR
