Scaling Up Exact Neural Network Compression by ReLU Stability

Thiago Serra (Bucknell University)

26-May-2021, 17:15-17:45

Abstract: We can compress a neural network while exactly preserving its underlying functionality with respect to a given input domain if some of its neurons are stable. However, current approaches to determining the stability of neurons require solving, or finding a good approximation to, multiple discrete optimization problems. In this talk, we present an algorithm based on solving a single optimization problem to identify all stable neurons. Our approach is 21 times faster on median than the state-of-the-art method, which allows us to explore exact compression on deeper (5 × 100) and wider (2 × 800) networks within minutes. For classifiers trained under an amount of L1 regularization that does not worsen accuracy, we can remove up to 40% of the connections.
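For context (not part of the abstract): a ReLU neuron is stable over the input domain if its pre-activation never changes sign there; stably inactive neurons can be removed and stably active ones folded into the next layer without changing the network's outputs. The Python sketch below flags candidate stable neurons with simple interval bound propagation. It is only an illustrative, weaker stand-in for the optimization-based stability tests discussed in the talk, and the function name and toy weights are assumptions made for this example.

    import numpy as np

    def interval_preactivation_bounds(weights, biases, x_lo, x_hi):
        """Propagate elementwise input bounds [x_lo, x_hi] through a ReLU
        network and return, per layer, lower/upper bounds on each neuron's
        pre-activation. A neuron whose lower bound is >= 0 is stably active;
        one whose upper bound is <= 0 is stably inactive."""
        lo, hi = np.asarray(x_lo, float), np.asarray(x_hi, float)
        bounds = []
        for W, b in zip(weights, biases):
            W_pos, W_neg = np.maximum(W, 0.0), np.minimum(W, 0.0)
            z_lo = W_pos @ lo + W_neg @ hi + b   # smallest possible pre-activation
            z_hi = W_pos @ hi + W_neg @ lo + b   # largest possible pre-activation
            bounds.append((z_lo, z_hi))
            lo, hi = np.maximum(z_lo, 0.0), np.maximum(z_hi, 0.0)  # apply ReLU
        return bounds

    # Toy usage: inputs in [0, 1]^2, one hidden layer of 3 ReLUs.
    W1 = np.array([[ 1.0,  1.0],
                   [-1.0, -1.0],
                   [ 1.0, -1.0]])
    b1 = np.array([0.5, -2.5, 0.0])
    (z_lo, z_hi), = interval_preactivation_bounds([W1], [b1], [0, 0], [1, 1])
    stably_active   = z_lo >= 0   # neuron 0 always fires: its ReLU acts as the identity
    stably_inactive = z_hi <= 0   # neuron 1 never fires: it can be removed exactly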

This talk is based on joint work with Abhinav Kumar (Michigan State University) and Srikumar Ramalingam (Google Research).

game theory, machine learning, mathematical software, computer science theory, combinatorics, optimization and control

Audience: researchers in the topic


Mixed Integer Programming Workshop 2021

Series comments: The 18th Mixed Integer Programming Workshop will be held online on May 24-27, 2021.

It will feature 21 distinguished invited speakers covering most aspects of Mathematical Optimization, an interactive, gamified MIP student poster session with 50 posters, and a casual business meeting.

Registration is free of charge. Register here: fico.zoom.us/webinar/register/2416186463858/WN_DVLhGOToQkKyvKYPiA4cQw

Find the website of MIP2021 at sites.google.com/view/mipworkshop2021/.

Organizers: Yuan Zhou*, Carla Michini, Robert Hildebrand, Yuri Faenza, Timo Berthold
*contact for this listing
