BEGIN:VCALENDAR
VERSION:2.0
PRODID:researchseminars.org
CALSCALE:GREGORIAN
X-WR-CALNAME:researchseminars.org
BEGIN:VEVENT
SUMMARY:Ismael Lemhadri (Stanford University)
DTSTART:20200709T150000Z
DTEND:20200709T160000Z
DTSTAMP:20260423T003255Z
UID:ProbStat/7
DESCRIPTION:Title: <a href="https://researchseminars.org/talk/ProbStat/7/"
 >LassoNet: A Neural Network with Feature Sparsity</a>\nby Ismael Lemhadr
 i (Stanford University) as part of Probability & Statistics (IST-CEMAT\,
  FC-CEAUL\, ULisbon)\n\n\nAbstract\nMuch work has been done recently to
  make neural networks more interpretable\, and one obvious approach is t
 o arrange for the network to use only a subset of the available features
 . In linear models\, Lasso (or $\\ell_1$-regularized) regression assigns
  zero weights to the most irrelevant or redundant features\, and is wide
 ly used in data science. However\, the Lasso applies only to linear mode
 ls. Here we introduce LassoNet\, a neural network framework with global
  feature selection. Our approach enforces a hierarchy: specifically\, a
  feature can participate in a hidden unit only if its linear representat
 ive is active. Unlike other approaches to feature selection for neural n
 ets\, our method uses a modified objective function with constraints\, a
 nd so directly integrates feature selection with parameter learning. As
  a result\, it delivers an entire regularization path of solutions with
  a range of feature sparsity. In systematic experiments\, LassoNet signi
 ficantly outperforms state-of-the-art methods for feature selection and
  regression. The LassoNet method uses projected proximal gradient descen
 t\, and generalizes directly to deep networks. It can be implemented by
  adding just a few lines of code to a standard neural network.\n
LOCATION:https://researchseminars.org/talk/ProbStat/7/
END:VEVENT
END:VCALENDAR
