BEGIN:VCALENDAR
VERSION:2.0
PRODID:researchseminars.org
CALSCALE:GREGORIAN
X-WR-CALNAME:researchseminars.org
BEGIN:VEVENT
SUMMARY:Ruth Misener (Imperial College London)
DTSTART:20210618T130000Z
DTEND:20210618T140000Z
DTSTAMP:20260423T003244Z
UID:MPML/47
DESCRIPTION:Title: <a href="https://researchseminars.org/talk/MPML/47/">Pa
 rtition-based formulations for mixed-integer optimization of trained ReLU 
 neural networks</a>\nby Ruth Misener (Imperial College London) as part of 
 Mathematics\, Physics and Machine Learning (IST\, Lisbon)\n\n\nAbstract\nT
 his work develops a class of relaxations in between the big-M and convex h
 ull formulations of disjunctions\, drawing advantages from both. We show t
 hat this class leads to mixed-integer formulations for trained ReLU neural
  networks. The approach balances model size and tightness by partitioning 
 node inputs into a number of groups and forming the convex hull over the p
 artitions via disjunctive programming. At one extreme\, one partition per 
 input recovers the convex hull of a node\, i.e.\, the tightest possible fo
 rmulation for each node. For fewer partitions\, we develop smaller relaxat
 ions that approximate the convex hull\, and show that they outperform exis
 ting formulations. Specifically\, we propose strategies for partitioning v
 ariables based on theoretical motivations and validate these strategies us
 ing extensive computational experiments. Furthermore\, the proposed scheme
  complements known algorithmic approaches\, e.g.\, optimization-based boun
 d tightening captures dependencies within a partition.\n\nThis joint work 
 with Calvin Tsay\, Jan Kronqvist\, and Alexander Thebelt is based on two 
 papers: https://arxiv.org/abs/2102.04373 & https://arxiv.org/abs/2101.1
 2708\n
LOCATION:https://researchseminars.org/talk/MPML/47/
END:VEVENT
END:VCALENDAR
