Learning to Scale

Timo Berthold (FICO)

30-Jun-2020, 18:00-18:30

Abstract: Scaling is a widely used preconditioning technique that reduces error propagation and thereby improves the numerical behavior of an algorithm. For numerically challenging mixed-integer programs (MIPs), as they arise in many practical applications, having an efficient scaling method in place often determines whether the MIP's LP relaxations can be solved gracefully or not. Two scaling methods are commonly used: standard scaling and Curtis-Reid scaling. The latter often, but not always, leads to a more robust solution process, at the cost of longer solution times. We introduce a method that automatically chooses between the two scaling variants by predicting which one will lead to fewer numerical issues. It turns out that this not only reduces various types of numerical errors, but is also performance-neutral for MIPs and improves performance on LPs.
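To make the idea of scaling concrete, here is a minimal illustrative sketch of iterative geometric-mean equilibration, a common form of "standard" scaling: rows and columns are alternately divided by the geometric mean of their nonzero magnitudes, which shrinks the spread of entry magnitudes in the constraint matrix. This is not the talk's method or any solver's actual implementation; the function name and parameters are my own.

```python
import numpy as np

def equilibrate(A, iterations=3):
    """Illustrative geometric-mean equilibration of a dense matrix.

    Returns the scaled matrix S and diagonal factors r, c such that
    the original matrix satisfies A = diag(r) @ S @ diag(c).
    """
    A = A.astype(float).copy()
    m, n = A.shape
    r = np.ones(m)  # accumulated row scaling factors
    c = np.ones(n)  # accumulated column scaling factors
    for _ in range(iterations):
        # Divide each row by the geometric mean of its nonzero magnitudes.
        for i in range(m):
            nz = np.abs(A[i, A[i] != 0])
            if nz.size:
                s = np.exp(np.mean(np.log(nz)))
                A[i] /= s
                r[i] *= s
        # Then do the same for each column.
        for j in range(n):
            nz = np.abs(A[A[:, j] != 0, j])
            if nz.size:
                s = np.exp(np.mean(np.log(nz)))
                A[:, j] /= s
                c[j] *= s
    return A, r, c
```

On a matrix whose nonzero entries span twelve orders of magnitude (e.g. entries 1e6, 1, and 1e-6), a few sweeps reduce the spread to within a factor of ten, which is precisely why scaled LP relaxations tend to propagate less floating-point error.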

optimization and control

Audience: researchers in the topic


Discrete Optimization Talks

Series comments: DOTs are virtual discrete optimization talks, organized by Aleksandr M. Kazachkov and Elias B. Khalil. To receive updates about upcoming DOTs, please join our mailing list. Topics of interest include theoretical, computational, and applied aspects of integer and combinatorial optimization.

The format is two thirty-minute talks and time for questions. Currently (January-April 2021), the seminars are scheduled on end-of-the-month Fridays at 1:00 p.m. ET. A special feature of DOTs is a social component. After an in-person talk, you might grab a tea or coffee and chat with other attendees. Why not here too? Join us for some informal discussion after each DOT and throughout the week on our Discord channel.

Organizers: Discrete Optimization Talks*, Aleksandr M. Kazachkov*, Elias B. Khalil
*contact for this listing
