Best Submatrix Selection: Strong Formulations and Approximation Algorithms

Weijun Xie (Virginia Tech)

02-Jun-2020, 18:00-18:30

Abstract: Many interesting machine learning and data analytics problems involve selecting the most informative principal submatrix of a prespecified size from a covariance matrix, such as the maximum entropy sampling problem, experimental design, and sparse PCA. Although directly formulating these problems as mathematical programs is difficult, we exploit the Cholesky factorization of the original covariance matrix and recast the problems as mixed integer programs (MIPs). We also show that (i) these new MIPs usually have tight continuous relaxation bounds, and (ii) by constructing dual solutions, we can prove approximation bounds for the local search algorithm. Our numerical experiments demonstrate that these approximation algorithms can efficiently solve medium-sized and large-scale instances to near-optimality.
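To give a concrete feel for the local search idea mentioned in the abstract, here is a minimal sketch, not the speaker's implementation: a swap-based local search that picks a size-k principal submatrix of a covariance matrix C, approximately maximizing the log-determinant objective used in maximum entropy sampling. The function names, the random initialization, and the first-improvement swap rule are illustrative assumptions.

```python
import numpy as np

def log_det(C, S):
    """Log-determinant of the principal submatrix C[S, S]."""
    sign, val = np.linalg.slogdet(C[np.ix_(S, S)])
    return val if sign > 0 else -np.inf

def local_search_submatrix(C, k, seed=0):
    """Swap-based local search (illustrative sketch): repeatedly swap one
    selected index for an unselected one while log det C[S, S] improves."""
    n = C.shape[0]
    rng = np.random.default_rng(seed)
    S = list(rng.choice(n, size=k, replace=False))  # random starting subset
    best = log_det(C, S)
    improved = True
    while improved:
        improved = False
        for i in list(S):
            for j in range(n):
                if j in S:
                    continue
                T = [j if x == i else x for x in S]  # swap i out, j in
                val = log_det(C, T)
                if val > best + 1e-10:  # accept first improving swap
                    S, best, improved = T, val, True
                    break
            if improved:
                break
    return sorted(S), best

# Usage on a random well-conditioned covariance matrix
A = np.random.default_rng(1).standard_normal((10, 10))
C = A @ A.T + 0.1 * np.eye(10)
S, val = local_search_submatrix(C, k=3)
print(S, val)
```

The talk's contribution is proving approximation bounds for this kind of procedure via dual solutions of the MIP relaxation; the sketch above only shows the primal heuristic itself.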

optimization and control

Audience: researchers in the topic

(paper)


Discrete Optimization Talks

Series comments: DOTs are virtual discrete optimization talks, organized by Aleksandr M. Kazachkov and Elias B. Khalil. To receive updates about upcoming DOTs, please join our mailing list. Topics of interest include theoretical, computational, and applied aspects of integer and combinatorial optimization.

The format is two thirty-minute talks and time for questions. Currently (January-April 2021), the seminars are scheduled on end-of-the-month Fridays at 1:00 p.m. ET. A special feature of DOTs is a social component. After an in-person talk, you might grab a tea or coffee and chat with other attendees. Why not here, too? Join us for some informal discussion after each DOT and throughout the week on our Discord channel.

Organizers: Discrete Optimization Talks*, Aleksandr M. Kazachkov*, Elias B. Khalil
*contact for this listing
