Robustly Learning Mixtures of Gaussians
Pravesh Kothari and Ankur Moitra (CMU and MIT)
Abstract: For a while now, the problem of robustly learning a high-dimensional mixture of Gaussians has had a target on its back. The first works in algorithmic robust statistics gave provably robust algorithms for learning a single Gaussian. Since then, there has been steady progress, including algorithms for robustly learning mixtures of spherical Gaussians, mixtures of Gaussians under separation conditions, and arbitrary mixtures of two Gaussians. In this talk we will discuss two recent works that essentially resolve the general problem. There are important differences in their techniques, setup, and overall quantitative guarantees, which we will discuss.
The talk will cover the following independent works:
- Liu, Moitra, "Settling the Robust Learnability of Mixtures of Gaussians"
- Bakshi, Diakonikolas, Jia, Kane, Kothari, Vempala, "Robustly Learning Mixtures of $k$ Arbitrary Gaussians"
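To make the problem setting concrete, here is a minimal sketch (not the algorithms from either paper) of the standard ε-corruption model the abstract alludes to: samples are drawn from a Gaussian mixture, an adversary replaces an ε-fraction of them arbitrarily, and the goal is to recover the mixture parameters anyway. All numerical values below are illustrative choices, not parameters from the talk.

```python
import numpy as np

rng = np.random.default_rng(0)
n, d, eps = 10_000, 5, 0.05  # sample size, dimension, corruption fraction (illustrative)

# Draw from a 2-component mixture: equal weights, means -mu and +mu, identity covariances.
mu = np.zeros(d)
mu[0] = 4.0
comps = rng.integers(0, 2, size=n)
clean = rng.standard_normal((n, d)) + np.where(comps[:, None] == 0, -mu, mu)

# Adversary: replace an eps-fraction of the samples with a far-away point cloud.
m = int(eps * n)
corrupted = clean.copy()
corrupted[:m] = 100.0 + rng.standard_normal((m, d))

# The naive empirical mean is dragged roughly eps * 100 per coordinate by the outliers,
# while the clean empirical mean stays near the true mean (the origin).
clean_mean = corrupted[m:].mean(axis=0)
naive_mean = corrupted.mean(axis=0)

# In one dimension the median resists such outliers, but coordinate-wise tricks do not
# give dimension-independent error in high dimensions -- hence the algorithms in these works.
robust_coord = np.median(corrupted[:, 1])
```

Running this shows `naive_mean` displaced far from the origin while `clean_mean` and the one-dimensional median stay close, which is exactly the gap that robust estimation algorithms must close.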
Topics: computational complexity, computational geometry, cryptography and security, discrete mathematics, data structures and algorithms, game theory, machine learning, quantum computing and information, combinatorics, information theory, optimization and control, probability
Audience: researchers in the topic
Series comments: Theoretical Computer Science
People can register for a talk via our webpage sites.google.com/site/plustcs/livetalk , or subscribe to our calendar and mailing list at sites.google.com/site/plustcs/rss-feeds
Organizers: Clément Canonne*, Anindya De, Sumegha Garg, Gautam Kamath, Ilya Razenshteyn, Oded Regev, Tselil Schramm, Thomas Vidick, Erik Waingarten
*contact for this listing
