Properly learning decision trees in almost polynomial time
Guy Blanc (Stanford University)
Abstract: We give an $n^{O(\log\log n)}$-time membership query algorithm for properly and agnostically learning decision trees under the uniform distribution over $\{-1,1\}^n$. Even in the realizable setting, the previous fastest runtime was $n^{O(\log n)}$, a consequence of a classic algorithm of Ehrenfeucht and Haussler.
Our algorithm shares similarities with practical heuristics for learning decision trees, which we augment with additional ideas to circumvent known lower bounds against these heuristics. To analyze our algorithm, we prove a new structural result for decision trees that strengthens a theorem of O'Donnell, Saks, Schramm, and Servedio. While the OSSS theorem says that every decision tree has an influential variable, we show how every decision tree can be "pruned" so that every variable in the resulting tree is influential.
Joint work with Jane Lange, Mingda Qiao, and Li-Yang Tan. To appear in FOCS 2021.
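The abstract's structural result concerns variable influence in decision trees under the uniform distribution. As a toy illustration only (not the authors' pruning procedure, whose guarantees are quantitative), the following sketch computes influence by brute force over $\{-1,1\}^n$ and prunes internal nodes whose query variable has zero influence on the subtree's function; tree representation and helper names are hypothetical.

```python
# Hedged sketch, not the paper's algorithm: brute-force variable influence
# and zero-influence pruning for small decision trees over {-1,1}^n.
from itertools import product

# A tree is either a leaf value (+1 or -1) or a tuple (i, left, right):
# query x[i]; take `left` if x[i] == -1, `right` if x[i] == +1.
# We assume no variable repeats on a root-to-leaf path.

def evaluate(tree, x):
    while isinstance(tree, tuple):
        i, left, right = tree
        tree = left if x[i] == -1 else right
    return tree

def influence(tree, i, n):
    # Inf_i(f) = Pr_x[f(x) != f(x^(i))], x uniform over {-1,1}^n,
    # where x^(i) is x with coordinate i flipped.
    flips = 0
    for x in product([-1, 1], repeat=n):
        y = list(x)
        y[i] = -y[i]
        if evaluate(tree, x) != evaluate(tree, tuple(y)):
            flips += 1
    return flips / 2 ** n

def prune(tree, n):
    # If the root variable of a subtree has zero influence on that
    # subtree's function, the function ignores it entirely, so the
    # node can be replaced by either child.
    if not isinstance(tree, tuple):
        return tree
    i, left, right = tree
    left, right = prune(left, n), prune(right, n)
    node = (i, left, right)
    if influence(node, i, n) == 0:
        return left
    return node

# Example: the query on x[1] is redundant (both of its children are +1),
# so pruning collapses it; the query on x[0] is influential and survives.
redundant = (0, -1, (1, 1, 1))
pruned = prune(redundant, n=2)  # -> (0, -1, 1)
```

After pruning, every remaining internal node's variable has nonzero influence on its subtree's function, which is the qualitative shape of the strengthened OSSS-style statement described above.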
Topics: computational complexity, computational geometry, cryptography and security, discrete mathematics, data structures and algorithms, game theory, machine learning, quantum computing and information, combinatorics, information theory, optimization and control, probability
Audience: researchers in the topic
(paper)
Series comments: Theoretical Computer Science
People can register for a talk via our webpage sites.google.com/site/plustcs/livetalk , or subscribe to our calendar and mailing list at sites.google.com/site/plustcs/rss-feeds
Organizers: Clément Canonne (contact for this listing), Anindya De, Sumegha Garg, Gautam Kamath, Ilya Razenshteyn, Oded Regev, Tselil Schramm, Thomas Vidick, Erik Waingarten
