Distributed Statistical Inference under Local Information Constraints
Jayadev Acharya (Cornell)
Abstract: We consider statistical inference tasks in a distributed setting where access to data samples is subject to strict "local constraints," through a unified framework that captures communication limitations and (local) privacy constraints as special cases. We study estimation (learning) and goodness-of-fit (testing) for both discrete and high-dimensional distributions. Our goal is to understand how the sample complexity increases under these information constraints.
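To make the notion of a "local constraint" concrete, below is a minimal illustrative sketch (not taken from the talk) of one canonical example: each user passes their single sample through an ε-locally-differentially-private randomized-response channel before sending it, and the server de-biases the aggregate. The function names and parameters here are hypothetical, chosen only for illustration.

```python
# Illustrative sketch: epsilon-LDP randomized response as a "local constraint".
# Each user perturbs their own bit before reporting it; the server then
# de-biases the empirical mean of the noisy reports.
import math
import random

def randomize(bit: int, eps: float) -> int:
    """Keep the bit w.p. e^eps/(1+e^eps), flip it otherwise: an eps-LDP channel."""
    keep_prob = math.exp(eps) / (1 + math.exp(eps))
    return bit if random.random() < keep_prob else 1 - bit

def estimate_mean(reports: list[int], eps: float) -> float:
    """Unbiased estimate of the true mean from the noisy reports."""
    p = math.exp(eps) / (1 + math.exp(eps))
    observed = sum(reports) / len(reports)
    return (observed - (1 - p)) / (2 * p - 1)

# Toy usage: true mean 0.3, n users, privacy parameter eps = 1.
random.seed(0)
n, eps, true_mean = 100_000, 1.0, 0.3
data = [1 if random.random() < true_mean else 0 for _ in range(n)]
reports = [randomize(x, eps) for x in data]
print(estimate_mean(reports, eps))  # close to 0.3, but with inflated variance
```

The inflated variance of this de-biased estimator is one simple instance of the phenomenon the talk studies: under local constraints, more samples are needed to reach the same accuracy.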
In this talk we will provide an overview of this field and a sample of our results. We will discuss the role of (public) randomness and interactivity in information-constrained inference, and make a case for thinking about randomness and interactivity as resources.
The work is part of a long-term ongoing collaboration with Clément Canonne (IBM Research) and Himanshu Tyagi (IISc), and includes work done with Cody Freitag (Cornell), Yanjun Han (Stanford), Yuhan Liu (Cornell), and Ziteng Sun (Cornell).
Topics: computational complexity, computational geometry, cryptography and security, discrete mathematics, data structures and algorithms, game theory, machine learning, quantum computing and information, combinatorics, information theory, optimization and control, probability
Audience: researchers in the topic
Series comments: Theoretical Computer Science
People can register for a talk via our webpage, sites.google.com/site/plustcs/livetalk, or subscribe to our calendar and mailing list at sites.google.com/site/plustcs/rss-feeds.
Organizers: Clément Canonne*, Anindya De, Sumegha Garg, Gautam Kamath, Ilya Razenshteyn, Oded Regev, Tselil Schramm, Thomas Vidick, Erik Waingarten
*contact for this listing
