Learning low-degree functions in a logarithmic number of random queries.

Paata Ivanisvili (UC Irvine)

12-Nov-2021, 23:00-00:00

Abstract: Perhaps a very basic question one asks in learning theory is the following: we have an unknown function $f$ on the hypercube $\{-1,1\}^n$, and we are allowed to query samples $(X, f(X))$ where $X$ is uniformly distributed on $\{-1,1\}^n$. After receiving these samples $(X_1, f(X_1)), \dots, (X_N, f(X_N))$ we would like to construct a function $h$ which approximates $f$ up to an error $\varepsilon$ (say in $L^2$). Of course $h$ is a random function, as its construction involves the i.i.d. random variables $X_1, \dots, X_N$. Therefore, we want to construct an $h$ which approximates $f$ with probability at least $1-\delta$. So, given parameters $\varepsilon, \delta \in (0,1)$, the goal is to minimize the number of random queries $N$. I will show that around $\log(n)$ random queries are sufficient to learn bounded "low-complexity" functions. Based on joint work with Alexandros Eskenazis.
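For intuition about the query model, here is a minimal Python sketch of the classical low-degree (Fourier) learning algorithm of Linial, Mansour, and Nisan: estimate every Fourier coefficient of degree at most $d$ by an empirical average over uniform random queries, then output the truncated expansion. Note that this baseline uses many more samples than the talk's result, which shows that roughly $\log(n)$ queries suffice for bounded low-degree functions; the function and parameter names below are illustrative, not from the paper.

```python
import itertools
import numpy as np

def chi(S, X):
    # Character chi_S(x) = prod_{i in S} x_i, evaluated on the rows of X.
    return np.prod(X[:, list(S)], axis=1) if S else np.ones(len(X))

def learn_low_degree(f, n, d, N, seed=0):
    # Classical low-degree algorithm: estimate each Fourier coefficient
    # hat{f}(S), |S| <= d, by the empirical average of f(X) chi_S(X)
    # over N uniform queries, and return the truncated expansion h.
    rng = np.random.default_rng(seed)
    X = rng.choice([-1, 1], size=(N, n))   # i.i.d. uniform queries X_1, ..., X_N
    y = np.apply_along_axis(f, 1, X)       # observed labels f(X_i)
    coeffs = {
        S: np.mean(y * chi(S, X))          # hat{f}(S) ~ E[f(X) chi_S(X)]
        for k in range(d + 1)
        for S in itertools.combinations(range(n), k)
    }
    def h(x):
        x = np.atleast_2d(np.asarray(x))
        return sum(c * chi(S, x) for S, c in coeffs.items())
    return h

# Toy usage: recover the degree-2 function f(x) = x_0 x_1 on {-1,1}^5.
f = lambda x: x[0] * x[1]
h = learn_low_degree(f, n=5, d=2, N=2000)
```

The sketch makes the tradeoff concrete: it estimates all $\sum_{k \le d} \binom{n}{k}$ coefficients from the same $N$ samples, and controlling the error of every estimate simultaneously is what drives the sample count of the naive approach well above the $\log(n)$ bound discussed in the talk.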

analysis of PDEs, classical analysis and ODEs, complex variables, differential geometry, dynamical systems, functional analysis, metric geometry

Audience: researchers in the topic


Analysis and Geometry Seminar

Organizers: Polona Durcik*, Mario Stipčić, Mihaela Vajiac
*contact for this listing
