BEGIN:VCALENDAR
VERSION:2.0
PRODID:researchseminars.org
CALSCALE:GREGORIAN
X-WR-CALNAME:researchseminars.org
BEGIN:VEVENT
SUMMARY:Paata Ivanisvili (UC Irvine)
DTSTART:20211112T230000Z
DTEND:20211113T000000Z
DTSTAMP:20260423T004700Z
UID:ags/9
DESCRIPTION:Title: Learning low degree functions in logarithmic number
  of random queries (https://researchseminars.org/talk/ags/9/)\nby Paata
  Ivanisvili (UC Irvine) as part of Analysis and Geometry Seminar\n\n
 Abstract\nPerhaps a very basic question one asks in learning theory is
  the following: we have an unknown function $f$ on the hypercube
  $\\{-1\,1\\}^n$\, and we are allowed to query samples $(X\, f(X))$\,
  where $X$ is uniformly distributed on $\\{-1\,1\\}^n$. After getting
  the samples $(X_1\, f(X_1))\, ...\, (X_N\, f(X_N))$\, we would like to
  construct a function $h$ that approximates $f$ up to an error
  $\\epsilon$ (say in $L^2$). Of course\, $h$ is a random function\,
  since its construction involves the i.i.d. random variables $X_1\,
  ...\, X_N$. Therefore\, we want an $h$ that approximates $f$ with
  probability at least $1-\\delta$. Given parameters $\\epsilon\,
  \\delta \\in (0\,1)$\, the goal is to minimize the number of random
  queries $N$. I will show that around $\\log(n)$ random queries suffice
  to learn bounded "low-complexity" functions. Based on joint work with
  Alexandros Eskenazis.\n
LOCATION:https://researchseminars.org/talk/ags/9/
END:VEVENT
END:VCALENDAR
