BEGIN:VCALENDAR
VERSION:2.0
PRODID:researchseminars.org
CALSCALE:GREGORIAN
X-WR-CALNAME:researchseminars.org
BEGIN:VEVENT
SUMMARY:Shalev Ben-David (U Waterloo)
DTSTART:20201104T180000Z
DTEND:20201104T190000Z
DTSTAMP:20260423T021011Z
UID:TCSPlus/16
DESCRIPTION:Title: Forecasting Algorithms\, Minimax Theorems\, and Random
 ized Lower Bounds (https://researchseminars.org/talk/TCSPlus/16/)\nby Sh
 alev Ben-David (U Waterloo) as part of TCS+\n\nAbstract\nI will
  present a new approach to randomized lower bounds\, particularly in the s
 etting where we wish to give a fine-grained analysis of randomized algorit
 hms that achieve small bias. The approach is as follows: instead of consid
 ering ordinary randomized algorithms which give an output in {0\,1} and ma
 y err\, we switch models to look at "forecasting" randomized algorithms wh
 ich output a confidence in [0\,1] for whether they think the answer is 1. 
 When scored by a proper scoring rule\, the performance of the best forecas
 ting algorithm is closely related to the bias of the best (ordinary) rando
 mized algorithm\, but is more amenable to analysis.\n\nAs an application\,
  I'll present a new minimax theorem for randomized algorithms\, which can 
 be viewed as a strengthening of Yao's minimax theorem. Yao's minimax theor
 em guarantees the existence of a hard distribution for a function f such t
 hat solving f against this distribution (to a desired error level) is as h
 ard as solving f in the worst case (to that same error level). However\, t
 he hard distribution provided by Yao's theorem depends on the chosen error
  level. Our minimax theorem removes this dependence\, giving a distributio
 n which certifies the hardness of f against all bias levels at once. In re
 cent work\, we used this minimax theorem to give a tight composition theor
 em for randomized query complexity.\n\nBased on joint work with Eric Blais
 .\n
LOCATION:https://researchseminars.org/talk/TCSPlus/16/
END:VEVENT
END:VCALENDAR
