BEGIN:VCALENDAR
VERSION:2.0
PRODID:researchseminars.org
CALSCALE:GREGORIAN
X-WR-CALNAME:researchseminars.org
BEGIN:VEVENT
SUMMARY:Jeffrey Negrea (University of Toronto)
DTSTART:20200714T163000Z
DTEND:20200714T174500Z
DTSTAMP:20260423T020953Z
UID:IASML/13
DESCRIPTION:Title: <a href="https://researchseminars.org/talk/IASML/13/">R
 elaxing the I.I.D. assumption: Adaptive minimax optimal sequential predict
 i
 on with expert advice</a>\nby Jeffrey Negrea (University of Toronto) as pa
 rt of IAS Seminar Series on Theoretical Machine Learning\n\n\nAbstract\nWe
  consider sequential prediction with expert advice when the data are gener
 ated stochastically\, but the distributions generating the data may vary a
 rbitrarily among some constraint set. We quantify relaxations of the class
 ical I.I.D. assumption in terms of possible constraint sets\, with I.I.D. 
 at one extreme\, and an adversarial mechanism at the other. The Hedge algo
 rithm\, long known to be minimax optimal in the adversarial regime\, h
 as recently been shown to also be minimax optimal in the I.I.D. setting. W
 e show that Hedge is sub-optimal between these extremes\, and present a ne
 w algorithm that is adaptively minimax optimal with respect to our relaxat
 ions of the I.I.D. assumption\, without knowledge of which setting prevail
 s.\n
LOCATION:https://researchseminars.org/talk/IASML/13/
END:VEVENT
END:VCALENDAR
