BEGIN:VCALENDAR
VERSION:2.0
PRODID:researchseminars.org
CALSCALE:GREGORIAN
X-WR-CALNAME:researchseminars.org
BEGIN:VEVENT
SUMMARY:Oskar Allerbo (KTH Royal Institute of Technology)
DTSTART:20251008T111500Z
DTEND:20251008T120000Z
DTSTAMP:20260422T155025Z
UID:gbgstats/93
DESCRIPTION:Title: <a href="https://researchseminars.org/talk/gbgstats/93/
 ">Is supervised learning really that different from unsupervised?</a>\nby 
 Oskar Allerbo (KTH Royal Institute of Technology) as part of Gothenburg st
 atistics seminar\n\nLecture held in MVL14.\n\nAbstract\nWe demonstrate how
  supervised learning can be decomposed into a two-stage procedure\, where 
 (1) all model parameters are selected in an unsupervised manner\, and (2) 
 the outputs y are added to the model\, without changing the parameter valu
 es. This is achieved by a new model selection criterion that - in contrast
 to cross-validation - can also be used without access to y. For linear ri
 dge regression\, we bound the asymptotic out-of-sample risk of our method 
 in terms of the optimal asymptotic risk. We also demonstrate on real and s
 ynthetic data that versions of linear and kernel ridge regression\, smooth
 ing splines\, and neural networks\, which are trained without access to y\
 , perform similarly to their standard y-based counterparts. Hence\, our re
 sults suggest that the difference between supervised and unsupervised lear
 ning is less fundamental than it may appear.\nJoint work with Thomas B. Sc
 hön.\n
LOCATION:https://researchseminars.org/talk/gbgstats/93/
END:VEVENT
END:VCALENDAR
