BEGIN:VCALENDAR
VERSION:2.0
PRODID:researchseminars.org
CALSCALE:GREGORIAN
X-WR-CALNAME:researchseminars.org
BEGIN:VEVENT
SUMMARY:Oskar Allerbo (Chalmers University of Technology & University of G
 othenburg)
DTSTART:20230511T111500Z
DTEND:20230511T120000Z
DTSTAMP:20260422T155153Z
UID:gbgstats/26
DESCRIPTION:Title: <a href="https://researchseminars.org/talk/gbgstats/26/
 ">Solving Kernel Ridge Regression with Gradient Descent</a>\nby Oskar All
 erbo (Chalmers University of Technology & University of Gothenburg) as pa
 rt of Gothenburg statistics seminar\n\nLecture held in MVL14.\n\nAbstract
 \nWe present an equivalent formulation of the objective function of kerne
 l ridge regression (KRR) that makes it possible to study KRR from the per
 spective of gradient descent. Using gradient descent with infinitesimal s
 tep size allows us to formulate a new regularization for kernel regressio
 n through early stopping.\n\nThe gradient descent formulation of KRR allo
 ws us to extend to a time-dependent stationary kernel\, where we decrease
  the bandwidth to zero during training. This circumvents the need for hyp
 erparameter selection. Furthermore\, we are able to achieve both zero tra
 ining error and double descent behavior\, phenomena that do not occur for
  KRR with constant bandwidth but are known to appear for neural networks.
 \n\nThe new formulation of KRR also enables us to explore penalties other
  than the ridge penalty. Specifically\, we explore the $\\ell_1$ and $\\e
 ll_\\infty$ penalties and show that these correspond to two flavors of gr
 adient descent\, thus alleviating the need for computationally heavy prox
 imal gradient descent algorithms. We show theoretically and empirically h
 ow these formulations correspond to signal-driven and robust regression\,
  respectively.\n
LOCATION:https://researchseminars.org/talk/gbgstats/26/
END:VEVENT
END:VCALENDAR
