BEGIN:VCALENDAR
VERSION:2.0
PRODID:researchseminars.org
CALSCALE:GREGORIAN
X-WR-CALNAME:researchseminars.org
BEGIN:VEVENT
SUMMARY:Jeff Calder (University of Minnesota)
DTSTART:20210525T141500Z
DTEND:20210525T154500Z
DTSTAMP:20260423T022602Z
UID:MathDeep/6
DESCRIPTION:Title: <a href="https://researchseminars.org/talk/MathDeep/6/"
 >Random walks and PDEs in graph-based learning</a>\nby Jeff Calder (Univer
 sity of Minnesota) as part of Mathematics of Deep Learning\n\n\nAbstract\n
 I will discuss some applications of random walks and PDEs in graph-based l
 earning\, both for theoretical analysis and algorithm development. Graph-b
 ased learning is a field within machine learning that uses similarities be
 tween datapoints to create efficient representations of high-dimensional d
 ata for tasks like semi-supervised classification\, clustering and dimensi
 on reduction. There has been considerable interest recently in semi-superv
 ised learning problems with very few labeled examples (e.g.\, 1 label per 
 class). The widely used Laplacian regularization is ill-posed at low label
  rates and gives very poor classification results. In the first part of th
 e talk\, we will use the random walk interpretation of the graph Laplacian
  to precisely characterize the lowest label rate at which Laplacian regula
 rized semi-supervised learning is well-posed. At lower label rates\, where
  Laplace learning performs poorly\, we will show how our random walk analy
 sis leads to a new algorithm\, called Poisson learning\, that is provably
  more stable and informative than Laplace learning. We will conclude with s
 ome applications of Poisson learning to image classification and mesh segm
 entation of broken bone fragments of interest in anthropology.\n
LOCATION:https://researchseminars.org/talk/MathDeep/6/
END:VEVENT
END:VCALENDAR
