BEGIN:VCALENDAR
VERSION:2.0
PRODID:researchseminars.org
CALSCALE:GREGORIAN
X-WR-CALNAME:researchseminars.org
BEGIN:VEVENT
SUMMARY:Fabian Ruehle (Northeastern University)
DTSTART:20240517T070000Z
DTEND:20240517T080000Z
DTSTAMP:20260423T021020Z
UID:UNISTMath/2
DESCRIPTION:Title: <a href="https://researchseminars.org/talk/UNISTMath/2/
 ">Kolmogorov-Arnold Networks</a>\nby Fabian Ruehle (Northeastern Universit
 y) as part of UNIST Mathematical Sciences Seminar Series\n\n\nAbstract\nWe
  introduce Kolmogorov-Arnold Networks (KANs) as an alternative to standard
  feed-forward neural networks. KANs are based on the Kolmogorov-Arnold
  representation theorem\, which means for our purposes that we can
  represent any fu
 nction we want to learn by a weighted sum over basis functions\, taken to 
 be splines. In contrast to standard MLPs\, the function basis of KANs is f
 ixed to be piecewise polynomial rather than a combination of weights and n
 on-linearities\, and we only learn the parameters that control the individ
 ual splines. While this is more expensive than a standard MLP\, KANs have 
 two properties that can offset this cost. First\, KANs can typically work 
 with far fewer parameters. Second\, they exhibit better neural scaling la
 ws\, meaning the error decreases faster as the number of parameters
  increases\, compared to MLPs. Fewer parameters also mean that KANs are
  much more interpretable\, especially when combined with the
  sparsification and pruning techniques we introduce. This makes KANs
  interesting as tools for symbolic regression and for scientific
  discovery. We discuss an example from knot theory\, where we could
  recover (trivial and non-trivial) relations among knot
  invariants.\n\nRegister for the Zoom link:\nhttps://us06web.zoom.us/meet
 ing/register/tZMkcOqtrDMuGNAQcMlvp3-MJwcWXVU6fzXl\n
LOCATION:https://researchseminars.org/talk/UNISTMath/2/
END:VEVENT
END:VCALENDAR
