BEGIN:VCALENDAR
VERSION:2.0
PRODID:researchseminars.org
CALSCALE:GREGORIAN
X-WR-CALNAME:researchseminars.org
BEGIN:VEVENT
SUMMARY:Isay Katsman (Yale)
DTSTART;VALUE=DATE-TIME:20221024T121500Z
DTEND;VALUE=DATE-TIME:20221024T131500Z
DTSTAMP;VALUE=DATE-TIME:20230925T220603Z
UID:MaML/1
DESCRIPTION:Title: Rie
mannian Geometry in Machine Learning\nby Isay Katsman (Yale) as part o
f Mathematics and Machine Learning\n\n\nAbstract\nAlthough machine learnin
g researchers have introduced a plethora of useful constructions for learn
ing over Euclidean space\, numerous types of data in various applications
benefit from\, if not necessitate\, a non-Euclidean treatment. In this tal
k I cover the need for Riemannian geometric constructs to (1) build more p
rincipled generalizations of common Euclidean operations used in geometric
machine learning models as well as to (2) enable general manifold density
learning in contexts that require it. Said contexts include theoretical p
hysics\, robotics\, and computational biology. I will cover one of my pape
rs that fits into (1) above\, namely the ICML 2020 paper “Differentiatin
g through the Fréchet Mean.” I will also cover two of my papers that fi
t into (2) above\, namely the NeurIPS 2020 paper “Neural Manifold ODEs
” and the NeurIPS 2021 paper “Equivariant Manifold Flows.” Finally\,
I will briefly discuss directions of relevant ongoing work.\n
LOCATION:https://researchseminars.org/talk/MaML/1/
END:VEVENT
BEGIN:VEVENT
SUMMARY:Sebastian Fischer (LMU Munich)
DTSTART;VALUE=DATE-TIME:20221107T131500Z
DTEND;VALUE=DATE-TIME:20221107T141500Z
DTSTAMP;VALUE=DATE-TIME:20230925T220603Z
UID:MaML/2
DESCRIPTION:Title: Ben
chmarking Machine Learning Methods Using OpenML and mlr3\nby Sebastian
Fischer (LMU Munich) as part of Mathematics and Machine Learning\n\n\nAbs
tract\nBenchmark studies are an integral part of machine learning research
. The two main components are the datasets that are used to compare the me
thods and the software that supports the researcher in carrying out the ex
periment. OpenML is a platform for sharing datasets and machine learning r
esults and is a great tool to obtain datasets.\nmlr3 is an ecosystem of ma
chine learning packages in the R language\, which among other things allow
s for benchmarking algorithms with ease. This presentation will show how O
penML and mlr3 can be used together to make benchmarking machine learning
methods as easy as possible\, by using the interface R package mlr3oml.\n
LOCATION:https://researchseminars.org/talk/MaML/2/
END:VEVENT
BEGIN:VEVENT
SUMMARY:Edward de Brouwer (KU Leuven)
DTSTART;VALUE=DATE-TIME:20221219T131500Z
DTEND;VALUE=DATE-TIME:20221219T141500Z
DTSTAMP;VALUE=DATE-TIME:20230925T220603Z
UID:MaML/3
DESCRIPTION:Title: Top
ological Graph Neural Networks\nby Edward de Brouwer (KU Leuven) as pa
rt of Mathematics and Machine Learning\n\n\nAbstract\nGraph neural network
s (GNNs) are a powerful architecture for tackling graph learning tasks\, y
et have been shown to be oblivious to eminent substructures such as cycles
. In this talk\, we introduce TOGL\, a novel layer that incorporates globa
l topological information of a graph using persistent homology. TOGL can b
e easily integrated into any type of GNN and is strictly more expressive (
in terms of the Weisfeiler–Lehman graph isomorphism test) than message-pass
ing GNNs. Augmenting GNNs with TOGL leads to improved predictive performan
ce for graph and node classification tasks\, both on synthetic data sets\,
which can be classified by humans using their topology but not by ordinar
y GNNs\, and on real-world data.\n
LOCATION:https://researchseminars.org/talk/MaML/3/
END:VEVENT
BEGIN:VEVENT
SUMMARY:Jim Halverson (Northeastern)
DTSTART;VALUE=DATE-TIME:20221121T131500Z
DTEND;VALUE=DATE-TIME:20221121T141500Z
DTSTAMP;VALUE=DATE-TIME:20230925T220603Z
UID:MaML/4
DESCRIPTION:Title: Mac
hine Learning for Pure Math\nby Jim Halverson (Northeastern) as part o
f Mathematics and Machine Learning\n\n\nAbstract\nProgress in machine lear
ning (ML) is poised to revolutionize a variety of STEM fields. But how cou
ld these techniques — which are often stochastic\, error-prone\, and bl
ack-box — lead to progress in pure mathematics\, which values rigor and un
derstanding? I will exemplify how ML can be used to generate conjectures i
n a Calabi-Yau singularity problem that is relevant for physics\, and will
demonstrate how reinforcement learning can yield truth certificates that
rigorously demonstrate properties of knots. The second half of the talk wi
ll utilize ML theory instead of applied ML. Specifically\, I will develop
a neural tangent kernel theory appropriate for flows in the space of metri
cs (realized as neural networks)\, and will realize Perelman’s formulati
on of Ricci flow as a specialization of the general theory.\n
LOCATION:https://researchseminars.org/talk/MaML/4/
END:VEVENT
BEGIN:VEVENT
SUMMARY:Alex Davies (DeepMind)
DTSTART;VALUE=DATE-TIME:20221212T120000Z
DTEND;VALUE=DATE-TIME:20221212T130000Z
DTSTAMP;VALUE=DATE-TIME:20230925T220603Z
UID:MaML/5
DESCRIPTION:Title: Mac
hine Learning with Mathematicians\nby Alex Davies (DeepMind) as part
of Mathematics and Machine Learning\n\n\nAbstract\nCan machine learning be
a useful tool for research mathematicians? There are many examples of mat
hematicians pioneering new technologies to aid our understanding of the ma
thematical world: using very early computers to help formulate the Birch a
nd Swinnerton-Dyer conjecture and using computer aid to prove the four col
our theorem are among the most notable. Up until now\, there hasn’t been
significant use of machine learning in the field and it hasn’t been cle
ar where it might be useful for the questions that mathematicians care abo
ut. In this talk\, we will discuss how\, working together with top mathe
maticians\, we used machine learning to achieve two new results – provi
ng a new connection between the hyperbolic and geometric structure of kno
ts\, and
conjecturing a resolution to a 50-year problem in representation theory\,
the combinatorial invariance conjecture. Through these examples\, we demon
strate a way that machine learning can be used by mathematicians to help g
uide the development of surprising and beautiful new conjectures.\n
LOCATION:https://researchseminars.org/talk/MaML/5/
END:VEVENT
BEGIN:VEVENT
SUMMARY:Noemi Montobbio (IIT)
DTSTART;VALUE=DATE-TIME:20230109T131500Z
DTEND;VALUE=DATE-TIME:20230109T141500Z
DTSTAMP;VALUE=DATE-TIME:20230925T220603Z
UID:MaML/6
DESCRIPTION:Title: Eme
rgence of Lie Symmetries in Functional Architectures Learned by CNNs\n
by Noemi Montobbio (IIT) as part of Mathematics and Machine Learning\n\n\n
Abstract\nConvolutional Neural Networks (CNNs) are a powerful tool providi
ng outstanding performances on image classification tasks\, based on an ar
chitecture designed in analogy with information processing in biological v
isual systems. The functional architectures of the early visual pathways h
ave often been described in terms of geometric invariances\, and several s
tudies have leveraged this framework to investigate the analogies between
CNN models and biological mechanisms. Remarkably\, upon learning on natura
l images\, the translation-invariant filters of the first layer of a CNN h
ave been shown to develop as approximate Gabor functions\, resembling the
orientation-selective receptive profiles found in the primary visual corte
x (V1). With a similar approach\, we modified a standard CNN architecture
to insert computational blocks compatible with specific biological process
ing stages\, and studied the spontaneous development of approximate geomet
ric invariances after training the network on natural images. In particula
r\, inserting a pre-filtering step mimicking the Lateral Geniculate Nucleu
s (LGN) led to the emergence of a radially symmetric profile well approxim
ated by a Laplacian of Gaussian\, which is a well-known model of receptive
profiles of LGN cells. Moreover\, we introduced a lateral connectivity ke
rnel acting on the feature space of the first network layer. We then studi
ed the learned connectivity as a function of relative tuning of first-laye
r filters\, thus re-mapping it into the roto-translation space. This analy
sis revealed orientation-specific patterns\, which we compared qualitative
ly and quantitatively with established group-based models of V1 horizontal
connectivity.\n
LOCATION:https://researchseminars.org/talk/MaML/6/
END:VEVENT
BEGIN:VEVENT
SUMMARY:Anna Seigal (Harvard)
DTSTART;VALUE=DATE-TIME:20230123T141500Z
DTEND;VALUE=DATE-TIME:20230123T151500Z
DTSTAMP;VALUE=DATE-TIME:20230925T220603Z
UID:MaML/7
DESCRIPTION:by Anna Seigal (Harvard) as part of Mathematics and Machine Le
arning\n\nAbstract: TBA\n
LOCATION:https://researchseminars.org/talk/MaML/7/
END:VEVENT
END:VCALENDAR