BEGIN:VCALENDAR
VERSION:2.0
PRODID:researchseminars.org
CALSCALE:GREGORIAN
X-WR-CALNAME:researchseminars.org
BEGIN:VEVENT
SUMMARY:Alexey Pozdnyakov (University of Connecticut)
DTSTART;VALUE=DATE-TIME:20220825T120000Z
DTEND;VALUE=DATE-TIME:20220825T130000Z
DTSTAMP;VALUE=DATE-TIME:20220816T034806Z
UID:Danger2022/1
DESCRIPTION:Title: Murmurations of elliptic curves\nby Alexey Pozdnyakov (University o
f Connecticut) as part of DANGER2: Data\, Numbers\, and Geometry\n\n\nAbst
ract\nThis talk will review some data-scientific experiments involving ari
thmetic objects\, focusing mainly on averaging the Dirichlet coefficients
associated to elliptic curves within fixed conductor ranges. In particular
\, a surprising oscillation that appears in these averages will be discuss
ed. This talk is based on work with He\, Lee\, and Oliver.\n
LOCATION:https://researchseminars.org/talk/Danger2022/1/
END:VEVENT
BEGIN:VEVENT
SUMMARY:Margaret Regan (Duke University)
DTSTART;VALUE=DATE-TIME:20220825T131500Z
DTEND;VALUE=DATE-TIME:20220825T141500Z
DTSTAMP;VALUE=DATE-TIME:20220816T034806Z
UID:Danger2022/2
DESCRIPTION:Title: Using data as an input to parameterized polynomial systems\nby Marg
aret Regan (Duke University) as part of DANGER2: Data\, Numbers\, and Geom
etry\n\n\nAbstract\nParameterized systems of polynomial equations arise in
many applications including computer vision\, chemistry\, and kinematics.
Numerical homotopy continuation methods are a fundamental technique with
in numerical algebraic geometry for both solving these polynomial systems
and determining more refined information about their structure. Imperativ
e to these solving methods is the use of data — either synthetic or from
the application itself\, such as image pixel data for computer vision and
leg length parameters for kinematics. This talk will highlight various u
ses of data within computer vision and machine learning applications.\n
LOCATION:https://researchseminars.org/talk/Danger2022/2/
END:VEVENT
BEGIN:VEVENT
SUMMARY:Henry Kvinge (Pacific Northwest National Lab)
DTSTART;VALUE=DATE-TIME:20220825T150000Z
DTEND;VALUE=DATE-TIME:20220825T160000Z
DTSTAMP;VALUE=DATE-TIME:20220816T034806Z
UID:Danger2022/3
DESCRIPTION:Title: How Deep Learning is Being Made More Robust\, More Domain-Aware\, and M
ore Capable of Generalization Through the Influence of Algebra and Topolog
y\nby Henry Kvinge (Pacific Northwest National Lab) as part of DANGER2
: Data\, Numbers\, and Geometry\n\n\nAbstract\nDriven by enormous amounts
of data and compute\, deep learning-based models continue to surpass yeste
rday’s benchmarks. In this fast-growing field where machine learning (ML
) is applied in more and more domains\, there is a constant need for new w
ays of looking at problems. Recent years have seen the rise of tools deriv
ed from topology and algebra\, which are not traditionally associated with
ML. In this talk I will begin by surveying some of the recent application
s of these fields in ML\, from hardcoding equivariance into vision models\
, to using sheaves to better enable learning on graphs. I will argue that
pure mathematics will increasingly offer critical tools necessary for a mo
re mature approach to machine learning. I will end by discussing some of m
y team’s recent work which\, inspired by the notion of a fiber bundle\,
developed a novel deep learning architecture to solve a challenging proble
m in materials science.\n
LOCATION:https://researchseminars.org/talk/Danger2022/3/
END:VEVENT
BEGIN:VEVENT
SUMMARY:Jesus De Loera (University of California\, Davis)
DTSTART;VALUE=DATE-TIME:20220825T161500Z
DTEND;VALUE=DATE-TIME:20220825T171500Z
DTSTAMP;VALUE=DATE-TIME:20220816T034806Z
UID:Danger2022/4
DESCRIPTION:Title: On the Discrete Geometric Principles of Machine Learning and Statistica
l Inference\nby Jesus De Loera (University of California\, Davis) as p
art of DANGER2: Data\, Numbers\, and Geometry\n\n\nAbstract\nIn this talk
I explain the fertile relationship of inference and learning to combi
natorial geometry. My presentation covers several powerful situations wh
ere famous theorems in discrete geometry answered natural questions from
machine learning and statistical inference: In this tasting tour I will in
clude the problem of deciding the existence of a maximum likelihood estimato
r in multiclass logistic regression\, the variability of behavior of k-mea
ns algorithms with distinct random initializations and the shapes of the c
lusters\, and the estimation of the number of samples in chance-constraine
d optimization models. These obviously only scratch the surface of what on
e could do with extra free time. Along the way we will see fascinating con
nections to the coupon collector problem\, topological data analysis\, mea
sures of separability of data\, and to the computation of Tukey centerpoin
ts of data clouds (a high-dimensional generalization of the median). All new t
heorems are joint work with subsets of the following wonderful folks: T. H
ogan\, D. Oliveros\, E. Jaramillo-Rodriguez\, and A. Torres-Hernandez.\n
LOCATION:https://researchseminars.org/talk/Danger2022/4/
END:VEVENT
BEGIN:VEVENT
SUMMARY:Geordie Williamson (University of Sydney)
DTSTART;VALUE=DATE-TIME:20220826T090000Z
DTEND;VALUE=DATE-TIME:20220826T100000Z
DTSTAMP;VALUE=DATE-TIME:20220816T034806Z
UID:Danger2022/5
DESCRIPTION:Title: Equivariant deep learning: a hammer looking for a nail\nby Geordie
Williamson (University of Sydney) as part of DANGER2: Data\, Numbers\, and
Geometry\n\n\nAbstract\nOften one wants to learn quantities which are inv
ariant or equivariant with respect to a group. For example\, the decision
as to whether there is a tiger nearby should not depend on the precise pos
ition of your head and thus this decision should be rotation invariant. An
other example: quantities that appear in the analysis of point clouds ofte
n do not depend on the labelling of the points\, and are therefore invaria
nt under a large symmetric group. I will explain how to build networks whi
ch are equivariant with respect to a group action. What ensues is a fasci
nating interplay between group theory\, representation theory and deep lea
rning. Examples based on translations or rotations recover familiar convol
utional neural nets\, however the theory gives a blueprint for learning in
the presence of complicated symmetry. These architectures appear very use
ful to mathematicians\, but I am not aware of any major applications in ma
thematics as yet. Thus the nail of the title. Most of this talk will be a
review of ideas and techniques well known to the geometric deep learnin
g community. New material is joint work with Joel Gibson (Sydney) and Seba
stien Racaniere (DeepMind).\n
LOCATION:https://researchseminars.org/talk/Danger2022/5/
END:VEVENT
BEGIN:VEVENT
SUMMARY:Anthea Monod (Imperial College London)
DTSTART;VALUE=DATE-TIME:20220826T101500Z
DTEND;VALUE=DATE-TIME:20220826T111500Z
DTSTAMP;VALUE=DATE-TIME:20220816T034806Z
UID:Danger2022/6
DESCRIPTION:Title: Approximating Persistent Homology for Large Datasets\nby Anthea Mon
od (Imperial College London) as part of DANGER2: Data\, Numbers\, and Geom
etry\n\n\nAbstract\nPersistent homology is an important methodology from t
opological data analysis which adapts theory from algebraic topology to da
ta settings and has been successfully implemented in many applications\; i
t produces a summary in the form of a persistence diagram\, which captures
the shape and size of the data. Despite its popularity\, persistent homol
ogy is simply impossible to compute for very large datasets\, which prohibit
s its widespread use in many big data settings. What can we do if we would
like a representative persistence diagram for a very large dataset whose
persistent homology we cannot compute due to size restrictions? We adapt h
ere the classical statistical method of bootstrapping\, namely\, drawing a
nd studying smaller subsamples from the large dataset. We show that the me
an of the persistence diagrams of subsamples is a valid approximation of t
he persistence diagram of the large dataset and derive its convergence rat
e to the true persistent homology of the large dataset. We demonstrate our
approach on synthetic and real data\; furthermore\, we give an example of
the utility of our approach in a shape clustering problem where we are ab
le to obtain accurate results with only 2% subsampled from the original da
taset. This is joint work with Yueqi Cao (Imperial College London).\n
LOCATION:https://researchseminars.org/talk/Danger2022/6/
END:VEVENT
BEGIN:VEVENT
SUMMARY:Michael R Douglas (Simons Center for Geometry and Physics)
DTSTART;VALUE=DATE-TIME:20220826T130000Z
DTEND;VALUE=DATE-TIME:20220826T140000Z
DTSTAMP;VALUE=DATE-TIME:20220816T034806Z
UID:Danger2022/7
DESCRIPTION:Title: Numerical methods for Calabi-Yau and special geometry metrics\nby M
ichael R Douglas (Simons Center for Geometry and Physics) as part of DANGE
R2: Data\, Numbers\, and Geometry\n\n\nAbstract\nWe discuss recent work in
which machine learning techniques and software have been adapted to compu
te Ricci flat metrics on Calabi-Yau threefolds\, and ongoing work to compu
te G2 metrics and other geometric structures. Based on work with Rodrigo
Barbosa\, Yidi Qi\, and Subramanian Lakshminarasimhan.\n
LOCATION:https://researchseminars.org/talk/Danger2022/7/
END:VEVENT
BEGIN:VEVENT
SUMMARY:Maria Cameron (University of Maryland)
DTSTART;VALUE=DATE-TIME:20220826T141500Z
DTEND;VALUE=DATE-TIME:20220826T151500Z
DTSTAMP;VALUE=DATE-TIME:20220816T034806Z
UID:Danger2022/8
DESCRIPTION:Title: Computing Committors in Collective Variables via Mahalanobis Diffusion
Maps\nby Maria Cameron (University of Maryland) as part of DANGER2: Da
ta\, Numbers\, and Geometry\n\n\nAbstract\nMany interesting problems conce
rned with rare event quantification arise in chemical physics. A typical p
roblem is finding reaction channels and transition rates for conformationa
l changes in a biomolecule. To reduce the dimensionality and make the descri
ption of transition processes more comprehensible\, often a set of physica
lly motivated collective variables (dihedral angles\, distances between pa
rticular pairs of key atoms\, etc.) is introduced by means of mapping atom
ic coordinates to a low-dimensional space and averaging. The dynamics in c
ollective variables remain time-reversible but acquire an anisotropic and
position-dependent diffusion tensor. In this talk\, I will discuss how one
can adapt the diffusion map algorithm with the Mahalanobis kernel to appr
oximate the generator of this diffusion process and use it to compute the
committor function\, the reactive current\, and the transition rate. Appli
cations to alanine-dipeptide and Lennard-Jones-7 in 2D will be presented.\
n
LOCATION:https://researchseminars.org/talk/Danger2022/8/
END:VEVENT
END:VCALENDAR