BEGIN:VCALENDAR
VERSION:2.0
PRODID:researchseminars.org
CALSCALE:GREGORIAN
X-WR-CALNAME:researchseminars.org
BEGIN:VEVENT
SUMMARY:Shing-Tung Yau (Harvard University\, Tsinghua University and BIMSA
 )
DTSTART:20210126T140000Z
DTEND:20210126T141000Z
DTSTAMP:20260416T124203Z
UID:SanyaAGandML/1
DESCRIPTION:Title: <a href="https://researchseminars.org/talk/SanyaAGandML
 /1/">Introduction and Welcome</a>\nby Shing-Tung Yau (Harvard University\,
  Tsinghua University and BIMSA) as part of Sanya Workshop on Algebraic Geo
 metry and Machine Learning\n\nAbstract: TBA\n
LOCATION:https://researchseminars.org/talk/SanyaAGandML/1/
END:VEVENT
BEGIN:VEVENT
SUMMARY:Giuseppa Pitton (Imperial College London)
DTSTART:20210126T150000Z
DTEND:20210126T153000Z
DTSTAMP:20260416T124203Z
UID:SanyaAGandML/2
DESCRIPTION:Title: <a href="https://researchseminars.org/talk/SanyaAGandML
 /2/">Computation\, Data Analysis\, and Statistical Inference for classes o
 f Maximally-Mutable Laurent Polynomials</a>\nby Giuseppa Pitton (Imperial 
 College London) as part of Sanya Workshop on Algebraic Geometry and Machin
 e Learning\n\n\nAbstract\nOver the last 10 years\, the collective efforts 
 of pure mathematicians involved in the Fanosearch project [1\, 2] have led
  to remarkable advances in the understanding of Fano classificati
 on using ideas from Mirror Symmetry.\nA fundamental part of the Fanosearch
  project is the computation of some Laurent polynomials\, called Maximally
 -Mutable Laurent Polynomials\, that are naturally associated to a class of
  lattice polytopes.\nDespite the tremendous computational challenges invol
 ved\, we can construct mirror polynomials for large sets of lattice poly
 topes.\nIn this talk I will describe some recent computational results for
  the specific case of canonical Fano 3-topes.\nI will discuss in particula
 r how Data Analysis helps us explore the wealth of information that result
 s from our computations\, and how we take advantage of algorithms from Sta
 tistical Inference to test known and conjectured results.\nThis is joint
  work (in progress) with Tom Coates and Alexander Kasprzyk.\n[1] doi:10.41
 71/120-1/16\n[2] doi:10.1090/proc/12876\n
LOCATION:https://researchseminars.org/talk/SanyaAGandML/2/
END:VEVENT
BEGIN:VEVENT
SUMMARY:Wenrui Hao (Penn State University)
DTSTART:20210126T163000Z
DTEND:20210126T170000Z
DTSTAMP:20260416T124203Z
UID:SanyaAGandML/3
DESCRIPTION:Title: <a href="https://researchseminars.org/talk/SanyaAGandML
 /3/">Homotopy training algorithm for neural networks and applications in p
 attern formation</a>\nby Wenrui Hao (Penn State University) as part of San
 ya Workshop on Algebraic Geometry and Machine Learning\n\nAbstract: TBA\n
LOCATION:https://researchseminars.org/talk/SanyaAGandML/3/
END:VEVENT
BEGIN:VEVENT
SUMMARY:Joe Kileel (University of Texas at Austin)
DTSTART:20210126T171000Z
DTEND:20210126T174000Z
DTSTAMP:20260416T124203Z
UID:SanyaAGandML/4
DESCRIPTION:Title: <a href="https://researchseminars.org/talk/SanyaAGandML
 /4/">Geometry and Optimization of Shallow Polynomial Networks</a>\nby Joe 
 Kileel (University of Texas at Austin) as part of Sanya Workshop on Algebr
 aic Geometry and Machine Learning\n\nAbstract: TBA\n
LOCATION:https://researchseminars.org/talk/SanyaAGandML/4/
END:VEVENT
BEGIN:VEVENT
SUMMARY:Tingting Tang (San Diego State University)
DTSTART:20210126T175000Z
DTEND:20210126T182000Z
DTSTAMP:20260416T124203Z
UID:SanyaAGandML/5
DESCRIPTION:Title: <a href="https://researchseminars.org/talk/SanyaAGandML
 /5/">The Loss Surface Of Deep Linear Networks Viewed Through The Algebraic
  Geometry Lens</a>\nby Tingting Tang (San Diego State University) as part 
 of Sanya Workshop on Algebraic Geometry and Machine Learning\n\nAbstract: 
 TBA\n
LOCATION:https://researchseminars.org/talk/SanyaAGandML/5/
END:VEVENT
BEGIN:VEVENT
SUMMARY:Bernard Mourrain (Inria)
DTSTART:20210127T140000Z
DTEND:20210127T143000Z
DTSTAMP:20260416T124203Z
UID:SanyaAGandML/6
DESCRIPTION:Title: <a href="https://researchseminars.org/talk/SanyaAGandML
 /6/">The Geometry of Moments\, Tensor Decomposition\, Machine Learning and
  Applications</a>\nby Bernard Mourrain (Inria) as part of Sanya Workshop o
 n Algebraic Geometry and Machine Learning\n\nAbstract: TBA\n
LOCATION:https://researchseminars.org/talk/SanyaAGandML/6/
END:VEVENT
BEGIN:VEVENT
SUMMARY:Kathlén Kohn (KTH Royal Institute of Technology)
DTSTART:20210127T144000Z
DTEND:20210127T151000Z
DTSTAMP:20260416T124203Z
UID:SanyaAGandML/7
DESCRIPTION:Title: <a href="https://researchseminars.org/talk/SanyaAGandML
 /7/">The Geometry of Neural Networks</a>\nby Kathlén Kohn (KTH Royal Inst
 itute of Technology) as part of Sanya Workshop on Algebraic Geometry and M
 achine Learning\n\nAbstract: TBA\n
LOCATION:https://researchseminars.org/talk/SanyaAGandML/7/
END:VEVENT
BEGIN:VEVENT
SUMMARY:Zehua Lai (University of Chicago)
DTSTART:20210127T163000Z
DTEND:20210127T170000Z
DTSTAMP:20260416T124203Z
UID:SanyaAGandML/8
DESCRIPTION:Title: <a href="https://researchseminars.org/talk/SanyaAGandML
 /8/">Noncommutative Arithmetic-Geometric Mean Conjecture is False</a>\nby 
 Zehua Lai (University of Chicago) as part of Sanya Workshop on Algebraic G
 eometry and Machine Learning\n\n\nAbstract\nStochastic optimization algori
 thms have become indispensable in modern machine learning. An important qu
 estion in this area is the difference between with-replacement sampling an
 d without-replacement sampling --- does the latter have a superior converg
 ence rate compared to the former? A paper of Recht and Re reduces the proble
 m to a noncommutative analogue of the arithmetic-geometric mean inequality
  where n positive numbers are replaced by n positive definite matrices. If
  this inequality holds for all n\, then without-replacement sampling (also
  known as random reshuffling) indeed outperforms with-replacement sampling
 in some important optimization problems. In this talk\, we will explain b
 asic ideas and techniques in polynomial optimization and the theory of non
 commutative Positivstellensatz\, which allows us to reduce the conjectured
  inequality to a semidefinite program and the validity of the conjecture t
 o certain bounds for the optimum values. Finally\, we show that the Recht-
 -Re conjecture is false as soon as $n = 5$. This is joint work with Lek-He
 ng Lim.\n
LOCATION:https://researchseminars.org/talk/SanyaAGandML/8/
END:VEVENT
BEGIN:VEVENT
SUMMARY:Margaret Regan (Duke University)
DTSTART:20210127T171000Z
DTEND:20210127T174000Z
DTSTAMP:20260416T124203Z
UID:SanyaAGandML/9
DESCRIPTION:Title: <a href="https://researchseminars.org/talk/SanyaAGandML
 /9/">Using machine learning to determine the real discriminant locus</a>\n
 by Margaret Regan (Duke University) as part of Sanya Workshop on Algebraic
  Geometry and Machine Learning\n\n\nAbstract\nParameterized systems of pol
 ynomial equations arise in many applications in science and engineering wi
 th the real solutions describing\, for example\, equilibria of a dynamical
  system\, linkages satisfying design constraints\, and scene reconstructio
 n in computer vision. Since different parameter values can have a differen
 t number of real solutions\, the parameter space is decomposed into region
 s whose boundary forms the real discriminant locus.  In this talk\, I will
  discuss a novel sampling method for multidimensional parameter spaces and
  how it is used in various machine learning algorithms to locate the real 
 discriminant locus as a supervised classification problem\, where the clas
 ses are the number of real solutions.  Examples such as the Kuramoto model
  will be used to show the efficacy of the methods.  Finally\, an applicati
 on to real parameter homotopy methods will be presented. This project is j
 oint work with Edgar Bernal\, Jonathan Hauenstein\, Dhagash Mehta\, and Ti
 ngting Tang.\n
LOCATION:https://researchseminars.org/talk/SanyaAGandML/9/
END:VEVENT
BEGIN:VEVENT
SUMMARY:Michael Kirby (Colorado State University)
DTSTART:20210127T175000Z
DTEND:20210127T182000Z
DTSTAMP:20260416T124203Z
UID:SanyaAGandML/10
DESCRIPTION:Title: <a href="https://researchseminars.org/talk/SanyaAGandML
 /10/">The Grassmann and Flag manifolds for Analyzing Big Data</a>\nby Mich
 ael Kirby (Colorado State University) as part of Sanya Workshop on Algebra
 ic Geometry and Machine Learning\n\nAbstract: TBA\n
LOCATION:https://researchseminars.org/talk/SanyaAGandML/10/
END:VEVENT
BEGIN:VEVENT
SUMMARY:Challenger Mishra (University of Cambridge)
DTSTART:20210128T140000Z
DTEND:20210128T143000Z
DTSTAMP:20260416T124203Z
UID:SanyaAGandML/11
DESCRIPTION:Title: <a href="https://researchseminars.org/talk/SanyaAGandML
 /11/">Neural Network Approximations for Calabi-Yau Metrics</a>\nby Challen
 ger Mishra (University of Cambridge) as part of Sanya Workshop on Algebrai
 c Geometry and Machine Learning\n\n\nAbstract\nRicci flat metrics for Cala
 bi-Yau threefolds are not known analytically. In this work\, we employ tec
 hniques from machine learning to deduce numerical flat metrics for the Fer
 mat quintic\, for the Dwork quintic\, and for the Tian-Yau manifold. This 
 investigation employs a single neural network architecture that is capable
  of approximating Ricci flat Kaehler metrics for several Calabi-Yau manifo
 lds of dimensions two and three. We show that measures that assess the Ric
 ci flatness of the geometry decrease after training by three orders of mag
 nitude. This is corroborated on the validation set\, where the improvement
  is more modest. Finally\, we demonstrate that discrete symmetries of mani
 folds can be learned in the process of learning the metric.\n
LOCATION:https://researchseminars.org/talk/SanyaAGandML/11/
END:VEVENT
BEGIN:VEVENT
SUMMARY:Sven Krippendorf (LMU)
DTSTART:20210128T144000Z
DTEND:20210128T151000Z
DTSTAMP:20260416T124203Z
UID:SanyaAGandML/12
DESCRIPTION:Title: <a href="https://researchseminars.org/talk/SanyaAGandML
 /12/">Learning Symmetries and Conserved Quantities of Physical Systems</a>
 \nby Sven Krippendorf (LMU) as part of Sanya Workshop on Algebraic Geometr
 y and Machine Learning\n\nAbstract: TBA\n
LOCATION:https://researchseminars.org/talk/SanyaAGandML/12/
END:VEVENT
BEGIN:VEVENT
SUMMARY:Shailesh Lal (Universidade do Porto)
DTSTART:20210128T152000Z
DTEND:20210128T155000Z
DTSTAMP:20260416T124203Z
UID:SanyaAGandML/13
DESCRIPTION:Title: <a href="https://researchseminars.org/talk/SanyaAGandML
 /13/">Machine Learning Etudes For Symmetries</a>\nby Shailesh Lal (Univers
 idade do Porto) as part of Sanya Workshop on Algebraic Geometry and Machin
 e Learning\n\n\nAbstract\nWe demonstrate how modest feed-forward neural ne
 ts learn symmetry in datasets\, with special focus on conformal symmetry. 
 We also show how aspects of Lie algebra representation theory computations
  are machine learnable. The talk will be based on 2006.16114 and 2011.0087
 1.\n
LOCATION:https://researchseminars.org/talk/SanyaAGandML/13/
END:VEVENT
BEGIN:VEVENT
SUMMARY:Fabian Ruehle (CERN)
DTSTART:20210128T163000Z
DTEND:20210128T170000Z
DTSTAMP:20260416T124203Z
UID:SanyaAGandML/14
DESCRIPTION:Title: <a href="https://researchseminars.org/talk/SanyaAGandML
 /14/">Moduli-dependent Calabi-Yau and SU(3)-structure metrics from Machine
  Learning</a>\nby Fabian Ruehle (CERN) as part of Sanya Workshop on Algebr
 aic Geometry and Machine Learning\n\n\nAbstract\nCalabi-Yau manifolds play
  a crucial role in string compactifications. Yau's theorem guarantees the 
 existence of a metric that satisfies the string's equation of motion. Howe
 ver\, Yau's proof is non-constructive\, and no analytic expressions for me
 trics on Calabi-Yau threefolds are known. We use neural networks to learn 
 Calabi-Yau metrics and their complex structure moduli dependence. After a 
 short introduction to CY manifolds\, I will illustrate how we train neural
  networks to find Calabi-Yau metrics by using the underlying partial diffe
 rential equations as loss functions. The approach generalizes to more gene
 ral manifolds and can hence also be used for manifolds with reduced struct
 ure\, such as SU(3) structure or G2 manifolds\, which feature in string co
 mpactifications with flux and in the M-theory formulation of string theory
 \, respectively. I will illustrate this generalization for a particular SU
 (3) structure metric and compare the machine learning result to the known\
 , analytic expression.\n
LOCATION:https://researchseminars.org/talk/SanyaAGandML/14/
END:VEVENT
BEGIN:VEVENT
SUMMARY:Jim Halverson (Northeastern University)
DTSTART:20210128T171000Z
DTEND:20210128T174000Z
DTSTAMP:20260416T124203Z
UID:SanyaAGandML/15
DESCRIPTION:Title: <a href="https://researchseminars.org/talk/SanyaAGandML
 /15/">Knots and Natural Language</a>\nby Jim Halverson (Northeastern Unive
 rsity) as part of Sanya Workshop on Algebraic Geometry and Machine Learnin
 g\n\n\nAbstract\nWe introduce natural language processing into the study o
 f knot theory\, as made natural by the braid word representation of knots.
  We study the UNKNOT problem of determining whether or not a given knot is
  the unknot. After describing an algorithm to randomly generate $N$-crossi
 ng braids and their knot closures and discussing the induced prior on the 
 distribution of knots\, we apply binary classification to the UNKNOT decis
 ion problem. We find that the Reformer and shared-QK Transformer network a
 rchitectures outperform fully-connected networks\, though all perform well
 . Perhaps surprisingly\, we find that accuracy increases with the length o
 f the braid word\, and that the networks learn a direct correlation betwee
 n the confidence of their predictions and the degree of the Jones polynomi
 al. Finally\, we utilize reinforcement learning (RL) to find sequences of 
 Markov moves and braid relations that simplify knots and can identify unkn
 ots by explicitly giving the sequence of unknotting actions. Trust region 
 policy optimization (TRPO) performs consistently well for a wide range of 
 crossing numbers and thoroughly outperforms other RL algorithms and rando
 m walkers. Studying these actions\, we find that braid relations are more 
 useful in simplifying to the unknot than one of the Markov moves.\n
LOCATION:https://researchseminars.org/talk/SanyaAGandML/15/
END:VEVENT
BEGIN:VEVENT
SUMMARY:Bernd Sturmfels (MPI Leipzig)
DTSTART:20210128T175000Z
DTEND:20210128T182000Z
DTSTAMP:20260416T124203Z
UID:SanyaAGandML/16
DESCRIPTION:Title: <a href="https://researchseminars.org/talk/SanyaAGandML
 /16/">Wasserstein distance to independence models</a>\nby Bernd Sturmfels 
 (MPI Leipzig) as part of Sanya Workshop on Algebraic Geometry and Machine 
 Learning\n\nAbstract: TBA\n
LOCATION:https://researchseminars.org/talk/SanyaAGandML/16/
END:VEVENT
BEGIN:VEVENT
SUMMARY:Rak-Kyeong Seong (Samsung)
DTSTART:20210129T120000Z
DTEND:20210129T123000Z
DTSTAMP:20260416T124203Z
UID:SanyaAGandML/17
DESCRIPTION:Title: <a href="https://researchseminars.org/talk/SanyaAGandML
 /17/">Reinforcement Learning for Optimization Problems</a>\nby Rak-Kyeong 
 Seong (Samsung) as part of Sanya Workshop on Algebraic Geometry and Machin
 e Learning\n\nAbstract: TBA\n
LOCATION:https://researchseminars.org/talk/SanyaAGandML/17/
END:VEVENT
BEGIN:VEVENT
SUMMARY:Vishnu Jejjala (University of the Witwatersrand)
DTSTART:20210129T124000Z
DTEND:20210129T131000Z
DTSTAMP:20260416T124203Z
UID:SanyaAGandML/18
DESCRIPTION:Title: <a href="https://researchseminars.org/talk/SanyaAGandML
(K)not">
 /18/">(K)not Machine Learning</a>\nby Vishnu Jejjala (University of the W
 itwatersrand) as part of Sanya Workshop on Algebraic Geometry and Machine 
 Learning\n\nAbstract: TBA\n
LOCATION:https://researchseminars.org/talk/SanyaAGandML/18/
END:VEVENT
BEGIN:VEVENT
SUMMARY:Andre Lukas (Oxford)
DTSTART:20210129T132000Z
DTEND:20210129T135000Z
DTSTAMP:20260416T124203Z
UID:SanyaAGandML/19
DESCRIPTION:Title: <a href="https://researchseminars.org/talk/SanyaAGandML
 /19/">String Data and Machine Learning</a>\nby Andre Lukas (Oxford) as par
 t of Sanya Workshop on Algebraic Geometry and Machine Learning\n\nAbstract
 : TBA\n
LOCATION:https://researchseminars.org/talk/SanyaAGandML/19/
END:VEVENT
BEGIN:VEVENT
SUMMARY:Closing Remark
DTSTART:20210129T173000Z
DTEND:20210129T174500Z
DTSTAMP:20260416T124203Z
UID:SanyaAGandML/20
DESCRIPTION:Title: <a href="https://researchseminars.org/talk/SanyaAGandML
 /20/">Closing Remark</a>\nby Closing Remark as part of Sanya Workshop on A
 lgebraic Geometry and Machine Learning\n\nAbstract: TBA\n
LOCATION:https://researchseminars.org/talk/SanyaAGandML/20/
END:VEVENT
BEGIN:VEVENT
SUMMARY:Cody Long (Harvard)
DTSTART:20210129T161000Z
DTEND:20210129T164000Z
DTSTAMP:20260416T124203Z
UID:SanyaAGandML/21
DESCRIPTION:Title: <a href="https://researchseminars.org/talk/SanyaAGandML
 /21/">Statistical Predictions in String Theory and Deep Generative Models<
 /a>\nby Cody Long (Harvard) as part of Sanya Workshop on Algebraic Geometr
 y and Machine Learning\n\n\nAbstract\nGenerative models in deep learning a
 llow for sampling probability distributions that approximate data distribu
 tions. I will discuss using generative models for making approximate stati
 stical predictions in the string theory landscape. For vacua admitting a L
 agrangian description this can be thought of as learning random tensor app
 roximations of couplings. As a concrete example I will demonstrate that a 
 large ensemble of metrics on Kähler moduli space of Calabi-Yau threefolds
  is well-approximated by ensembles of matrices produced by a deep convolu
 tional Wasserstein GAN.\n
LOCATION:https://researchseminars.org/talk/SanyaAGandML/21/
END:VEVENT
BEGIN:VEVENT
SUMMARY:Michael Douglas (CMSA Harvard and Stony Brook University)
DTSTART:20210129T165000Z
DTEND:20210129T172000Z
DTSTAMP:20260416T124203Z
UID:SanyaAGandML/22
DESCRIPTION:Title: <a href="https://researchseminars.org/talk/SanyaAGandML
 /22/">Numerical Calabi-Yau metrics from holomorphic networks</a>\nby Micha
 el Douglas (CMSA Harvard and Stony Brook University) as part of Sanya Work
 shop on Algebraic Geometry and Machine Learning\n\n\nAbstract\nWe propose 
 machine learning inspired methods for computing numerical Calabi-Yau (Ricc
 i flat Kähler) metrics\, and implement them using Tensorflow/Keras. We co
 mpare them with previous work\, and find that they are far more accurate f
 or manifolds with little or no symmetry. We also discuss issues such as ov
 erparameterization and choice of optimization methods.\nJoint work with Su
 bramanian Lakshminarasimhan and Yidi Qi.\n
LOCATION:https://researchseminars.org/talk/SanyaAGandML/22/
END:VEVENT
BEGIN:VEVENT
SUMMARY:Paul Breiding (Universität Kassel)
DTSTART:20210126T142000Z
DTEND:20210126T145000Z
DTSTAMP:20260416T124203Z
UID:SanyaAGandML/23
DESCRIPTION:Title: <a href="https://researchseminars.org/talk/SanyaAGandML
 /23/">Euclidean Distance Degree and Mixed Volume</a>\nby Paul Breiding (Un
 iversität Kassel) as part of Sanya Workshop on Algebraic Geometry and Mac
 hine Learning\n\n\nAbstract\nThe Euclidean Distance Degree (EDD) of an alg
 ebraic variety V counts the number of complex critical points of the dista
 nce function from V to a generic fixed point outside of V.\nThe BKK-Theore
 m (Bernstein\, Kushnirenko\, and Khovanskii) says that the number of comp
 lex zeros of a generic sparse polynomial system is equal to the mixed volu
 me of the Newton polytopes of the polynomials.\nIn this talk I want to di
 scuss that for a generic sparse polynomial f the EDD of the hypersurface f
 =0 equals the mixed volume of the Lagrange multiplier equations for the ED
 D.\nThis has consequences for using polynomial homotopy continuation to co
 mpute the ED-critical points\, and it provides new formulas for the EDD (J
 oint work with Frank Sottile and James Woodcock).\n
LOCATION:https://researchseminars.org/talk/SanyaAGandML/23/
END:VEVENT
BEGIN:VEVENT
SUMMARY:Joseph Landsberg (Texas A&M University)
DTSTART:20210127T155000Z
DTEND:20210127T162000Z
DTSTAMP:20260416T124203Z
UID:SanyaAGandML/24
DESCRIPTION:Title: <a href="https://researchseminars.org/talk/SanyaAGandML
 /24/">Tensors and algebraic geometry</a>\nby Joseph Landsberg (Texas A&M U
 niversity) as part of Sanya Workshop on Algebraic Geometry and Machine Lea
 rning\n\n\nAbstract\nI will give an overview of the use of algebraic geome
 try in the study of tensors. The algebraic geometry involved is both class
 ical (secant varieties\, vector bundles on projective space) and quasi-mod
 ern (deformation theory\, the Haiman-Sturmfels multi-graded Hilbert scheme
 \, and the Quot scheme). I will also\, as time permits\, explain applicati
 ons to complexity theory and quantum information theory.\n
LOCATION:https://researchseminars.org/talk/SanyaAGandML/24/
END:VEVENT
BEGIN:VEVENT
SUMMARY:Industry Session
DTSTART:20210129T140000Z
DTEND:20210129T160000Z
DTSTAMP:20260416T124203Z
UID:SanyaAGandML/25
DESCRIPTION:Title: <a href="https://researchseminars.org/talk/SanyaAGandML
 /25/">Industry Session</a>\nby Industry Session as part of Sanya Workshop 
 on Algebraic Geometry and Machine Learning\n\n\nAbstract\nWe will use brea
 kout rooms so that the participants can talk to people from industry.\n
LOCATION:https://researchseminars.org/talk/SanyaAGandML/25/
END:VEVENT
END:VCALENDAR
