BEGIN:VCALENDAR
VERSION:2.0
PRODID:researchseminars.org
CALSCALE:GREGORIAN
X-WR-CALNAME:researchseminars.org
BEGIN:VEVENT
SUMMARY:Cristhian Garay Lopez (Centro de Investigación en Matemáticas)
DTSTART:20241111T060000Z
DTEND:20241111T070000Z
DTSTAMP:20260422T212610Z
UID:TropicalmathandML/1
DESCRIPTION:Title: <a href="https://researchseminars.org/talk/Tropicalmath
 andML/1/">Idempotization of schemes and sheaves</a>\nby Cristhian Garay Lo
 pez (Centro de Investigación en Matemáticas) as part of Tropical mathema
 tics and machine learning\n\n\nAbstract\nTropical algebraic geometry tries
  to be the algebraic geometry of the tropical semiring\, which is an exa
 mple of an idempotent semiring.\n\nMotivated by several relevant problem
 s in 
 tropical and non-Archimedean algebraic geometry (e.g. the definition of tr
 opical schemes\, or the analytification and the schematic tropicalization 
 of algebraic varieties defined over a valuated field) we present an algebr
 aic process for the “idempotization” of both schemes and sheaves of ri
 ngs and modules over them\, understanding “idempotization” as a proces
 s that associates idempotent algebraic objects to the usual objects of alg
 ebraic geometry and commutative algebra.\n\nThis goal is achieved by first
  constructing the affine scheme case\, and then globalizing it using a fix
 ed affine open cover of a given scheme. The affine case is governed by cer
 tain idempotent semirings and idempotent semimodules defined over them (t
 hat we call realizable semirings and semimodules\, respectively)\, which t
 urn out to be semi-lattices that can be studied through a version of commu
 tative algebra for idempotent semirings. We show that these objects are su
 itable for our formalism because their lattices of sub-objects are (in a 
 precise way) a combinatorial reflection of the usual lattices obtained in
  co
 mmutative algebra.\n\nThis is joint work with Félix Baril Boudreau (U. o
 f Luxembourg).\n
LOCATION:https://researchseminars.org/talk/TropicalmathandML/1/
END:VEVENT
BEGIN:VEVENT
SUMMARY:Gabriele Balletti (RaySearch Laboratories)
DTSTART:20241118T063000Z
DTEND:20241118T073000Z
DTSTAMP:20260422T212610Z
UID:TropicalmathandML/2
DESCRIPTION:Title: <a href="https://researchseminars.org/talk/Tropicalmath
 andML/2/">Machine Assisted Proofs and Disproofs in Discrete Geometry</a>\n
 by Gabriele Balletti (RaySearch Laboratories) as part of Tropical mathemat
 ics and machine learning\n\n\nAbstract\nI will discuss how modern computat
 ional techniques can help us get a better understanding of the mathematics
  of geometric structures\, with examples from Ehrhart Theory of lattice po
 lytopes. \nI will present several machine-assisted proofs where computatio
 nal aid has been essential\, and a more recent counterexample\, achieved
  through a genetic algorithm\, which answers a question of Ferroni and H
 igashitani in the negative.\n
LOCATION:https://researchseminars.org/talk/TropicalmathandML/2/
END:VEVENT
BEGIN:VEVENT
SUMMARY:Yuto Yamamoto (RIKEN iTHEMS)
DTSTART:20250106T060000Z
DTEND:20250106T070000Z
DTSTAMP:20260422T212610Z
UID:TropicalmathandML/3
DESCRIPTION:Title: <a href="https://researchseminars.org/talk/Tropicalmath
 andML/3/">Period integrals of hypersurfaces via tropical geometry</a>\nby 
 Yuto Yamamoto (RIKEN iTHEMS) as part of Tropical mathematics and machine l
 earning\n\n\nAbstract\nAbouzaid--Ganatra--Iritani--Sheridan computed asy
 mptotics of integrals of holomorphic volume forms on toric Calabi--Yau hy
 persurfaces over Lagrangian sections of SYZ fibrations by using tropical g
 eometry. They gave a new proof of the gamma conjecture for ambient line bu
 ndles on Batyrev pairs of mirror Calabi--Yau hypersurfaces. In the talk\, 
 we review their work and discuss its generalization to the case of toric h
 ypersurfaces which are not necessarily Calabi--Yau hypersurfaces.\n
LOCATION:https://researchseminars.org/talk/TropicalmathandML/3/
END:VEVENT
BEGIN:VEVENT
SUMMARY:Edward Hirst (Queen Mary\, University of London)
DTSTART:20250210T080000Z
DTEND:20250210T090000Z
DTSTAMP:20260422T212610Z
UID:TropicalmathandML/4
DESCRIPTION:Title: <a href="https://researchseminars.org/talk/Tropicalmath
 andML/4/">Machine Learning Combinatorics from hep-th</a>\nby Edward Hirst 
 (Queen Mary\, University of London) as part of Tropical mathematics and ma
 chine learning\n\n\nAbstract\nAn informal review of works applying supervi
 sed machine learning architectures to combinatoric objects in hep-th will 
 be provided. These objects are related to quiver gauge theories\, and in
 clude quivers\, Hilbert Series\, Amoebae\, and Brane Webs. We discuss th
 eir efficient representation and amenability to ML architectures on some
  simple tasks.\n
LOCATION:https://researchseminars.org/talk/TropicalmathandML/4/
END:VEVENT
BEGIN:VEVENT
SUMMARY:Samuel Bernard-Bernardet and Benjamin Apffel (DotWave Lab (S.B.B.)
 \, LWE\, EPFL\, Switzerland (B.A.))
DTSTART:20250224T080000Z
DTEND:20250224T090000Z
DTSTAMP:20260422T212610Z
UID:TropicalmathandML/5
DESCRIPTION:Title: <a href="https://researchseminars.org/talk/Tropicalmath
 andML/5/">The spinorial ball: a macroscopic object of spin-1/2</a>\nby Sa
 muel Bernard-Bernardet and Benjamin Apffel (DotWave Lab (S.B.B.)\, LWE\, E
 PFL\, Switzerland (B.A.)) as part of Tropical mathematics and machine lear
 ning\n\n\nAbstract\nIn quantum physics lectures\, half-integer spins are g
 enerally introduced as “objects that do not come back to their original 
 state after one full turn but that do after two” and are often believed
  to be a purely quantum-mechanical behavior. However\, spin-1/2 is above
  all a geometrical property of the rotation group SO(3) and can\, therefo
 re\, als
 o have practical consequences at the macroscopic scale. To illustrate this
 \, we will describe in this seminar a new visualization tool named the s
 pinorial ball\, which allows one to concretely manipulate a macroscopic 
 spin-1/2. It is based on the group homomorphism between SU(2) and SO(3)
 \, wh
 ich will be discussed extensively during the seminar. We will also show\,
  via livestream\, how to visualize the Poincaré-Bloch sphere\, the Hopf 
 bundle\, or the homotopy classes of SO(3) using the ball. Lastly\, we wil
 l describe some key practical aspects of the implementation.\n\nUseful li
 nks:\n\nGitHub of the project (open source): https://github.com/heligone
 /spinorialBall\n\nA second paper: https://arxiv.org/abs/2411.15059\n\nSpi
 norial ball web
 site: www.spinorialball.com\n
LOCATION:https://researchseminars.org/talk/TropicalmathandML/5/
END:VEVENT
BEGIN:VEVENT
SUMMARY:Baran Hashemi (cancelled) (Technical University of Munich)
DTSTART:20250217T080000Z
DTEND:20250217T090000Z
DTSTAMP:20260422T212610Z
UID:TropicalmathandML/6
DESCRIPTION:Title: <a href="https://researchseminars.org/talk/Tropicalmath
 andML/6/">Can Transformers Do Enumerative Geometry?</a>\nby Baran Hashemi 
 (cancelled) (Technical University of Munich) as part of Tropical mathemati
 cs and machine learning\n\n\nAbstract\nWe introduce a Transformer-based ap
 proach to computational enumerative geometry\, specifically targeting the 
 computation of $\\psi$-class intersection numbers on the moduli space of c
 urves. Traditional methods for calculating these numbers suffer from facto
 rial computational complexity\, making them impractical to use. By reformu
 lating the problem as a continuous optimization task\, we compute intersec
 tion numbers across a wide value range from $10^{45}$ to $10^{-45}$. To ca
 pture the recursive nature inherent in these intersection numbers\, we pro
 pose the Dynamic Range Activator (DRA)\, a new activation function that en
 hances the Transformer's ability to model recursive patterns and handle se
 vere heteroscedasticity. Given precision requirements for computing the in
 tersections\, we quantify the uncertainty of the predictions using Conform
 al Prediction with a dynamic sliding window adaptive to the partitions of 
 equivalent number of marked points. To the best of our knowledge\, there h
 as been no prior work on modeling recursive functions with such high var
 iance and factorial growth. Beyond simply computing intersection numbers\,
  we explore the enumerative "world-model" of Transformers. Our interpretab
 ility analysis reveals that the network is implicitly modeling the Virasor
 o constraints in a purely data-driven manner. Moreover\, through abductive
  hypothesis testing\, probing\, and causal inference\, we uncover evidence
  of an emergent internal representation of the large-genus asymptotics o
 f the $\\psi$-class intersection numbers. These findings suggest that th
 e network internalizes the parameters of the asymptotic closed form and 
 the polynomiality phenomenon of $\\psi$-class intersection numbers in a 
 non-linear manner.\n
LOCATION:https://researchseminars.org/talk/TropicalmathandML/6/
END:VEVENT
BEGIN:VEVENT
SUMMARY:Olexandr Konovalov (University of St Andrews)
DTSTART:20250428T080000Z
DTEND:20250428T090000Z
DTSTAMP:20260422T212610Z
UID:TropicalmathandML/7
DESCRIPTION:Title: <a href="https://researchseminars.org/talk/Tropicalmath
 andML/7/">Open science and reproducible research</a>\nby Olexandr Konovalo
 v (University of St Andrews) as part of Tropical mathematics and machine l
 earning\n\n\nAbstract\nDr Olexandr Konovalov is a lecturer in the School o
 f Computer Science of the University of St Andrews\, where he leads the 
 Research Software Group. He is also one of the developers of the open so
 urce mathematical software system GAP (https://www.gap-system.org). He w
 ill talk about open science practices in computa
 tional research\, and will present some novel ways of using Jupyter notebo
 oks to share reproducible computational experiments in Python\, R\, GAP\, 
 and other programming languages supported by Jupyter notebooks. Furthermor
 e\, there will be a discussion of associated technical skills\, needed for
  modern collaborative research\, of training opportunities offered by the 
 Carpentries (https://carpentries.org/)\, and of collaborative translation 
 projects to maintain multi-language versions of the Carpentries training r
 esources.\n
LOCATION:https://researchseminars.org/talk/TropicalmathandML/7/
END:VEVENT
BEGIN:VEVENT
SUMMARY:Marissa Masden (University of Puget Sound)
DTSTART:20250331T020000Z
DTEND:20250331T030000Z
DTSTAMP:20260422T212610Z
UID:TropicalmathandML/8
DESCRIPTION:Title: <a href="https://researchseminars.org/talk/Tropicalmath
 andML/8/">Sign Sequence Combinatorics for Topological Measures of ReLU neu
 ral networks</a>\nby Marissa Masden (University of Puget Sound) as part of
  Tropical mathematics and machine learning\n\n\nAbstract\nA (ReLU) neural 
 network is a type of piecewise linear (PL) function F which induces a cano
 nical polyhedral subdivision\, $\\mathcal C(F)$\, on its input space (Grig
 sby and Lindsey\, 2022). This class of function is commonly used in modern
  machine learning applications. Following a brief introduction to these fu
 nctions and a topological perspective on data classification\, we will the
 n discuss how ReLU networks induce a polyhedral complex on their input spa
 ce which arises from hyperplane arrangements. The face poset of this polyh
 edral complex (for a given ReLU neural network) is entirely determined by 
 combinatorial "sign sequence" information about the vertices of the comple
 x. We will explore how combinatorial properties of the face poset of this 
 polyhedral subdivision may be used to compute topological properties of a 
 given ReLU function such as its level set topology\, critical points\, and
  (most recently) a discrete gradient vector field agreeing with the functi
 on\, among other useful measures\, and demonstrate how this may be used to
  understand ReLU neural networks as a class of functions.\n
LOCATION:https://researchseminars.org/talk/TropicalmathandML/8/
END:VEVENT
BEGIN:VEVENT
SUMMARY:Priyaa Varshinee Srinivasan (Tallinn University of Technology)
DTSTART:20250310T080000Z
DTEND:20250310T090000Z
DTSTAMP:20260422T212610Z
UID:TropicalmathandML/9
DESCRIPTION:Title: <a href="https://researchseminars.org/talk/Tropicalmath
 andML/9/">Drazin inverses in Categories</a>\nby Priyaa Varshinee Srinivasa
 n (Tallinn University of Technology) as part of Tropical mathematics and m
 achine learning\n\n\nAbstract\nIn this talk\, we will explore Drazin inver
 ses through the lens of category theory. Drazin inverses are a fundamental
  algebraic structure that has been extensively studied in semigroup theo
 ry and ring theory. Drazin inverses can also be defined for endomaps in 
 any category. In this talk\, we will introduce the Drazin inverse of an 
 endomap
 \, Drazin categories (categories in which every endomorphism has a Drazin 
 inverse)\, and provide various examples of such categories including the c
 ategory of matrices over a field\, and explore a few properties of these i
 nverses.\n\nThe talk will feature lots of pictures!\n
LOCATION:https://researchseminars.org/talk/TropicalmathandML/9/
END:VEVENT
BEGIN:VEVENT
SUMMARY:Anibal Medina Mardones (Western University)
DTSTART:20250317T070000Z
DTEND:20250317T080000Z
DTSTAMP:20260422T212610Z
UID:TropicalmathandML/10
DESCRIPTION:Title: <a href="https://researchseminars.org/talk/Tropicalmath
 andML/10/">What makes math problems hard for RL: a case study</a>\nby Anib
 al Medina Mardones (Western University) as part of Tropical mathematics an
 d machine learning\n\n\nAbstract\nUsing a long-standing conjecture from co
 mbinatorial group theory\, we explore\, from multiple perspectives\, the c
 hallenges of finding rare instances carrying disproportionately high rewar
 ds. Based on lessons learned in the context defined by the Andrews–Curti
 s conjecture\, we propose algorithmic enhancements and a topological hardn
 ess measure with implications for a broad class of search problems. As par
 t of our study\, we also address several open mathematical questions. Nota
 bly\, we demonstrate the length reducibility of all but two presentations 
 in the Akbulut–Kirby series (1981) and resolve various potential counter
 examples in the Miller–Schupp series (1991)\, including three infinite s
 ubfamilies.\n
LOCATION:https://researchseminars.org/talk/TropicalmathandML/10/
END:VEVENT
BEGIN:VEVENT
SUMMARY:Siddharth Pritam (Chennai Mathematical Institute)
DTSTART:20250407T060000Z
DTEND:20250407T070000Z
DTSTAMP:20260422T212610Z
UID:TropicalmathandML/11
DESCRIPTION:Title: <a href="https://researchseminars.org/talk/Tropicalmath
 andML/11/">Classification of Temporal Graphs using Persistent Homology</a>
 \nby Siddharth Pritam (Chennai Mathematical Institute) as part of Tropical
  mathematics and machine learning\n\n\nAbstract\nTemporal graphs effective
 ly model dynamic systems by representing interactions as timestamped edges
 . However\, analytical tools for temporal graphs are limited compared to s
 tatic graphs. We propose a novel method for analyzing temporal graphs usin
 g Persistent Homology. Our approach leverages δ-temporal motifs (recurren
 t sub-graphs) to capture temporal dynamics. By evolving these motifs\, we 
 define the average filtration and compute PH on the associated clique comp
 lex. This method captures both local and global temporal structures and is
  stable with respect to reference models. We demonstrate the applicability
  of our approach to the temporal graph classification task. Experiments ve
 rify the effectiveness of our approach\, achieving over 92% accuracy\, wit
 h some cases reaching 100%. Unlike existing methods that require node clas
 ses\, our approach is node-class-free\, offering flexibility for a wide 
 range of temporal graph analysis tasks.\n
LOCATION:https://researchseminars.org/talk/TropicalmathandML/11/
END:VEVENT
BEGIN:VEVENT
SUMMARY:Keiji Miura (Kwansei Gakuin University)
DTSTART:20250414T060000Z
DTEND:20250414T070000Z
DTSTAMP:20260422T212610Z
UID:TropicalmathandML/12
DESCRIPTION:Title: <a href="https://researchseminars.org/talk/Tropicalmath
 andML/12/">Tropical Neural Networks and Its Applications to Classifying Ph
 ylogenetic Trees</a>\nby Keiji Miura (Kwansei Gakuin University) as part o
 f Tropical mathematics and machine learning\n\n\nAbstract\nDeep neural net
 works show great success when input vectors are in a Euclidean space. Ho
 wever\, those classical neural networks show poor performance when inputs
  are phylogenetic trees\, which can be written as vectors in the tropical 
 projective torus. Here we propose tropical embedding to transform a vector
  in the tropical projective torus to a vector in the Euclidean space via t
 he tropical metric. We introduce a tropical neural network where the first
  layer is a tropical embedding layer and the following layers are the same
  as the classical ones. We prove that a tropical neural network is a unive
 rsal approximator and we derive a backpropagation rule for deep tropical n
 eural networks. Then we provide TensorFlow 2 codes for implementing a trop
 ical neural network in the same fashion as the classical one\, where the
  weight initialization problem is addressed using extreme value statisti
 cs. We apply our method to empirical data\, including sequences of 
 hemagglutinin for influenza virus from New York. Finally we show that a tr
 opical neural network can be interpreted as a generalization of a tropical
  logistic regression.\n
LOCATION:https://researchseminars.org/talk/TropicalmathandML/12/
END:VEVENT
BEGIN:VEVENT
SUMMARY:Bruno Gavranović
DTSTART:20250512T080000Z
DTEND:20250512T090000Z
DTSTAMP:20260422T212610Z
UID:TropicalmathandML/13
DESCRIPTION:Title: <a href="https://researchseminars.org/talk/Tropicalmath
 andML/13/">Learning Functors using Gradient Descent</a>\nby Bruno Gavranov
 ić as part of Tropical mathematics and machine learning\n\n\nAbstract\nCy
 cleGAN is a general approach to unpaired image-to-image translation that h
 as been getting attention in recent years. Inspired by categorical databas
 e systems\, we show that CycleGAN is a "schema"\, i.e. a specific category
  presented by generators and relations\, whose specific parameter instanti
 ations are just set-valued functors on this schema. We show that enforcing
  cycle-consistencies amounts to enforcing composition invariants in this c
 ategory. We generalize the learning procedure to arbitrary such categories
  and show that a special class of functors\, rather than functions\, can b
 e learned using gradient descent. Using this framework we design a novel n
 eural network system capable of learning to insert and delete objects from
  images without paired data. We qualitatively evaluate the system on three
  different datasets and obtain promising results.\n
LOCATION:https://researchseminars.org/talk/TropicalmathandML/13/
END:VEVENT
BEGIN:VEVENT
SUMMARY:Geoffrey Cruttwell (Mount Allison University)
DTSTART:20250519T020000Z
DTEND:20250519T030000Z
DTSTAMP:20260422T212610Z
UID:TropicalmathandML/14
DESCRIPTION:Title: <a href="https://researchseminars.org/talk/Tropicalmath
 andML/14/">An introduction to tangent categories</a>\nby Geoffrey Cruttwel
 l (Mount Allison University) as part of Tropical mathematics and machine l
 earning\n\n\nAbstract\nOne of the central constructions in differential ge
 ometry is the tangent bundle: the collection of all "tangent vectors" at a
 ll points of a space. In this talk\, we'll look at an axiomatization/forma
 lization of the tangent bundle called a "tangent category".  We'll start w
 ith a brief review of category theory\, then look at some of the categoric
 al structure of the tangent bundle\, and finish by discussing some of the 
 different examples of tangent categories.\n
LOCATION:https://researchseminars.org/talk/TropicalmathandML/14/
END:VEVENT
BEGIN:VEVENT
SUMMARY:Horacio Rostro-Gonzalez
DTSTART:20250901T070000Z
DTEND:20250901T080000Z
DTSTAMP:20260422T212610Z
UID:TropicalmathandML/15
DESCRIPTION:Title: <a href="https://researchseminars.org/talk/Tropicalmath
 andML/15/">Canceled</a>\nby Horacio Rostro-Gonzalez as part of Tropical ma
 thematics and machine learning\n\nAbstract: TBA\n\nCanceled meeting. Semin
 ar starts next week.\n
LOCATION:https://researchseminars.org/talk/TropicalmathandML/15/
END:VEVENT
BEGIN:VEVENT
SUMMARY:Ismail Khalfaoui-Hassani (Forschungszentrum Jülich)
DTSTART:20250609T080000Z
DTEND:20250609T090000Z
DTSTAMP:20260422T212610Z
UID:TropicalmathandML/16
DESCRIPTION:Title: <a href="https://researchseminars.org/talk/Tropicalmath
 andML/16/">Polynomial\, trigonometric\, and tropical activations</a>\nby I
 smail Khalfaoui-Hassani (Forschungszentrum Jülich) as part of Tropical ma
 thematics and machine learning\n\n\nAbstract\nWhich functions can be used 
 as activations in deep neural networks? This article explores families of 
 functions based on orthonormal bases\, including the Hermite polynomial ba
 sis and the Fourier trigonometric basis\, as well as a basis resulting fro
 m the tropicalization of a polynomial basis. Our study shows that\, throug
 h simple variance-preserving initialization and without additional clampin
 g mechanisms\, these activations can successfully be used to train deep mo
 dels\, such as GPT-2 for next-token prediction on OpenWebText and ConvNeXt
  for image classification on ImageNet. Our work addresses the issue of exp
 loding and vanishing activations and gradients\, particularly prevalent wi
 th polynomial activations\, and opens the door for improving the efficienc
 y of large-scale learning tasks. Furthermore\, our approach provides insig
 ht into the structure of neural networks\, revealing that networks with po
 lynomial activations can be interpreted as multivariate polynomial mapping
 s. Finally\, using Hermite interpolation\, we show that our activations ca
 n closely approximate classical ones in pre-trained models by matching bot
 h the function and its derivative\, making them especially useful for fine
 -tuning tasks. These activations are available in the torchortho library\,
  which can be accessed via: https://github.com/K-H-Ismail/torchortho.\n
LOCATION:https://researchseminars.org/talk/TropicalmathandML/16/
END:VEVENT
BEGIN:VEVENT
SUMMARY:Susana López-Moreno (Pusan National University)
DTSTART:20250908T060000Z
DTEND:20250908T070000Z
DTSTAMP:20260422T212610Z
UID:TropicalmathandML/17
DESCRIPTION:Title: <a href="https://researchseminars.org/talk/Tropicalmath
 andML/17/">Poset Neural Networks</a>\nby Susana López-Moreno (Pusan Natio
 nal University) as part of Tropical mathematics and machine learning\n\n\n
 Abstract\nThe paper ''Tropical Geometry of Deep Neural Networks'' by L. Zh
 ang et al. introduces an equivalence between integer-valued neural network
 s (IVNN) with $\\text{ReLU}_{t}$ and tropical rational functions\, which c
 ome with a map to polytopes. Expanding this connection to posets\, we will
  see how neural networks are constructed from an order polytope.\nWe then 
 explain how posets with four points induce neural networks that can be int
 erpreted as $2\\times 2$ convolutional filters\, which can be used not o
 nly in IVNNs but in any general neural network.\n
LOCATION:https://researchseminars.org/talk/TropicalmathandML/17/
END:VEVENT
BEGIN:VEVENT
SUMMARY:Aldo Guzmán-Sáenz (IBM Research)
DTSTART:20250915T000000Z
DTEND:20250915T010000Z
DTSTAMP:20260422T212610Z
UID:TropicalmathandML/18
DESCRIPTION:Title: <a href="https://researchseminars.org/talk/Tropicalmath
 andML/18/">Topological Data Analysis and Machine Learning - an application
  to computational genomics</a>\nby Aldo Guzmán-Sáenz (IBM Research) as p
 art of Tropical mathematics and machine learning\n\n\nAbstract\nPersistent
  homology is a tool derived from algebraic topology that has been successf
 ully applied to real-world problems\, either on its own or combined with s
 tandard machine learning techniques. To further strengthen its applicabili
 ty\, it is necessary to establish mappings from topological features in ho
 mology to the original data space. In this talk we present one possible ap
 proach using harmonic persistent homology to identify biomarkers relevant 
 to disease subtyping.\n
LOCATION:https://researchseminars.org/talk/TropicalmathandML/18/
END:VEVENT
BEGIN:VEVENT
SUMMARY:Jose Perea (Northeastern University)
DTSTART:20250929T120000Z
DTEND:20250929T130000Z
DTSTAMP:20260422T212610Z
UID:TropicalmathandML/19
DESCRIPTION:Title: <a href="https://researchseminars.org/talk/Tropicalmath
 andML/19/">Learning functions on the space of persistence diagrams</a>\nby
  Jose Perea (Northeastern University) as part of Tropical mathematics and 
 machine learning\n\n\nAbstract\nThe persistence diagram is an increasingly
  useful shape descriptor from Topological Data Analysis\, but its use alon
 gside typical machine learning techniques requires mathematical finesse.  
 We will describe in this talk a mathematical framework for featurization o
 f said descriptors\, and show how it addresses the problem of approximatin
 g continuous functions on compact subsets of the space of persistence diag
 rams.  We will also show how these techniques can be applied to problems i
 n semi-supervised learning where these descriptors are relevant.\n
LOCATION:https://researchseminars.org/talk/TropicalmathandML/19/
END:VEVENT
BEGIN:VEVENT
SUMMARY:John Abascal (Northeastern University)
DTSTART:20251027T000000Z
DTEND:20251027T010000Z
DTSTAMP:20260422T212610Z
UID:TropicalmathandML/20
DESCRIPTION:Title: <a href="https://researchseminars.org/talk/Tropicalmath
 andML/20/">Privacy (Attacks) in Machine Learning</a>\nby John Abascal (Nor
 theastern University) as part of Tropical mathematics and machine learning
 \n\n\nAbstract\nDespite achieving state-of-the-art performance across nume
 rous domains\, deep learning models often memorize sensitive information f
 rom their training data. This leaves models and the samples they are train
 ed on vulnerable to a wide range of privacy attacks.\n\nThis talk provides
  an introduction to this landscape of privacy in machine learning\, explor
 ing risks like membership inference attacks and provable mitigations such 
 as differential privacy. Throughout the discussion\, we will examine how t
 hese risks arise\, fundamental trade-offs between privacy and model utilit
 y\, and the current state of privacy in the context of machine learning. W
 e will also highlight some open challenges and future directions in the on
 going effort to build models that are performant\, private\, and trustwort
 hy.\n
LOCATION:https://researchseminars.org/talk/TropicalmathandML/20/
END:VEVENT
BEGIN:VEVENT
SUMMARY:Timothy Duff (University of Missouri)
DTSTART:20251103T010000Z
DTEND:20251103T020000Z
DTSTAMP:20260422T212610Z
UID:TropicalmathandML/21
DESCRIPTION:Title: <a href="https://researchseminars.org/talk/Tropicalmath
 andML/21/">Compatibility of Fundamental and Essential Matrix Triples</a>\n
 by Timothy Duff (University of Missouri) as part of Tropical mathematics a
 nd machine learning\n\n\nAbstract\nThe fundamental matrix of a pair of pin
 hole cameras lies at the core of systems that reconstruct 3D scenes from 2
 D images. However\, for more than two cameras\, the relations between the 
 various fundamental matrices of camera pairs are not yet completely unders
 tood. In joint work with Viktor Korotynskiy\, Anton Leykin\, and Tomas Paj
 dla\, we characterize all polynomial constraints that hold for an arbitrar
 y triple of fundamental matrices. Unlike most constraints in previous work
 s\, our constraints hold independently of the relative scaling of the fund
 amental matrices\, which is unknown in practice. We also provide a partial
  characterization for essential matrix triples arising from calibrated cam
 eras.\n
LOCATION:https://researchseminars.org/talk/TropicalmathandML/21/
END:VEVENT
BEGIN:VEVENT
SUMMARY:Chung To Kong (University of Hong Kong)
DTSTART:20251020T060000Z
DTEND:20251020T063000Z
DTSTAMP:20260422T212610Z
UID:TropicalmathandML/22
DESCRIPTION:Title: <a href="https://researchseminars.org/talk/Tropicalmath
 andML/22/">The possibility of making $138\,000 from shredded banknote piec
 es using computer vision</a>\nby Chung To Kong (University of Hong Kong) a
 s part of Tropical mathematics and machine learning\n\n\nAbstract\nEvery c
 ountry must dispose of old banknotes. At the Hong Kong Monetary Authority 
 visitor center\, visitors can buy a paperweight souvenir full of shredded 
 banknotes. Even though the shredded banknotes are small\, by using compute
 r vision\, it is possible to reconstruct the whole banknote like a jigsaw 
 puzzle. Each paperweight souvenir costs \\$100 HKD\, and it is claimed to 
 contain shredded banknotes equivalent to 138 complete \\$1000 HKD banknote
 s. In theory\, \\$138\,000 HKD can be recovered by using computer vision. 
 This paper discusses the technique of collecting shredded banknote pieces 
 and applying a computer vision program.\n
LOCATION:https://researchseminars.org/talk/TropicalmathandML/22/
END:VEVENT
BEGIN:VEVENT
SUMMARY:Redi Haderi
DTSTART:20251215T100000Z
DTEND:20251215T110000Z
DTSTAMP:20260422T212610Z
UID:TropicalmathandML/23
DESCRIPTION:Title: <a href="https://researchseminars.org/talk/Tropicalmath
 andML/23/">A practical introduction to operads</a>\nby Redi Haderi as part
  of Tropical mathematics and machine learning\n\n\nAbstract\nIn this talk 
 we will introduce the notion of operad as a tool to control a variety of a
 lgebraic structures. We describe the notion of an algebra over an operad
 \, give some examples of interesting algebras which arise operadically\,
  explain how to recognize that an algebraic structure is operadic\, and 
 discuss the relationship between operads and monoidal categories.\n
LOCATION:https://researchseminars.org/talk/TropicalmathandML/23/
END:VEVENT
BEGIN:VEVENT
SUMMARY:Redi Haderi
DTSTART:20251222T100000Z
DTEND:20251222T110000Z
DTSTAMP:20260422T212610Z
UID:TropicalmathandML/24
DESCRIPTION:Title: <a href="https://researchseminars.org/talk/Tropicalmath
 andML/24/">Infinity operads via simplicial lists</a>\nby Redi Haderi as pa
 rt of Tropical mathematics and machine learning\n\n\nAbstract\nWe introduc
 e the notion of simplicial list as a combinatorial tool to understand oper
 ads and their homotopy coherent variant (i.e. infinity operads). We will f
 ocus on the analogy between simplicial sets and simplicial lists and prese
 nt a nerve theorem which recognizes operads as certain simplicial lists. T
 his leads to an interesting quasi-categorical and combinatorial notion of 
 infinity operad.\n
LOCATION:https://researchseminars.org/talk/TropicalmathandML/24/
END:VEVENT
BEGIN:VEVENT
SUMMARY:Joe Moeller (Caltech)
DTSTART:20260202T000000Z
DTEND:20260202T010000Z
DTSTAMP:20260422T212610Z
UID:TropicalmathandML/25
DESCRIPTION:Title: <a href="https://researchseminars.org/talk/Tropicalmath
 andML/25/">Schur Functors and Categorified Plethysm</a>\nby Joe Moeller (C
 altech) as part of Tropical mathematics and machine learning\n\n\nAbstract
 \nIt is known that the Grothendieck group of the category of Schur functor
 s is the ring of symmetric functions. This ring has a rich structure\, muc
 h of which is encapsulated in the fact that it is a "plethory": a monoid i
 n the category of birings with its substitution monoidal structure. We sho
 w that similarly the category of Schur functors is a "2-plethory"\, which 
 descends to give the plethory structure on symmetric functions. Thus\, muc
 h of the structure of symmetric functions exists at a higher level in the 
 category of Schur functors.\n
LOCATION:https://researchseminars.org/talk/TropicalmathandML/25/
END:VEVENT
BEGIN:VEVENT
SUMMARY:Abbas Shoja-Daliklidash (University of Mohaghegh Ardabili)
DTSTART:20260105T060000Z
DTEND:20260105T070000Z
DTSTAMP:20260422T212610Z
UID:TropicalmathandML/26
DESCRIPTION:Title: <a href="https://researchseminars.org/talk/Tropicalmath
 andML/26/">Postponed</a>\nby Abbas Shoja-Daliklidash (University of Mohagh
 egh Ardabili) as part of Tropical mathematics and machine learning\n\n\nAb
 stract\nSandpiles with finite-range interactions. We investigate the sandp
 ile model with Yukawa-type interactions\, whose effective range is tuned b
 y an external parameter R. Our results reveal that at specific values of R
 \, the system exhibits giant avalanches that span the system\, leading to 
 percolation. The probability of such giant avalanches demonstrates two dis
 tinct regimes as a function of R: for sufficiently small R\, it increases 
 monotonically\, whereas for large R it undergoes threshold dynamics\, so t
 hat at certain values of R\, the percolation probability exhibits abrupt j
 umps. We refer to these as "pseudo-percolation transitions"\, based o
 n which we propose a hierarchical percolation model at the mean-field leve
 l: each percolation transition corresponds to percolation within a disc of
  radius R. We further examine both local and global geometrical observable
 s. The local quantities include avalanche size\, mass\, duration\, and sub
 -avalanche mass\, while for the global characterization we analyze the l
 oop length and gyration radius of the external perimeter\, as well as the 
 mass of sub-avalanches. Remarkably\, all these observables exhibit power-l
 aw scaling for all values of R\, with exponents that vary systematically w
 ith R. Notably\, in the vicinity of the pseudo-percolation transition poin
 ts\, the exponents approach characteristic values\, signaling a distinct c
 ritical behavior.\n
LOCATION:https://researchseminars.org/talk/TropicalmathandML/26/
END:VEVENT
BEGIN:VEVENT
SUMMARY:Jesse Wolfson (University of California\, Irvine)
DTSTART:20251208T044500Z
DTEND:20251208T054500Z
DTSTAMP:20260422T212610Z
UID:TropicalmathandML/27
DESCRIPTION:Title: <a href="https://researchseminars.org/talk/Tropicalmath
 andML/27/">Fractals and Africanist Music</a>\nby Jesse Wolfson (University
  of California\, Irvine) as part of Tropical mathematics and machine learn
 ing\n\n\nAbstract\nSymmetry has been a structuring device and motif for mu
 sic in many cultures\, and probably everyone who has taken music lessons a
 s a child can call to mind the time translation symmetry encoded by the re
 gular beats of a metronome.  Nonetheless\, the idea of fractal symmetry in
  music seems much less common.  In this talk\, I'll describe recent resear
 ch investigating the presence of fractal structures in Africanist music\, 
 in response to a hypothesis of choreographer Reggie Wilson. I'll describe 
 both the project and "scientific" findings\, as well as the broader contex
 t of my unexpected and ongoing engagement with Reggie Wilson and his Fist 
 and Heel Performance Group at the interplay of math and dance.  This is jo
 int work with Claudio Gómez-Gonzáles\, Sidhanth Raman and Siddharth Visw
 anath.\n
LOCATION:https://researchseminars.org/talk/TropicalmathandML/27/
END:VEVENT
BEGIN:VEVENT
SUMMARY:Eva Yi Xie (Princeton Neuroscience Institute\; Allen Institute)
DTSTART:20260112T000000Z
DTEND:20260112T010000Z
DTSTAMP:20260422T212610Z
UID:TropicalmathandML/28
DESCRIPTION:Title: <a href="https://researchseminars.org/talk/Tropicalmath
 andML/28/">A Multi-Region Brain Model to Elucidate the Role of Hippocampus
  in Spatially Embedded Decision-Making</a>\nby Eva Yi Xie (Princeton Neuro
 science Institute\; Allen Institute) as part of Tropical mathematics and m
 achine learning\n\n\nAbstract\nBrains excel at robust decision-making and 
 data-efficient learning. Understanding the architectures and dynamics unde
 rlying these capabilities can inform inductive biases for deep learning. W
 e present a multi-region brain model that explores the normative role of s
 tructured memory circuits in a spatially embedded binary decision-making t
 ask from neuroscience. We counterfactually compare the learning performanc
 e and neural representations of reinforcement learning (RL) agents with br
 ain models of different interaction architectures between grid and place c
 ells in the entorhinal cortex and hippocampus\, coupled with an action-sel
 ection cortical recurrent neural network. We demonstrate that a specific a
 rchitecture--where grid cells receive and jointly encode self-movement vel
 ocity signals and decision evidence increments--optimizes learning efficie
 ncy while best reproducing experimental observations relative to alternati
 ve architectures. Our findings thus suggest brain-inspired structured archi
 tectures for efficient RL. Importantly\, the models make novel\, testable 
 predictions about organization and information flow within the entorhinal-
 hippocampal-neocortical circuit: we predict that grid cells must conjuncti
 vely encode position and evidence for effective spatial decision-making\, 
 directly motivating new neurophysiological experiments.\n
LOCATION:https://researchseminars.org/talk/TropicalmathandML/28/
END:VEVENT
BEGIN:VEVENT
SUMMARY:Assaf Shocher (Technion – Israel Institute of Technology)
DTSTART:20260316T080000Z
DTEND:20260316T090000Z
DTSTAMP:20260422T212610Z
UID:TropicalmathandML/29
DESCRIPTION:Title: <a href="https://researchseminars.org/talk/Tropicalmath
 andML/29/">Teaching Neural Networks Linear Algebra</a>\nby Assaf Shocher (
 Technion – Israel Institute of Technology) as part of Tropical mathemati
 cs and machine learning\n\n\nAbstract\nNeural networks are powerful but no
 toriously difficult to analyze\, compose\, or control. Linear algebra\, by
  contrast\, is the mathematical ideal of tractability. In this talk I will
  present several works from my lab that aim to import the principles of li
 near algebra into deep learning. I begin with projection as a generative p
 rinciple: Idempotent Generative Networks (IGN) train a neural network to s
 atisfy f(f(z)) = f(z)\, so that the data manifold emerges as the set of fi
 xed points of the operator. Generation then becomes projection: a single f
 orward pass maps noise to the manifold\, while repeated application enable
 s principled refinement. We then ask a more provocative question: "Who sai
 d neural networks aren't linear?". Neural networks are famously nonlinear\
 , but nonlinear with respect to which vector spaces? Using the algebraic n
 otion of transport of structure\, the Linearizer framework identifies non-
 standard vector spaces in which a neural network acts as a linear operator
 . In these spaces\, tools such as SVD\, pseudo-inverses\, and composition 
 become directly applicable to neural networks\, with consequences ranging 
 from algebraic analysis of models to collapsing diffusion sampling into a 
 single step. Finally\, I will present recent work that generalizes the Moo
 re-Penrose pseudo-inverse to nonlinear mappings. Surjective Pseudo-inverti
 ble Neural Networks (SPNN) satisfy the classical Penrose identities by con
 struction\, enabling nonlinear back-projection and extending diffusion-bas
 ed zero-shot inverse problem solving from linear degradations to arbitrary
  nonlinear ones. These are steps we are taking towards systems that we can
  study and design with the same rigor and elegance that linear algebra bri
 ngs to the physical sciences.\n
LOCATION:https://researchseminars.org/talk/TropicalmathandML/29/
END:VEVENT
BEGIN:VEVENT
SUMMARY:Shailesh Lal (Beijing Institute of Mathematical Sciences and Appli
 cations)
DTSTART:20260309T060000Z
DTEND:20260309T070000Z
DTSTAMP:20260422T212610Z
UID:TropicalmathandML/30
DESCRIPTION:Title: <a href="https://researchseminars.org/talk/Tropicalmath
 andML/30/">Postponed</a>\nby Shailesh Lal (Beijing Institute of Mathematic
 al Sciences and Applications) as part of Tropical mathematics and machine 
 learning\n\nAbstract: TBA\n
LOCATION:https://researchseminars.org/talk/TropicalmathandML/30/
END:VEVENT
BEGIN:VEVENT
SUMMARY:Changqing Fu (CEREMADE\, Paris Dauphine University - PSL and Paris
  AI Institute (PRAIRIE))
DTSTART:20260223T080000Z
DTEND:20260223T090000Z
DTSTAMP:20260422T212610Z
UID:TropicalmathandML/31
DESCRIPTION:Title: <a href="https://researchseminars.org/talk/Tropicalmath
 andML/31/">Transformers as Effective Fields: From Quantum Physics to AI</a
 >\nby Changqing Fu (CEREMADE\, Paris Dauphine University - PSL and Paris A
 I Institute (PRAIRIE)) as part of Tropical mathematics and machine learnin
 g\n\n\nAbstract\nApproximation and algebraic theories are not yet sufficie
 nt to prove the optimality of Transformers: it is known that even shallow 
 infinite-width neural networks are approximately universal\, and ReLU netw
 orks are within the rational function class under tropical (max-plus) alge
 bra. However\, these facts still cannot explain the effectiveness of Trans
 formers\, since a constructive proof of their form is missing.\n\nIn this 
 talk\, we propose a novel theory to fully classify all possible neural net
 works and argue that linear/softmax Transformers are optimal under several
  minimal axioms. To model the reasoning process\, we treat the neural ODE 
 as the geodesics of some canonical field\, where time represents layer dep
 th. To model the interaction among concepts\, we pass from the vector flow
  to the matrix flow\, denoted as $\\bm X$\, whose rows are tokens and colu
 mns are neurons. The Transformer is then a natural consequence:\n\nLinear 
 Attention: the first interaction term under left unitary invariance.\n\nSo
 ftmax Attention: the entropic regularization of the field under left permu
 tation invariance.\n\nTwo-Layer ReLU Network: the projected gradient flow\
 , where the feasible set is conic and permutation invariant. \n\nGated Act
 ivation Network: the minimal nonlinear non-interactive field under left pe
 rmutation invariance.\n\nSparse Attention*: the token-pairwise non-commuta
 tive correction that leads to a mask on the attention matrix.\n\nIn conclu
 sion\, we provide a theoretical proof for why the “bitter lesson” hold
 s\, a theoretical guarantee for the technical path of Transformers\, and a
  paradigm to study the interpretability of intelligence.\n
LOCATION:https://researchseminars.org/talk/TropicalmathandML/31/
END:VEVENT
BEGIN:VEVENT
SUMMARY:Tom Jacobs (CISPA Helmholtz Center for Information Security)
DTSTART:20260209T080000Z
DTEND:20260209T090000Z
DTSTAMP:20260422T212610Z
UID:TropicalmathandML/32
DESCRIPTION:Title: <a href="https://researchseminars.org/talk/Tropicalmath
 andML/32/">Weight Decay Controls Implicit Regularization:  Insights on Gen
 eralization and Sparsity</a>\nby Tom Jacobs (CISPA Helmholtz Center for In
 formation Security) as part of Tropical mathematics and machine learning\n
 \n\nAbstract\nClassical statistics teaches us that overparameterization ca
 uses overfitting\, which prevents good generalization. However\, highly ov
 erparameterized neural network architectures generalize surprisingly well.
  This is because the training of these models tends towards low rank or sp
 arse solutions\, without requiring explicit constraints. This preference i
 s known as implicit regularization\, and it can be found in a variety of c
 ontexts\, including attention layers\, LoRA\, matrix sensing\, and diagona
 l linear networks. As a result\, implicit regularization helps explain how
  overfitting is avoided and generalization is improved in neural networks.
 \n\nIn this work I will show how weight decay controls implicit regulariza
 tion beyond its explicit role of constraining the model capacity. For inst
 ance\, it moves the implicit regularizer from $L_2$ to $L_1$\, which leads
  to more sparsity in the model. This demonstrates how weight decay not onl
 y serves as a model constraint\, but also has an implicit effect. When
  weight decay is turned off during training\, only the implicit effect
  remains\, resulting in better generalization overall. Besides better
  generalization\,
  I use these insights to induce sparsity in deep neural networks. Sparsifi
 cation aims to reduce model size and inference time by removing as many
  weights as possible. This results in PILoT (Parametric Implicit Lottery
  Ticket)\, a sparsification method from our previous work\, based on overp
 arameterization and weight decay that uses the transition of the implicit re
 gularization from $L_2$ to $L_1$ to gradually sparsify\, achieving high sp
 arsity with a smaller performance drop.\n\nTheoretically\, we use the conn
 ection between reparameterizations (a specific form of overparameterizatio
 n) and mirror flows (Riemannian gradient flows) and extend it to time-vary
 ing mirror flows. The mirror flow controls the implicit bias\, and weight
  decay in turn controls the time-varying mirror flow.\n
LOCATION:https://researchseminars.org/talk/TropicalmathandML/32/
END:VEVENT
BEGIN:VEVENT
SUMMARY:José Simental Rodríguez (UNAM)
DTSTART:20260323T010000Z
DTEND:20260323T020000Z
DTSTAMP:20260422T212610Z
UID:TropicalmathandML/33
DESCRIPTION:Title: <a href="https://researchseminars.org/talk/Tropicalmath
 andML/33/">Big hypercubes in the Bruhat order</a>\nby José Simental Rodr
 íguez (UNAM) as part of Tropical mathematics and machine learning\n\n\nAb
 stract\nThe Bruhat order is a basic structure on the symmetric group (or m
 ore generally\, any Coxeter group) that is surprisingly still very poorly 
 understood. Using permutations suggested by AlphaEvolve (an evolutionary c
 oding agent of Google DeepMind) we find an interval in the Bruhat order wh
 ose elements can be explicitly described in terms of binary expansions and
  that\, moreover\, form a hypercube of dimension much larger than one
  would initially expect to be possible within the symmetric group. In fac
 t\, the dim
 ension of this hypercube matches\, up to a constant\, the dimension of the
  largest possible hypercube that can appear as an interval in the symmetri
 c group. Time permitting\, I will elaborate on the consequences that the e
 xistence of these big hypercubes has on the recently discovered cluster st
 ructures on open Richardson varieties. This is joint work with Jordan Elle
 nberg\, Nicolás Libedinsky\, David Plaza\, and Geordie Williamson.\n
LOCATION:https://researchseminars.org/talk/TropicalmathandML/33/
END:VEVENT
BEGIN:VEVENT
SUMMARY:Yu Tian (Max Planck Institute for Physics of Complex Systems (MPI 
 PKS))
DTSTART:20260511T080000Z
DTEND:20260511T090000Z
DTSTAMP:20260422T212610Z
UID:TropicalmathandML/34
DESCRIPTION:by Yu Tian (Max Planck Institute for Physics of Complex System
 s (MPI PKS)) as part of Tropical mathematics and machine learning\n\nAbstr
 act: TBA\n
LOCATION:https://researchseminars.org/talk/TropicalmathandML/34/
END:VEVENT
BEGIN:VEVENT
SUMMARY:Austin Rodriguez (Michigan State University)
DTSTART:20260330T010000Z
DTEND:20260330T020000Z
DTSTAMP:20260422T212610Z
UID:TropicalmathandML/35
DESCRIPTION:Title: <a href="https://researchseminars.org/talk/Tropicalmath
 andML/35/">Projected Hessian Learning: Scalable Curvature-Aware Training f
 or Reactive Machine Learning Interatomic Potentials</a>\nby Austin Rodrigu
 ez (Michigan State University) as part of Tropical mathematics and machine
  learning\n\n\nAbstract\nMachine learning interatomic potentials (MLIPs) a
 re transforming reactive chemistry\, materials discovery\, and molecular d
 esign by enabling near-quantum-chemical predictions of potential energies 
 and forces at far lower cost than direct electronic structure calculations
 . While training to energies and forces improves the description of potent
 ial energy surfaces (PESs)\, these quantities alone do not fully capture l
 ocal curvature. The Hessian matrix\, which contains second-derivative info
 rmation\, provides a much richer description of PES topology and can impro
 ve extrapolation to nonequilibrium geometries\, reaction pathway modeling\
 , transition-state characterization\, vibrational analysis\, molecular dyn
 amics\, and nudged elastic band calculations. However\, conventional Hessi
 an training is often impractical because explicit construction and storage
  of Hessian matrices scale quadratically in memory and computational cost.
 \n\nTo address this limitation\, we introduce Projected Hessian Learning (PH
 L)\, a scalable second-order training framework that incorporates curvatur
 e information through Hessian-vector products (HVPs) rather than full Hess
 ian matrices. PHL uses stochastic probe directions and an unbiased trace-b
 ased loss to inject second-order information with favorable scaling\, avoi
 ding the prohibitive cost of explicit Hessian supervision. We evaluate thi
 s approach on a chemically diverse reactive dataset containing reactants\,
  products\, transition states\, intrinsic reaction coordinate geometries\,
  and normal-mode sampled structures computed at the ωB97X-D/6-31G(d) leve
 l of theory. Models trained only on equilibrium geometries and first-order
  saddle points are assessed for their ability to extrapolate to nonequilib
 rium configurations. Compared with conventional energy-force training\, cu
 rvature-informed models substantially improve predictions of energies\, fo
 rces\, and Hessian-related properties for unseen geometries\, while also r
 educing the amount of training data required to achieve strong performance
 . Moreover\, randomized HVP-based PHL schemes recover nearly all the benef
 its of full Hessian training while achieving substantial speedups and avoi
 ding quadratic memory growth. These results show that PHL provides a pract
 ical route to scalable second-order MLIP training\, retaining the accuracy
  and transferability benefits of Hessian information while extending curva
 ture-aware learning to larger and more complex molecular systems.\n
LOCATION:https://researchseminars.org/talk/TropicalmathandML/35/
END:VEVENT
BEGIN:VEVENT
SUMMARY:Hana Dal Poz Kouřimská (University of Potsdam)
DTSTART:20260427T070000Z
DTEND:20260427T080000Z
DTSTAMP:20260422T212610Z
UID:TropicalmathandML/36
DESCRIPTION:Title: <a href="https://researchseminars.org/talk/Tropicalmath
 andML/36/">Topological data analysis: theory\, applications\, and generali
 zations</a>\nby Hana Dal Poz Kouřimská (University of Potsdam) as part o
 f Tropical mathematics and machine learning\n\n\nAbstract\nTopological Dat
 a Analysis (TDA) is a rapidly growing field at the intersection of mathema
 tics and computer science. Its popularity stems from a unique ability to e
 xtract robust geometric features—essentially the "shape" of the data—f
 rom complex\, high-dimensional datasets. \n\nIn this talk\, I will walk yo
 u through the theoretical basics of TDA. We will explore what makes it suc
 h a powerful tool\, while also discussing the current methodological and c
 omputational challenges the field faces. Building on this foundation\, we 
 will dive into a selection of concrete applications to see how TDA is used
  in practice. \n\nFinally\, I will sketch a modern generalization of TDA t
 hat bridges the gap between pure topology\, statistics\, and machine learn
 ing\, offering a glimpse into where the field is heading next.\n
LOCATION:https://researchseminars.org/talk/TropicalmathandML/36/
END:VEVENT
BEGIN:VEVENT
SUMMARY:Sherkhon Azimov (Pusan National University)
DTSTART:20260406T060000Z
DTEND:20260406T070000Z
DTSTAMP:20260422T212610Z
UID:TropicalmathandML/37
DESCRIPTION:Title: <a href="https://researchseminars.org/talk/Tropicalmath
 andML/37/">Adaptive Nonlinear Vector Autoregression: Robust Forecasting fo
 r Noisy Chaotic Time Series</a>\nby Sherkhon Azimov (Pusan National Univer
 sity) as part of Tropical mathematics and machine learning\n\n\nAbstract\n
 Nonlinear vector autoregression (NVAR) and reservoir computing (RC) have s
 hown promise in forecasting chaotic dynamical systems\, such as the Lorenz
 -63 model and El Niño-Southern Oscillation. However\, their reliance on fi
 xed nonlinear transformations - polynomial expansions in NVAR or random fe
 ature maps in RC - limits their adaptability to high noise or complex real
 -world data. Furthermore\, these methods also exhibit poor scalability in 
 high-dimensional settings due to costly matrix inversion during optimizati
 on. We propose a data-adaptive NVAR model that combines delay-embedded lin
 ear inputs with features generated by a shallow\, trainable multilayer per
 ceptron (MLP). Unlike standard NVAR and RC models\, the MLP and linear rea
 dout are jointly trained using gradient-based optimization\, enabling the 
 model to learn data-driven nonlinearities\, while preserving a simple read
 out structure and improving scalability. Initial experiments across multip
 le chaotic systems\, tested under noise-free and synthetically noisy condi
 tions\, showed that the adaptive model outperformed the standard NVAR\, a
  leaky echo state network (ESN) - the most common RC model - and a hybrid
  ESN in predictive accuracy\, demonstrating robust forecasting under noisy
  conditions.\n
LOCATION:https://researchseminars.org/talk/TropicalmathandML/37/
END:VEVENT
BEGIN:VEVENT
SUMMARY:Shailesh Lal (Beijing Institute of Mathematical Sciences and Appli
 cations)
DTSTART:20260602T070000Z
DTEND:20260602T080000Z
DTSTAMP:20260422T212610Z
UID:TropicalmathandML/38
DESCRIPTION:by Shailesh Lal (Beijing Institute of Mathematical Sciences an
 d Applications) as part of Tropical mathematics and machine learning\n\nAb
 stract: TBA\n
LOCATION:https://researchseminars.org/talk/TropicalmathandML/38/
END:VEVENT
BEGIN:VEVENT
SUMMARY:Kelly Maggs (Max Planck Institute of Molecular Cell Biology and Ge
 netics)
DTSTART:20260608T080000Z
DTEND:20260608T090000Z
DTSTAMP:20260422T212610Z
UID:TropicalmathandML/39
DESCRIPTION:by Kelly Maggs (Max Planck Institute of Molecular Cell Biology
  and Genetics) as part of Tropical mathematics and machine learning\n\n\nA
 bstract\nSingle-cell sequencing data consists of a point cloud where the p
 oints are cells and the coordinates are RNA expression levels of each gene
 . Since the tissue is destroyed by the sequencing procedure\, the dynamics
  of g
 ene expression must be inferred from the structure and geometry of the poi
 nt cloud. In this talk\, we will build a biological interpretation of the 
 one-dimensional cohomology classes in hallmark gene subsets as models for 
 biological processes. Such processes include the cell-cycle\, but more gen
 erally model homeostatic negative feedback loops. Our procedure uses persi
 stent cohomology to identify features\, and integration of differential fo
 rms to estimate the cascade of genes associated with the underlying dynami
 cs of gene expression.\n
LOCATION:https://researchseminars.org/talk/TropicalmathandML/39/
END:VEVENT
BEGIN:VEVENT
SUMMARY:Helen Jenne (Pacific Northwest National Laboratory)
DTSTART:20260601T010000Z
DTEND:20260601T020000Z
DTSTAMP:20260422T212610Z
UID:TropicalmathandML/40
DESCRIPTION:by Helen Jenne (Pacific Northwest National Laboratory) as part
  of Tropical mathematics and machine learning\n\nAbstract: TBA\n
LOCATION:https://researchseminars.org/talk/TropicalmathandML/40/
END:VEVENT
END:VCALENDAR
