BEGIN:VCALENDAR
VERSION:2.0
PRODID:researchseminars.org
CALSCALE:GREGORIAN
X-WR-CALNAME:researchseminars.org
BEGIN:VEVENT
SUMMARY:M. Graham\, A. Linot (Univ. of Wisconsin-Madison)
DTSTART:20211020T190000Z
DTEND:20211020T200000Z
DTSTAMP:20260422T225843Z
UID:CNSwebinar/1
DESCRIPTION:Title: <a href="https://researchseminars.org/talk/CNSwebinar/1
 /">Data-driven dimension reduction\, dynamic modeling\, and control of com
 plex chaotic systems</a>\nby M. Graham\, A. Linot (Univ. of Wisconsin-Madi
 son) as part of Georgia Tech CNS Nonlinear Webinar\n\n\nAbstract\nOur over
 all aim is to combine ideas from dynamical systems theory and machine lear
 ning to develop and apply reduced-order models of flow processes with comp
 lex chaotic dynamics. A particular aim is a minimal description of dynamic
 s on manifolds of dimension much less than the nominal state dimension and
  use of these models to develop effective control strategies for reducing 
 energy dissipation.\n\nAlec Linot: Modeling chaotic spatiotemporal dynamic
 s with a minimal representation using Neural ODEs\n\nSolutions to dissipat
 ive partial differential equations that exhibit chaotic dynamics often evo
 lve to attractors that exist on finite-dimensional manifolds. We describe 
 a data-driven reduced order modelling (ROM) method to find the coordinates
  on this manifold and find an ordinary differential equation (ODE) in thes
 e coordinates. The manifold coordinates are found by reducing the system d
 imension via an undercomplete autoencoder – a neural network that reduce
 s then expands dimension – and an ODE is learned in this coordinate syst
 em with a Neural ODE. Learning an ODE\, instead of a discrete time-map\, a
 llows us to evolve trajectories arbitrarily far forward\, and allows for t
 raining on unevenly and/or widely spaced data in time. We test on the Kura
 moto-Sivashinsky equation for domain sizes that exhibit spatiotemporal cha
 os\, and find the ROM gives accurate short- and long-time statistics wit
 h training data separated up to 0.7 Lyapunov times.\nhttps://arxiv.org/abs
 /2109.00060\n
LOCATION:https://researchseminars.org/talk/CNSwebinar/1/
END:VEVENT
BEGIN:VEVENT
SUMMARY:K. Zeng\, Carlo Perez de Jesus\, Daniel Floryan (Univ. of Wisconsi
 n-Madison\, Univ. of Houston)
DTSTART:20211027T190000Z
DTEND:20211027T200000Z
DTSTAMP:20260422T225843Z
UID:CNSwebinar/2
DESCRIPTION:Title: <a href="https://researchseminars.org/talk/CNSwebinar/2
 /">Charting dynamics from data</a>\nby K. Zeng\, Carlo Perez de Jesus\, Da
 niel Floryan (Univ. of Wisconsin-Madison\, Univ. of Houston) as part of Ge
 orgia Tech CNS Nonlinear Webinar\n\n\nAbstract\nKevin Zeng: Deep Reinforce
 ment Learning Using Data-Driven Reduced-Order Models Discovers and Stabili
 zes Low Dissipation Equilibria\n\nDeep reinforcement learning (RL)\, a dat
 a-driven method capable of discovering complex control strategies for high
 -dimensional systems\, requires substantial interactions with the target s
 ystem\, making it costly when the system is computationally or experimenta
 lly expensive (e.g. flow control). We mitigate this challenge by combining
  dimension reduction via an autoencoder with a neural ODE framework to lea
 rn a low-dimensional dynamical model\, which we substitute in place of the
  true system during RL training to efficiently estimate the control policy
 . We apply our method to data from the Kuramoto-Sivashinsky equation. With
  a goal of minimizing dissipation\, we extract control policies from the m
 odel using RL and show that the model-based strategies perform well on the
  full dynamical system and highlight that the RL agent discovers and stabi
 lizes a forced equilibrium solution\, despite never having been given expl
 icit information about this state’s existence.\nhttps://arxiv.org/abs/21
 04.05437\n\nCarlo Perez de Jesus\nDept. of Chemical and Biological Enginee
 ring\, Univ. of Wisconsin-Madison\n\nData-driven estimation of inertial ma
 nifold dimension for chaotic Kolmogorov flow and time evolution on the man
 ifold\n\nModel reduction techniques have previously been applied to evolve
  the Navier-Stokes equations in time\; however\, finding the minimal dimen
 sion needed to correctly capture the key dynamics is not a trivial task. T
 o estimate this dimension\, we trained an undercomplete autoencoder on wea
 kly chaotic vorticity data (32x32 grid) from Kolmogorov flow simulations\, tra
 cking the reconstruction error as a function of dimension. We also trained
  a discrete time stepper that evolves the reduced order model with a nonli
 near dense neural network. The trajectory travels in the vicinity of relat
 ive periodic orbits (RPOs) followed by sporadic bursting events. At a dime
 nsion of five (as opposed to the full state dimension of 1024)\, the power
  input-dissipation probability density function is well approximated\; Fourier
  coefficient evolution shows that the trajectory correctly captures the he
 teroclinic connections (bursts) between the different RPOs\, and the predi
 ction and true data track each other for approximately a Lyapunov time.\n\
 nDaniel Floryan https://dfloryan.github.io/\nMechanical Engineering at the
  University of Houston\n\nCharting dynamics from data\n\nWe often find our
 selves working with systems for which governing equations are unknown\, or
  if they are known\, they may be high-dimensional to the point of being di
 fficult to analyze and prohibitively expensive to make predictions with. T
 hese difficulties\, together with the ever-increasing availability of data
 \, have led to the new paradigm of data-driven model discovery. I will pre
 sent recent work that fruitfully combines a classical idea from applied ma
 thematics with modern methods of machine learning to learn minimal dynamic
 al models directly from time series data. In full analogy with cartography
 \, we learn a representation of a system as an atlas of charts. This appro
 ach allows us to obtain dynamical models of the lowest possible dimension\
 , leads to computational benefits\, and can separate state space into regi
 ons of distinct behaviors.\nhttps://arxiv.org/abs/2108.05928
 \n
LOCATION:https://researchseminars.org/talk/CNSwebinar/2/
END:VEVENT
BEGIN:VEVENT
SUMMARY:Vladimir Rosenhaus
DTSTART:20220412T150000Z
DTEND:20220412T160000Z
DTSTAMP:20260422T225843Z
UID:CNSwebinar/3
DESCRIPTION:Title: <a href="https://researchseminars.org/talk/CNSwebinar/3
 /">Feynman rules for wave turbulence</a>\nby Vladimir Rosenhaus as part o
 f Georgia Tech CNS Nonlinear Webinar\n\n\nAbstract\nIt has long been known
  that weakly nonlinear field theories can have a late-time stationary stat
 e that is not the thermal state\, but a wave turbulent state (the Kolmogor
 ov-Zakharov state) with a far-from-equilibrium cascade of energy. We go be
 yond the existence of the wave turbulent state\, studying fluctuations abo
 ut the wave turbulent state. Specifically\, we take a classical field theo
 ry with an arbitrary quartic interaction and add dissipation and Gaussian-
 random forcing. Employing the path integral relation between stochastic c
 lassical field theories and quantum field theories\, we give a prescriptio
 n\, in terms of Feynman diagrams\, for computing correlation functions in
  this system. We explicitly compute the two-point and four-point function
 s of the field to next-to-leading order in the coupling. Through an approp
 riate choice of forcing and dissipation\, these correspond to correlation 
 functions in the wave turbulent state. As a check\, we reproduce the next
 -to-leading order term in the kinetic equation. The correlation functions
  and corrections to the KZ state that we compute should\, in principle\, b
 e experimentally measurable quantities.\n
LOCATION:https://researchseminars.org/talk/CNSwebinar/3/
END:VEVENT
BEGIN:VEVENT
SUMMARY:Zeb Rocklin
DTSTART:20220419T150000Z
DTEND:20220419T160000Z
DTSTAMP:20260422T225843Z
UID:CNSwebinar/4
DESCRIPTION:Title: <a href="https://researchseminars.org/talk/CNSwebinar/4
 /">Rigidity percolation in a random tensegrity via analytic graph theory</
 a>\nby Zeb Rocklin as part of Georgia Tech CNS Nonlinear Webinar\n\n\nAbst
 ract\nTensegrities are mechanical structures that include cable-like eleme
 nts that are strong and lightweight relative to rigid rods yet support onl
 y extensile stress. From suspension bridges to the musculoskeletal system 
 to individual biological cells\, humanity makes excellent use of tensegrit
 ies\, yet the sharply nonlinear response of cables presents serious challe
 nges to analytical theory. Here we consider large tensegrity structures wi
 th randomly placed cables (and struts) overlaid on a regular rigid backbon
 e whose corresponding system of inequalities is reduced via analytic theor
 y to an exact graph theory. We identify a novel coordination number that c
 ontrols two rigidity percolation transitions: one in which global interact
 ions between cables first support external loads and one in which the stru
 cture becomes fully rigid. We show that even the addition of a few cables
  strongly modifies conventional rigidity percolation\, both by modifying t
 he sharpness of the transition and by introducing avalanche effects in whi
 ch a single constraint can eliminate multiple floppy modes.\n
LOCATION:https://researchseminars.org/talk/CNSwebinar/4/
END:VEVENT
END:VCALENDAR
