Continuous-in-Depth Neural Networks through Interpretation of Learned Dynamics

Alejandro Queiruga (Google, LLC)

01-Jul-2020, 16:25-16:50

Abstract: Data-driven learning of dynamical systems is of interest to the scientific community, which wants to recover information about the true physics from the discretized model, and to the machine learning community, which wants to improve model interpretability and performance. We present a refined interpretation of learned dynamical models by investigating canonical systems. Recent ML literature draws a metaphor between the residual components of neural networks and a forward Euler time integrator, but we show that these components actually learn a more accurate integrator. We examine the harmonic oscillator, the 1D wave equation, and the pendulum in two forms, using purely linear models, feed-forward shallow neural networks, and neural networks embedded in time integrators. Each model configuration overfits to a better operator than commonly understood, confounding both recovery of the physics and attempts to improve the algorithms. We show two analytical methods for reconstructing underlying operators from linear systems. For the nonlinear problems, unmodified neural networks outperform the expected numerical methods but do not allow for inspection or generalization. Embedding the models in integrators such as RK4 improves performance and generalizability. However, for the constrained pendulum, the model is still better than expected, exhibiting better-than-expected stiffness-stability. We conclude by revisiting the components of neural networks where improvements are suggested.
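The forward Euler metaphor and the RK4 embedding mentioned in the abstract can be sketched as follows. This is a minimal illustration, not the talk's actual models: the "learned" vector field `f` here is a hand-written linear stand-in for a trained network, using the harmonic oscillator as the canonical system.

```python
import numpy as np

# Stand-in for a learned vector field: the harmonic oscillator dx/dt = A x.
# In the talk's setting, f would be a trained neural network.
A = np.array([[0.0, 1.0], [-1.0, 0.0]])

def f(x):
    return A @ x

def euler_step(x, h):
    # Residual-block metaphor: x_{n+1} = x_n + h * f(x_n) (forward Euler).
    return x + h * f(x)

def rk4_step(x, h):
    # Embedding the same model in a classical RK4 integrator.
    k1 = f(x)
    k2 = f(x + 0.5 * h * k1)
    k3 = f(x + 0.5 * h * k2)
    k4 = f(x + h * k3)
    return x + (h / 6.0) * (k1 + 2 * k2 + 2 * k3 + k4)

# Integrate roughly one period; the exact solution is [cos t, -sin t].
h = 0.1
steps = int(round(2 * np.pi / h))
x_euler = x_rk4 = np.array([1.0, 0.0])
for _ in range(steps):
    x_euler = euler_step(x_euler, h)
    x_rk4 = rk4_step(x_rk4, h)

t_end = steps * h
x_exact = np.array([np.cos(t_end), -np.sin(t_end)])
err_euler = np.linalg.norm(x_euler - x_exact)
err_rk4 = np.linalg.norm(x_rk4 - x_exact)
```

At the same step size, the forward Euler update visibly spirals outward while the RK4 wrapper stays on the true trajectory, which is the gap between the two integrator interpretations the talk examines.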

machine learning, dynamical systems, applied physics

Audience: researchers in the topic


Workshop on Scientific-Driven Deep Learning (SciDL)

Series comments: When: 8:00-14:30 (PST) on Wednesday, July 1, 2020. Where: berkeley.zoom.us/j/95609096856. Details: scidl.netlify.app/

Organizers: N. Benjamin Erichson*, Michael Mahoney, Steven Brunton, Nathan Kutz
*contact for this listing
