BEGIN:VCALENDAR
VERSION:2.0
PRODID:researchseminars.org
CALSCALE:GREGORIAN
X-WR-CALNAME:researchseminars.org
BEGIN:VEVENT
SUMMARY:George Em Karniadakis (Brown University)
DTSTART:20200701T151000Z
DTEND:20200701T160000Z
DTSTAMP:20260423T022713Z
UID:SciDL/2
DESCRIPTION:Title: <a href="https://researchseminars.org/talk/SciDL/2/">De
 epONet: Learning nonlinear operators based on the universal approximation 
 theorem of operators</a>\nby George Em Karniadakis (Brown University) as p
 art of Workshop on Scientific-Driven Deep Learning (SciDL)\n\n\nAbstract\n
 It is widely known that neural networks (NNs) are universal approximators 
 of continuous functions. However\, a less-known but powerful result is 
 that an NN with a single hidden layer can accurately approximate any 
 nonlinear continuous operator. This universal approximation theorem for 
 operators suggests the potential of NNs to learn any continuous operator 
 or complex system from scattered data. To realize this theorem\, we 
 design a new NN with small generalization error\, the deep operator 
 network (DeepONet)\, consisting of an NN that encodes the discrete input 
 function space (branch net) and another NN that encodes the domain of 
 the output functions (trunk net). We demonstrate that DeepONet can learn 
 various explicit operators\, e.g.\, integrals and fractional Laplacians\, 
 as well as implicit operators that represent deterministic and stochastic 
 differential equations. In particular\, we study different formulations 
 of the input function space and their effect on the generalization 
 error.\n
LOCATION:https://researchseminars.org/talk/SciDL/2/
END:VEVENT
END:VCALENDAR
