Graph Nets: The Next Generation
Max Welling (University of Amsterdam)
Abstract: In this talk I will introduce our next generation of graph neural networks. GNNs have the property that they are invariant to permutations of the nodes in the graph and to rotations of the graph as a whole. We claim this is unnecessarily restrictive, and in this talk we will explore extensions of these GNNs to more flexible equivariant constructions. In particular, Natural Graph Networks for general graphs are globally equivariant under permutations of the nodes but can still be executed through local message-passing protocols. Our mesh-CNNs on manifolds are equivariant under SO(2) gauge transformations and, as such, unlike regular GNNs, admit non-isotropic kernels. Finally, our SE(3)-Transformers are local message-passing GNNs that are invariant to permutations but equivariant to global SE(3) transformations. These developments clearly emphasize the importance of geometry and symmetries as design principles for graph (or other) neural networks.
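To make the symmetry property the abstract starts from concrete, here is a minimal sketch (not the speaker's code, and not the extended constructions such as Natural Graph Networks, mesh-CNNs, or SE(3)-Transformers) of ordinary permutation-equivariant message passing, with a numerical check that a sum-pooled readout is permutation invariant. The function names `message_passing` and `readout` are hypothetical.

```python
# Illustrative sketch only: one round of message passing on a graph,
# checking equivariance of node features and invariance of the pooled readout
# under a relabelling (permutation) of the nodes.
import numpy as np

def message_passing(A, X, W_self, W_neigh):
    """One layer: each node combines its own features with a sum over neighbours.

    A: (n, n) adjacency matrix, X: (n, d) node features.
    Sum aggregation commutes with node permutations, which is the source of
    the permutation equivariance referred to in the abstract.
    """
    return np.tanh(X @ W_self + A @ X @ W_neigh)

def readout(H):
    """Sum over nodes: a graph-level representation, invariant to node order."""
    return H.sum(axis=0)

rng = np.random.default_rng(0)
n, d = 5, 3
A = rng.integers(0, 2, size=(n, n))
A = np.triu(A, 1); A = A + A.T                     # undirected, no self-loops
X = rng.normal(size=(n, d))
W_self, W_neigh = rng.normal(size=(d, d)), rng.normal(size=(d, d))

P = np.eye(n)[rng.permutation(n)]                  # random permutation matrix

H = message_passing(A, X, W_self, W_neigh)
H_perm = message_passing(P @ A @ P.T, P @ X, W_self, W_neigh)

print(np.allclose(P @ H, H_perm))                  # equivariance of node features
print(np.allclose(readout(H), readout(H_perm)))    # invariance of the readout
```

Because the aggregation is an unordered sum, permuting the rows of A and X simply permutes the rows of the output; the pooled readout is then identical, which is the baseline invariance the talk proposes to relax into richer equivariant designs.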
Joint with: Pim de Haan and Taco Cohen (Natural Graph Networks); Pim de Haan, Maurice Weiler, and Taco Cohen (Mesh-CNNs); Fabian Fuchs and Daniel Worrall (SE(3)-Transformers)
Topics: bioinformatics, game theory, information theory, machine learning, neural and evolutionary computing, classical analysis and ODEs, optimization and control, statistics theory
Audience: researchers in the topic
IAS Seminar Series on Theoretical Machine Learning
Series comments: Seminar series focusing on machine learning. Open to all.
Register in advance at forms.gle/KRz8hexzxa5P4USr7 to receive the Zoom link and password. Recordings of past seminars can be found at www.ias.edu/video-tags/seminar-theoretical-machine-learning
Organizers: Ke Li*, Sanjeev Arora
*contact for this listing
