Exploring group equivariant neural networks using set partition diagrams

Edward Pearce-Crump (Imperial)

21-Jun-2023, 09:00-10:00

Abstract: What do jellyfish and an 11th century Japanese novel have to do with neural networks? In recent years, much attention has been given to developing neural network architectures that can efficiently learn from data with underlying symmetries. These architectures ensure that the learned functions satisfy a geometric property called group equivariance, which specifies how the output must transform when the input is transformed under the action of a symmetry group. In this talk, we will describe a number of new group equivariant neural network architectures that are built using tensor power spaces of $\mathbb{R}^n$ as their layers. We will show that the learnable, linear functions between these layers can be characterised by certain subsets of set partition diagrams. This talk is based on several papers appearing in ICML 2023.

machine learning, mathematical physics, commutative algebra, algebraic geometry, algebraic topology, combinatorics, differential geometry, number theory, representation theory

Audience: researchers in the topic


Machine Learning Seminar

Series comments: Online seminar on machine learning in pure mathematics, typically held on Wednesdays via Zoom.

For recordings of past talks and copies of the speakers' slides, please visit the seminar homepage at: kasprzyk.work/seminars/ml.html

Organizers: Alexander Kasprzyk*, Lorenzo De Biase*, Tom Oliver, Sara Veneziale
*contact for this listing
