Sign Sequence Combinatorics for Topological Measures of ReLU Neural Networks
Marissa Masden (University of Puget Sound)
Abstract: A (ReLU) neural network is a type of piecewise linear (PL) function F which induces a canonical polyhedral subdivision, $\mathcal C(F)$, on its input space (Grigsby and Lindsey, 2022). This class of functions is commonly used in modern machine learning applications. Following a brief introduction to these functions and a topological perspective on data classification, we will discuss how this polyhedral complex on the input space arises from hyperplane arrangements. The face poset of this polyhedral complex (for a given ReLU neural network) is entirely determined by combinatorial "sign sequence" data attached to the vertices of the complex. We will explore how combinatorial properties of this face poset may be used to compute topological measures of a given ReLU function, such as its level set topology, its critical points, and (most recently) a discrete gradient vector field agreeing with the function, among other useful measures. We will then demonstrate how these measures may be used to understand ReLU neural networks as a class of functions.
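For readers unfamiliar with sign sequences, the sketch below (not taken from the talk; the weights, the single-hidden-layer setup, and the helper name `sign_sequence` are illustrative assumptions) shows the basic idea: each hidden ReLU neuron defines a hyperplane, and recording the sign of each pre-activation labels the cell of the induced subdivision containing a given input point.

```python
# Minimal sketch, assuming a single hidden layer of 4 ReLU neurons on a 2D input.
# Each neuron's pre-activation w_i . x + b_i vanishes on a hyperplane; the tuple
# of signs in {-1, 0, +1} is the "sign sequence" of the point x. Points sharing a
# sign sequence lie in the same cell of the subdivision; deeper networks collect
# such signs layer by layer through the composed map.
import numpy as np

rng = np.random.default_rng(0)
W = rng.standard_normal((4, 2))   # illustrative hidden-layer weights
b = rng.standard_normal(4)        # illustrative hidden-layer biases

def sign_sequence(x, tol=1e-9):
    """Return the per-neuron sign pattern of x as a tuple in {-1, 0, +1}^4."""
    pre = W @ x + b                                   # hidden pre-activations
    signs = np.where(pre > tol, 1, np.where(pre < -tol, -1, 0))
    return tuple(int(s) for s in signs)

# Two sample points; zeros in a sequence indicate the point lies on one or more
# of the neurons' hyperplanes (e.g. at a vertex of the complex).
print(sign_sequence(np.array([0.5, -1.0])))
print(sign_sequence(np.array([2.0, 3.0])))
```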
machine learning, mathematical physics, algebraic geometry, combinatorics
Audience: learners
Series: Tropical mathematics and machine learning
Series comments: Besides ML and tropical math, anything tech+math is welcome.
| Organizer: | Eric Dolores-Cuenca (contact for this listing) |
