Discrete neural nets and polymorphic learning

Charlotte Aten (Denver)

06-Sep-2023, 14:00-15:00

Abstract: Classical neural network learning techniques have primarily focused on optimization in a continuous setting. Early results in the area showed that many activation functions can be used to build neural nets representing any function, but of course this universality also permits overfitting. To ameliorate this deficiency, one seeks to reduce the search space of possible functions to a special class which preserves some relevant structure. I will propose a quite general solution to this problem: use polymorphisms of a relevant discrete relational structure as activation functions. I will give some concrete examples of this, then hint that this specific case is actually of broader applicability than one might guess.
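To make the central notion concrete: an operation on a finite set is a polymorphism of a relation if it preserves that relation componentwise. The sketch below is not taken from the talk; the choice of structure (a small ordered set) and all names are illustrative. It checks the polymorphism property and shows how such an operation could serve as a discrete activation function.

```python
# A minimal sketch (illustrative, not from the talk): on a finite ordered set,
# operations such as max preserve the order relation componentwise, i.e. they
# are polymorphisms of it, so a net using them as activations can only compute
# order-preserving functions.

from itertools import product

DOMAIN = range(3)                                          # the finite set {0, 1, 2}
LEQ = {(a, b) for a in DOMAIN for b in DOMAIN if a <= b}   # the order relation

def is_polymorphism(op, relation, arity=2):
    """Check that op, applied coordinatewise to related tuples, yields a related pair."""
    return all(
        (op(*xs), op(*ys)) in relation
        for pairs in product(relation, repeat=arity)
        for xs, ys in [tuple(zip(*pairs))]
    )

assert is_polymorphism(max, LEQ)                           # max preserves <=
assert is_polymorphism(min, LEQ)                           # so does min
assert not is_polymorphism(lambda a, b: (a + b) % 3, LEQ)  # addition mod 3 does not

# A two-input "neuron" whose activation is a polymorphism; any net built from
# such neurons automatically preserves the relation, shrinking the search space.
def neuron(x, y, activation=max):
    return activation(x, y)
```

The design point is that the structural guarantee comes for free: since composing polymorphisms yields polymorphisms, every function the net represents preserves the chosen relation, which is exactly the restriction of the hypothesis space the abstract describes.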

Topics: machine learning, mathematical physics, commutative algebra, algebraic geometry, algebraic topology, combinatorics, differential geometry, number theory, representation theory

Audience: researchers in the topic


Machine Learning Seminar

Series comments: Machine learning in pure mathematics seminar, typically held on Wednesdays. The seminar takes place online via Zoom.

For recordings of past talks and copies of speakers' slides, please visit the seminar homepage at: kasprzyk.work/seminars/ml.html

Organizers: Alexander Kasprzyk*, Lorenzo De Biase*, Tom Oliver, Sara Veneziale
*contact for this listing
