Modern Hopfield Networks in AI and Neurobiology

Dmitry Krotov (MIT-IBM Watson AI Lab, IBM Research, Cambridge)

14-Apr-2022, 16:00-17:00

Abstract:

Modern Hopfield Networks, or Dense Associative Memories, are recurrent neural networks with fixed-point attractor states that are described by an energy function. In contrast to conventional Hopfield Networks, their modern versions have a very large memory storage capacity, which makes them appealing tools for many problems in machine learning, cognitive science, and neuroscience. In this talk I will introduce the intuition behind this class of models and their mathematical formulation, and will give examples of problems in AI that can be tackled using these new ideas. I will also explain how different individual models of this class (e.g. hierarchical memories, the attention mechanism in transformers, etc.) arise from the general mathematical formulation through the choice of Lagrangian functions.
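
For concreteness, a schematic form of the energy function underlying these models, following references 1, 3, and 4 below (the notation here is ours, not taken from the talk), is

  E(\sigma) = -\sum_{\mu=1}^{K} F\left(\xi^{\mu}\cdot\sigma\right),

where \xi^{1},\dots,\xi^{K} are the stored memories, \sigma is the state of the network, and F is a rapidly growing function of the overlap between the state and each memory. The choice F(x)=x^{n} gives a storage capacity growing as N^{n-1} in the number of neurons N (reference 1), F(x)=e^{x} gives exponential capacity (reference 3), and the log-sum-exp energy

  E(\sigma) = -\frac{1}{\beta}\log\sum_{\mu=1}^{K} e^{\beta\,\xi^{\mu}\cdot\sigma} + \frac{1}{2}\,\sigma\cdot\sigma + \mathrm{const}

has an update rule that coincides with the softmax attention of transformers (reference 4).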

References:

  1. D. Krotov, J. Hopfield, "Dense associative memory for pattern recognition"
  2. D. Krotov, J. Hopfield, "Large Associative Memory Problem in Neurobiology and Machine Learning"
  3. M. Demircigil et al., "On a model of associative memory with huge storage capacity"
  4. H. Ramsauer et al., "Hopfield Networks is All You Need"
  5. D. Krotov, "Hierarchical Associative Memory"
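
To sketch the Lagrangian formulation mentioned in the abstract (following references 2 and 5; the notation below is ours and only schematic), the general model has visible neurons x_i and memory neurons h_\mu coupled through a matrix \xi_{\mu i}, with activations defined as gradients of two Lagrangian functions, g_i = \partial L_x/\partial x_i and f_\mu = \partial L_h/\partial h_\mu. The energy

  E = \sum_i x_i g_i - L_x + \sum_\mu h_\mu f_\mu - L_h - \sum_{\mu,i} f_\mu\,\xi_{\mu i}\,g_i

decreases along the dynamics \tau\,\dot{x}_i = \sum_\mu \xi_{i\mu} f_\mu - x_i and \tau\,\dot{h}_\mu = \sum_i \xi_{\mu i} g_i - h_\mu, provided the Hessians of the Lagrangians are positive semi-definite. Different choices of L_x and L_h then yield the individual models: a softmax Lagrangian L_h = \frac{1}{\beta}\log\sum_\mu e^{\beta h_\mu} produces the attention-like update of reference 4, while stacking several such layers gives the hierarchical associative memories of reference 5.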


Topics: data structures and algorithms; machine learning; mathematical physics; information theory; optimization and control; data analysis, statistics and probability

Audience: researchers in the topic


Mathematics, Physics and Machine Learning (IST, Lisbon)

Series comments: To receive the series announcements, please register at:
mpml.tecnico.ulisboa.pt
mpml.tecnico.ulisboa.pt/registration
Zoom link: videoconf-colibri.zoom.us/j/91599759679

Organizers: Mário Figueiredo, Tiago Domingos, Francisco Melo, Jose Mourao*, Cláudia Nunes, Yasser Omar, Pedro Alexandre Santos, João Seixas, Cláudia Soares, João Xavier
*contact for this listing
