Modern Hopfield networks

9 December 2020
7 to 8 PM CET

Recent ground-breaking research by Sepp Hochreiter, John Hopfield, and Dmitry Krotov has reignited interest in modern Hopfield networks and associative memories.

The paper “Hopfield Networks is All You Need” by Sepp Hochreiter’s team introduced a continuous modern Hopfield network with exponential storage capacity and very fast convergence. The authors showed that the update rule of the new Hopfield network is equivalent to the attention mechanism of the highly successful Transformer architecture.
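As a rough illustration of that equivalence, below is a minimal NumPy sketch (our own illustrative code, not code from the authors) of the continuous Hopfield update rule described in the paper, xi_new = X softmax(beta * X^T xi), where the columns of X are the stored patterns and beta is an inverse temperature. With queries, keys, and values obtained by linear projections and beta = 1/sqrt(d), the same expression takes the form of transformer attention, softmax(Q K^T / sqrt(d)) V. The toy dimensions and the retrieval demo are hypothetical choices for illustration only.

import numpy as np

def softmax(z):
    # Numerically stable softmax over a 1-D vector.
    z = z - z.max()
    e = np.exp(z)
    return e / e.sum()

def hopfield_update(xi, X, beta):
    # One step of the continuous modern Hopfield update:
    #   xi_new = X @ softmax(beta * X.T @ xi)
    # X: (d, N) matrix whose columns are the stored patterns,
    # xi: (d,) state (query) vector, beta: inverse temperature.
    return X @ softmax(beta * (X.T @ xi))

# Toy retrieval demo: store a few random patterns and query with a
# noisy copy of one of them; typically a single update suffices.
rng = np.random.default_rng(0)
d, N = 16, 5
X = rng.standard_normal((d, N))
query = X[:, 2] + 0.3 * rng.standard_normal(d)  # noisy version of pattern 2
retrieved = hopfield_update(query, X, beta=4.0)
print(np.argmax(X.T @ retrieved))  # typically 2, the index of the closest stored pattern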

John Hopfield and Dmitry Krotov then showed in their work “Large Associative Memory Problem in Neurobiology and Machine Learning” that dense associative memories and modern Hopfield networks can be linked to models with biologically plausible pairwise interactions between neurons.

Together, these recent hallmark studies are important for understanding Transformers and improving attention mechanisms, while linking these advanced architectures to biologically plausible neural networks.

We look forward to these pioneers of AI exploring the implications of their findings from theoretical, biological, and physics perspectives, and to a discussion of future research directions.
