Modern Hopfield networks

20 July 2021
18:00 to 19:15 CET

We propose a new paradigm for deep learning by equipping each layer of a deep-learning architecture with modern Hopfield networks. The new paradigm provides functionalities such as pooling, memory, and attention for each layer. Recently, Hopfield networks have seen a renaissance: modern variants have tremendously increased storage capacity and converge in one update step, while global convergence to a local minimum of the energy is guaranteed. Surprisingly, the transformer attention mechanism is equivalent to the update rule of modern Hopfield networks. Within the layers of deep learning architectures, they allow the storage of, and access to, raw input data, intermediate results, reference data, or learned prototypes. These Hopfield layers enable new ways of deep learning and provide pooling, memory, nearest-neighbour, set-association, and attention mechanisms. We apply deep networks with Hopfield layers to various domains, where they improve the state of the art on different tasks and numerous benchmarks.
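To make the attention connection concrete, here is a minimal NumPy sketch of the continuous modern Hopfield update, xi ← X softmax(beta · Xᵀ xi), where X stores the patterns as columns. The function name and the inverse-temperature parameter `beta` are illustrative choices, not from the talk itself; with one update step this is exactly a single attention lookup over the stored patterns.

```python
import numpy as np

def hopfield_retrieve(patterns, query, beta=8.0, steps=1):
    """Retrieve a stored pattern with the modern (continuous) Hopfield update.

    patterns: (d, N) matrix X whose columns are the stored patterns
    query:    (d,) state vector xi
    Update rule: xi <- X @ softmax(beta * X.T @ xi)
    """
    X = np.asarray(patterns, dtype=float)
    xi = np.asarray(query, dtype=float)
    for _ in range(steps):
        scores = beta * X.T @ xi
        scores -= scores.max()      # stabilize the softmax numerically
        weights = np.exp(scores)
        weights /= weights.sum()
        xi = X @ weights            # convex combination of stored patterns
    return xi

# Store three random patterns, then retrieve the first from a noisy query.
rng = np.random.default_rng(0)
X = rng.standard_normal((16, 3))
noisy = X[:, 0] + 0.1 * rng.standard_normal(16)
retrieved = hopfield_retrieve(X, noisy)
```

For well-separated patterns and a sufficiently large `beta`, a single step already lands essentially on the stored pattern, which is the "convergence in one update step" behaviour mentioned above; small `beta` instead yields a soft mixture of patterns, as in transformer attention.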

Dr Sepp Hochreiter is a pioneer in the field of Artificial Intelligence (AI). He was the first to identify the key obstacle to Deep Learning, the vanishing gradient problem, and then discovered a general approach to address this challenge with the long short-term memory (LSTM) architecture. He thus became a founding father of modern Deep Learning and AI.

Sepp Hochreiter is a founding director of IARAI, a professor at Johannes Kepler University Linz and a recipient of the 2020 IEEE Neural Networks Pioneer Award.

In the recent groundbreaking paper “Hopfield Networks is All You Need”, Sepp Hochreiter’s team introduced a new modern Hopfield network with continuous states that can store exponentially many patterns and converges very fast.


