A Theory of AI
Hopfield networks are associative memory models used to store and retrieve patterns. Classical Hopfield networks operate on binary patterns and have limited storage capacity. Modern Hopfield networks significantly improve on these properties: discrete variants raise the storage capacity, and continuous variants generalize from binary to continuous patterns while offering exponential storage capacity and fast (typically one-step) convergence. In the continuous model, the update rule is equivalent to the attention mechanism of the Transformer architecture. As a result, a Hopfield layer can be integrated into deep learning architectures and used in place of pooling, LSTM, and attention layers, among others.
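The continuous update rule can be written as xi_new = X softmax(beta X^T xi), which has the same form as Transformer attention, softmax(Q K^T / sqrt(d)) V, with the stored patterns playing the role of keys and values. A minimal NumPy sketch (the function names and the choice of beta are illustrative, not from the original text):

```python
import numpy as np

def softmax(z):
    # Numerically stable softmax over a 1-D array.
    z = z - z.max()
    e = np.exp(z)
    return e / e.sum()

def hopfield_update(X, xi, beta=8.0):
    """One update step of a continuous modern Hopfield network.

    X    : (d, N) matrix whose columns are the N stored patterns.
    xi   : (d,) state (query) vector.
    beta : inverse temperature; larger beta sharpens retrieval.

    xi_new = X softmax(beta * X^T xi) -- structurally the same as
    attention: softmax(query . keys) weighting the values.
    """
    return X @ softmax(beta * (X.T @ xi))

# Usage: store two random patterns, then retrieve from a noisy query.
rng = np.random.default_rng(0)
X = rng.standard_normal((16, 2))          # two stored patterns
query = X[:, 0] + 0.1 * rng.standard_normal(16)  # noisy copy of pattern 0
retrieved = hopfield_update(X, query)     # should move toward pattern 0
```

With a sufficiently large beta, one update already places the state close to the nearest stored pattern, which is the "fast convergence" property noted above.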