
Universal Hopfield Networks: A General Framework for Single-Shot Associative Memory Models

Beren Millidge, Tommaso Salvatori, Yuhang Song, Thomas Lukasiewicz and Rafal Bogacz

Abstract

A large number of neural network models of associative (content-addressable) memory have been proposed in the literature, such as the classical Hopfield Network (HN) and Sparse Distributed Memory (SDM), and, more recently, the Modern Continuous Hopfield Network (MCHN), which has been shown to have close links with self-attention mechanisms in machine learning. In this paper, we propose a general framework for understanding the operation of such memory networks as a sequence of three operations: similarity, separation, and projection, and derive all of these networks as instances of this general framework with differing similarity and separation functions. We then extend the mathematical framework of Krotov and Hopfield (2020) to express general associative memory models using neural network dynamics with only second-order interactions between neurons, and derive a general energy function that is a Lyapunov function of the dynamics. Finally, using our framework, we theoretically and empirically investigate the capacity of these associative memory models under different similarity functions, beyond the typical dot-product similarity measure, and demonstrate empirically that Euclidean or Manhattan distance similarity metrics perform substantially better in practice on many tasks, enabling more robust retrieval and higher memory capacity than existing models.
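The similarity-separation-projection pipeline described in the abstract can be illustrated concretely. The following is a minimal NumPy sketch, not code from the paper: the function name `retrieve`, the choice of softmax as the separation function, the inverse temperature `beta`, and the toy data are all illustrative assumptions, and the projection matrix is here taken to be the stored-pattern matrix itself (auto-association).

```python
import numpy as np

def retrieve(M, q, similarity="euclidean", beta=5.0):
    """Single-shot retrieval as projection(separation(similarity(M, q))).

    M: (N, d) matrix of N stored patterns; q: (d,) query vector.
    """
    # Similarity: score the query against every stored pattern.
    if similarity == "dot":
        scores = M @ q                              # dot-product similarity (MCHN-style)
    elif similarity == "euclidean":
        scores = -np.linalg.norm(M - q, axis=1)     # negative distance as similarity
    elif similarity == "manhattan":
        scores = -np.abs(M - q).sum(axis=1)
    else:
        raise ValueError(f"unknown similarity: {similarity}")

    # Separation: a softmax sharpens the score distribution toward the best match.
    w = np.exp(beta * (scores - scores.max()))
    w /= w.sum()

    # Projection: weighted combination of the stored patterns (auto-associative case).
    return w @ M

# Illustrative usage: recover a stored pattern from a noisy cue.
rng = np.random.default_rng(0)
M = rng.normal(size=(10, 64))             # 10 stored 64-dimensional patterns
q = M[3] + 0.3 * rng.normal(size=64)      # corrupted version of pattern 3
out = retrieve(M, q, similarity="euclidean")
print(np.argmin(np.linalg.norm(M - out, axis=1)))  # -> 3
```

Swapping the `similarity` argument between "dot", "euclidean", and "manhattan" reproduces, in miniature, the comparison the abstract describes between the standard dot-product measure and the distance-based alternatives.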

Book Title
Proceedings of the 39th International Conference on Machine Learning, ICML 2022, Baltimore, Maryland, USA, 17-23 July 2022
Editor
Chaudhuri, Kamalika and Jegelka, Stefanie and Song, Le and Szepesvari, Csaba and Niu, Gang and Sabato, Sivan
Month
July
Pages
15561–15583
Publisher
PMLR
Series
Proceedings of Machine Learning Research
Volume
162
Year
2022