Recurrent predictive coding models for associative memory employing covariance learning

Mufeng Tang, Tommaso Salvatori, Beren Millidge, Yuhang Song, Thomas Lukasiewicz and Rafal Bogacz

Abstract

The computational principles adopted by the hippocampus in associative memory (AM) tasks have been one of the most studied topics in computational and theoretical neuroscience. Classical models of the hippocampal network assume that AM is performed via a form of covariance learning, where associations between memorized items are represented by entries in the learned covariance matrix encoded in the recurrent connections of the hippocampal subfield CA3. On the other hand, it has recently been proposed that AM in the hippocampus is achieved through predictive coding. Hierarchical predictive coding models following this theory perform AM, but fail to capture the recurrent hippocampal structure that encodes the covariance in the classical models. Such a dichotomy poses potential difficulties for developing a unified theory of how memory is formed and recalled in the hippocampus. Earlier predictive coding models that learn the covariance information of inputs explicitly seem to offer a solution to this dichotomy. Here, we show that although these models can perform AM, they do so in a biologically implausible and numerically unstable way. Instead, we propose alternatives to these earlier covariance-learning predictive coding networks, which learn the covariance information implicitly and plausibly, and can use dendritic structures to encode prediction errors. We show analytically that our proposed models are exactly equivalent to the earlier predictive coding model that learns covariance explicitly, and they encounter no numerical issues when performing AM tasks in practice. We further show that our models can be combined with hierarchical predictive coding networks to model hippocampo-neocortical interactions. Our models provide a biologically plausible approach to modelling the hippocampal network, pointing to a potential computational mechanism employed by the hippocampus during memory formation and recall, which unifies predictive coding and covariance learning based on the recurrent network structure.

Author summary

The hippocampus and adjacent cortical areas have long been considered essential for the formation of associative memories. Earlier theoretical works have assumed that the hippocampus stores in its recurrent connections the statistical regularities embedded in sensory inputs. On the other hand, it has recently been suggested that the hippocampus retrieves memories by generating predictions of ongoing sensory inputs. Computational models have thus been proposed to account for this predictive nature of the hippocampal network using predictive coding, a general theory of information processing in the cortex. However, these hierarchical predictive coding models of the hippocampus did not describe how it stores the statistical regularities that play a key role in associative memory in the classical hippocampal models, hindering a unified understanding of the computational principles employed by the hippocampus. To address this dichotomy, here we present a family of predictive coding models that also learn the statistical information needed for associative memory. Our models can stably perform associative memory tasks in a biologically plausible manner, even with large structured data such as natural scenes. Our work suggests a possible mechanism by which the recurrent hippocampal network may employ various computational principles concurrently to perform associative memory.

Competing Interest Statement

YS, BM and RB are shareholders in Neu Edge, which designs AI accelerator hardware.
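
To make the two ideas contrasted above concrete, the following is a minimal NumPy sketch, not the models proposed in the paper: a recurrent weight matrix stores the covariance (mean-centered outer products) of memorized patterns, and a corrupted pattern is recalled either by classical Hopfield-style sign updates or by a predictive-coding-flavoured relaxation that descends the squared error between each unit's state and the recurrent prediction it receives. The network size, number of patterns, learning rule details, and step sizes are all illustrative assumptions.

```python
import numpy as np

# Minimal sketch (not the paper's models) of covariance-based storage
# in recurrent weights, with recall by either Hopfield-style sign
# updates or predictive-coding-style relaxation on prediction errors.
# Sizes and rates are arbitrary.

rng = np.random.default_rng(0)

N, P = 64, 5                                    # units, stored patterns
patterns = rng.choice([-1.0, 1.0], size=(P, N))

# Covariance-style Hebbian storage: outer products of mean-centered
# patterns, with self-connections removed.
centered = patterns - patterns.mean(axis=0)
W = centered.T @ centered / N
np.fill_diagonal(W, 0.0)

def recall_sign(cue, steps=20):
    """Classical recurrent recall: each unit adopts the sign of the
    prediction W @ x it receives from the other units."""
    x = cue.copy()
    for _ in range(steps):
        x = np.where(W @ x >= 0.0, 1.0, -1.0)
    return x

def recall_pc(cue, steps=200, lr=0.05):
    """Predictive-coding-flavoured recall: relax continuous activities
    by gradient descent on the energy 0.5 * ||x - W @ x||^2, i.e. the
    squared error between each unit's state and its recurrent
    prediction, then binarize."""
    x = cue.copy()
    for _ in range(steps):
        e = x - W @ x               # prediction errors
        x -= lr * (e - W.T @ e)     # gradient of the energy w.r.t. x
        x = np.clip(x, -1.0, 1.0)
    return np.sign(x)

# Corrupt one stored pattern and check that both schemes repair it.
cue = patterns[0].copy()
flipped = rng.choice(N, size=N // 8, replace=False)
cue[flipped] *= -1.0
for recall in (recall_sign, recall_pc):
    overlap = recall(cue) @ patterns[0] / N
    print(recall.__name__, "overlap with stored pattern:", overlap)
```

Note that the relaxation in recall_pc only converges for a sufficiently small step size, which hints at the kind of numerical sensitivity the abstract attributes to predictive coding networks that learn covariance explicitly.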

Journal
PLOS Computational Biology
Month
April
Number
4
Pages
e1010719
Volume
19
Year
2023