Predictive Coding Beyond Gaussian Distributions

Luca Pinchetti, Tommaso Salvatori, Yordan Yordanov, Beren Millidge, Yuhang Song, and Thomas Lukasiewicz

Abstract

A large body of recent research pursues the far-reaching goal of finding training methods for deep neural networks that can serve as alternatives to backpropagation (BP). A prominent example is predictive coding (PC), a neuroscience-inspired method that performs inference on hierarchical Gaussian generative models. These methods, however, fail to keep up with modern neural networks, as they are unable to replicate the dynamics of complex layers and activation functions. In this work, we solve this problem by generalizing PC to arbitrary probability distributions, enabling the training of architectures, such as transformers, that are hard to approximate with only Gaussian assumptions. We perform three experimental analyses. First, we study the gap between our method and the standard formulation of PC on multiple toy examples. Second, we test the reconstruction quality on variational autoencoders, where our method reaches the same reconstruction quality as BP. Third, we show that our method allows us to train transformer networks and achieve performance comparable to BP on conditional language models. More broadly, this method allows neuroscience-inspired learning to be applied to multiple domains, since the internal distributions can be flexibly adapted to the data, tasks, and architectures used.
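
To make the contrast concrete, the following is a minimal sketch, not the authors' implementation, of the idea stated above: in standard predictive coding each layer contributes a Gaussian squared prediction error to the energy, whereas the generalized formulation replaces it with a divergence between non-Gaussian distributions, illustrated here with a categorical (softmax) layer and a KL-divergence term. The two-layer setup, layer sizes, and all variable names are illustrative assumptions.

```python
# Illustrative sketch only: a two-layer predictive-coding energy in PyTorch,
# with one Gaussian term (standard PC) and one categorical KL term
# (non-Gaussian PC). Sizes and names are assumptions, not the paper's code.
import torch
import torch.nn.functional as F

def gaussian_energy(x, mu):
    # Standard PC layer energy: squared prediction error, i.e. the negative
    # log-likelihood of a unit-variance Gaussian centred on the prediction mu.
    return 0.5 * ((x - mu) ** 2).sum()

def categorical_kl_energy(pred_logits, target_logits):
    # Generalized PC layer energy: KL divergence between the clamped target
    # distribution and the distribution predicted from the latent state,
    # both categorical (softmax) rather than Gaussian.
    log_p = F.log_softmax(pred_logits, dim=-1)
    log_q = F.log_softmax(target_logits, dim=-1)
    return F.kl_div(log_p, log_q, log_target=True, reduction="sum")  # KL(q || p)

torch.manual_seed(0)
W = torch.randn(8, 8)                       # fixed top-down weights (illustrative)
x_top = torch.randn(1, 8)                   # clamped top-layer state
target_logits = torch.randn(1, 8)           # clamped observation, given as logits
x = torch.zeros(1, 8, requires_grad=True)   # latent state relaxed during inference

# Inference phase: relax x to minimize the total energy; a full implementation
# would afterwards update the weights using the relaxed states.
opt = torch.optim.SGD([x], lr=0.1)
for _ in range(100):
    opt.zero_grad()
    mu = torch.tanh(x_top) @ W              # top-down prediction of the latent state
    energy = gaussian_energy(x, mu) + categorical_kl_energy(x, target_logits)
    energy.backward()
    opt.step()
```

The only change relative to standard PC in this sketch is the per-layer energy: the Gaussian squared-error term is kept for the hidden layer, while the output layer uses a KL divergence between categorical distributions, which is the kind of flexibility the abstract describes.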

Book Title
Proceedings of the 36th Annual Conference on Neural Information Processing Systems, NeurIPS 2022, New Orleans, Louisiana, USA
Month
November
Pages
1280–1293
Year
2022