# Entropies on test spaces

Jonathan Barrett (University of Bristol)

- 14:00, 29th January 2010 (week 2, Hilary Term 2010), Lecture Theatre B

We have known for a long time how to construct models which are more general than the classical and quantum theories, but which include both of these as special cases. More recently, following the success of quantum information theory, there has been quite a lot of work investigating information processing in these alternative models. The aim is to understand quantum theory better: are there, for example, alternative models which are much more powerful than quantum theory for some task or other? If so, is there a deep reason why Nature would choose quantum theory rather than the alternatives, thus denying us this power?

In my talk I will describe some particular results that lie within this programme. Using the Randall-Foulis notion of a "test space", I will introduce several different definitions of the entropy of a state. The classical and quantum theories are remarkable in that these definitions coincide, giving the Shannon and von Neumann entropies. I will show that in general they do not coincide, and that the assumption that they do is a strong constraint on theories. If time permits, I will explain a connection with Pawlowski et al.'s results on "information causality", and speculate about whether general models admit a thermodynamic entropy.

See arXiv:0909.5075
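The coincidence mentioned in the abstract can be checked numerically in the quantum case: the von Neumann entropy of a density matrix equals the Shannon entropy of its eigenvalue spectrum, so for a state diagonal in some basis (i.e. an effectively classical state) the two definitions agree. A minimal sketch (illustrative only, not from the talk; the function names are my own):

```python
import numpy as np

def shannon_entropy(p):
    """Shannon entropy H(p) = -sum_i p_i log2 p_i of a probability vector."""
    p = np.asarray(p, dtype=float)
    p = p[p > 0]  # convention: 0 log 0 = 0
    return float(-np.sum(p * np.log2(p)))

def von_neumann_entropy(rho):
    """Von Neumann entropy S(rho) = -Tr(rho log2 rho), via the spectrum."""
    eigs = np.linalg.eigvalsh(rho)
    return shannon_entropy(eigs)

# A classical (diagonal) state: the two entropies coincide.
p = [0.5, 0.25, 0.25]
rho = np.diag(p)
print(shannon_entropy(p))          # 1.5
print(von_neumann_entropy(rho))    # 1.5
```

In more general probabilistic models there is no analogue of this spectral decomposition, which is one reason the various entropy definitions can come apart.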
