A proposal for a new foundation for proof theory
We do not understand proofs well enough to answer simple questions, such as: when are two given proofs the same? There seems to be a consensus that developing some sort of topological or geometric representation of proofs is the way to proceed, and I agree with this view.
In this talk I will illustrate the deep inference research programme, which allows us to introduce topology into proof theory by bounding the complexity of inference steps by a constant. By making inference steps extremely small we achieve unprecedented freedom in the design of proof systems and in the manipulation of proofs. In particular, we obtain small canonical proofs, which seem to be the key to a successful proof semantics.
All this requires a simple but fundamental change to the way proofs are built, compared to standard Gentzen theory: proofs have to be freely composed by whatever logical relations are available in the language for which we want a proof theory; this is indeed the definition of deep inference. In Gentzen theory composition is not free in this sense, and that limitation is the source of much evil. Deep inference has a cost: the whole normalisation theory has to be rebuilt from scratch. Luckily, many of us have already done this over the past 15 years, and we now understand the normalisation theory of deep inference pretty well.
The talk will not require any special expertise in proof theory, and I will illustrate the above ideas with two examples: classical logic and a new, exotic logic called BV. We are currently developing BV in order to model self-dual non-commutativity, which seems to yield a good notion of time in process algebras, and perhaps even in physics.