Quantitative Verification: From Model Checking to Model Measuring
Enabling engineers, programmers, and researchers to automatically verify the correctness of the computer systems that they design is one of the Grand Challenges of computing research. The scientific and economic importance of this goal has long been recognised, yet, despite substantial progress in basic and applied research over the last few decades, much work remains ahead of us. With this Fellowship I would like to address the issue of quantitative verification and analysis of real-time and probabilistic systems, from the development of novel and fundamental algorithms all the way to the design and implementation of tools.
The field of computer-aided formal verification is concerned, broadly speaking, with ensuring that computer systems function as they were intended to. In many instances, the correctness of such systems is critical, with failures potentially incurring disastrous financial consequences or even loss of human life. Well-known catastrophic failures in recent years include the 1996 explosion of the Ariane 5 rocket on its maiden flight, the 1999 loss of the Mars Polar Lander as it attempted touch-down, and the 2003 Northeast power blackout, costing several billion dollars. In each case, the culprit was a bug in the software, which investigators subsequently discovered could have been rooted out by using computer-aided verification technology.
One of the key approaches to verification is model checking, pioneered nearly three decades ago by Clarke, Emerson, Queille, Sifakis, and Pnueli, for which Clarke, Emerson, and Sifakis won the 2007 Turing Award and Pnueli the 1996 Turing Award. In this framework, a finite-state model of the system under consideration is constructed, usually in an automated or semi-automated manner, and its specifications are expressed using formulas in a suitable temporal logic. Model checking then involves determining whether the model satisfies the formulas, which is achieved using a sophisticated mix of abstraction techniques, symbolic representations, and graph-theoretic algorithms. The fundamental challenge in this approach is the state space explosion problem, whereby the state space of the systems to be explored can easily turn out to be astronomical. Model checking has nonetheless proved hugely valuable to the hardware industry, and more recently has become increasingly successful in software development. Industrial users of model checking include Intel, Motorola, Microsoft, Airbus, NASA, and many others.
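To make the framework above concrete, the following is a minimal, purely illustrative sketch (not any particular tool's algorithm) of explicit-state checking of an invariant — the simplest kind of temporal property — by breadth-first exploration of a finite-state model; the model, the `successors` function, and the `invariant` predicate are all hypothetical toy examples:

```python
from collections import deque

def check_invariant(initial, successors, invariant):
    """Explore every state reachable from `initial` and return a shortest
    counterexample path to a state violating `invariant`, or None if the
    invariant holds in all reachable states."""
    parent = {initial: None}   # discovered states, with BFS predecessors
    queue = deque([initial])
    while queue:
        state = queue.popleft()
        if not invariant(state):
            # Reconstruct the counterexample path back to the initial state.
            path = []
            while state is not None:
                path.append(state)
                state = parent[state]
            return list(reversed(path))
        for nxt in successors(state):
            if nxt not in parent:
                parent[nxt] = state
                queue.append(nxt)
    return None

# Toy model: a counter modulo 8 that may step by 1 or 2, with the
# (violated) invariant that state 5 is never reached.
trace = check_invariant(0,
                        lambda s: [(s + 1) % 8, (s + 2) % 8],
                        lambda s: s != 5)
# trace is the shortest counterexample [0, 1, 3, 5]
```

Even this toy hints at the state space explosion problem: a system composed of n such counters has 8^n states, which is why industrial-strength tools rely on the abstraction and symbolic techniques mentioned above rather than enumerating states explicitly.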
While traditional model checking has focussed on qualitative representations and properties of systems, there is an urgent need for quantitative verification technology, and specifically for frameworks able to accurately handle real-time and probabilistic systems. Such systems are ubiquitous, occurring in cars, aeroplanes, railway controllers, power plants, medical equipment, the Internet, etc. Research in quantitative verification has made great strides over the last two decades, yet many fundamental challenges remain. Building on recent breakthroughs in real-time and probabilistic model checking, I believe we are now ideally poised to tackle many of these challenges.
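As a flavour of what "quantitative" means here, in contrast to the yes/no answers of qualitative model checking, the following sketch computes a numerical answer — the probability of eventually reaching a target state in a finite Markov chain — by a simple fixpoint iteration; the protocol being modelled and all state names are invented for illustration:

```python
def reachability_probabilities(transitions, target, iterations=1000):
    """For each state of a finite Markov chain, iteratively approximate
    the probability of eventually reaching the `target` set (a fixpoint
    equation, here solved by value iteration rather than exactly)."""
    states = list(transitions)
    prob = {s: 1.0 if s in target else 0.0 for s in states}
    for _ in range(iterations):
        for s in states:
            if s not in target:
                prob[s] = sum(p * prob[t] for t, p in transitions[s].items())
    return prob

# Toy model of a lossy channel: from 'try', a message is delivered with
# probability 0.8, permanently lost with probability 0.1, and retried
# with probability 0.1; 'done' and 'lost' are absorbing.
chain = {
    "try":  {"done": 0.8, "lost": 0.1, "try": 0.1},
    "done": {"done": 1.0},
    "lost": {"lost": 1.0},
}
probs = reachability_probabilities(chain, {"done"})
# probs["try"] converges to 8/9: the quantitative analogue of a
# yes/no "is 'done' reachable?" query.
```

Real probabilistic model checkers answer such queries against models with millions of states, and real-time verification poses analogous quantitative questions about clocks and deadlines rather than probabilities.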