A Game-Based Abstraction-Refinement Framework for Markov Decision Processes
Joint work with Mark Kattenbelt, Gethin Norman and David Parker.
Probabilistic model checking is an algorithmic method for establishing the correctness of probabilistic systems, for example, randomised algorithms or software that exhibits failures, against specifications given in probabilistic temporal logic. In the field of model checking, abstraction-refinement frameworks have been introduced to automate the process of computing abstractions that are precise enough to validate or refute a given property. In this talk, we present an abstraction-refinement framework for Markov decision processes (MDPs), a widely used model of probabilistic distributed computation. Our framework comprises an abstraction approach based on stochastic two-player games and an efficient algorithm for the abstraction-refinement loop. The key idea behind the abstraction approach is to maintain a separation between the nondeterminism present in the original MDP and the nondeterminism introduced during the abstraction process, each type being represented by a different player in the game. Crucially, this allows lower and upper bounds to be computed for the values of probabilistic reachability properties of the original MDP. These bounds give a quantitative measure of the quality of the abstraction and form the basis of the corresponding refinement. We apply these techniques in conjunction with SAT-based abstraction to implement a verifier for ANSI-C programs that exhibit probabilistic behaviour. Our techniques target quantitative properties such as "the maximum probability of file-transfer failure" or "the minimum expected number of loop iterations" and have been applied to real networking software.
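To make the separation of the two kinds of nondeterminism concrete, the following is a minimal sketch (not the talk's actual implementation) of how lower and upper bounds on a maximum reachability probability can be obtained from a stochastic two-player game by value iteration. The toy game, its states `s0`, `s1`, `s2`, and the encoding (player-1 choices are lists of player-2 options, each option a probability distribution) are all hypothetical assumptions for illustration. For the lower bound the abstraction player resolves its choices adversarially (minimising), while for the upper bound it cooperates (maximising); the MDP's own nondeterminism maximises in both cases.

```python
# Hypothetical toy abstraction as a stochastic two-player game.
# game[s] is a list of player-1 (abstraction) choices; each choice is a
# list of player-2 (original MDP) options; each option is a probability
# distribution over successor states, written as {state: probability}.
game = {
    "s0": [
        [  # player-1 choice A
            {"s1": 0.5, "s2": 0.5},  # player-2 option
            {"s2": 1.0},             # another player-2 option
        ],
        [  # player-1 choice B
            {"s1": 0.9, "s2": 0.1},
        ],
    ],
    "s1": [[{"s1": 1.0}]],  # target state (absorbing)
    "s2": [[{"s2": 1.0}]],  # non-target sink (absorbing)
}
target = {"s1"}

def bounds_max_reach(game, target, iters=100):
    """Value iteration giving lower/upper bounds on the maximum
    probability of reaching `target` in the abstracted MDP."""
    lo = {s: (1.0 if s in target else 0.0) for s in game}
    hi = dict(lo)
    for _ in range(iters):
        for s in game:
            if s in target:
                continue  # target states keep value 1
            ev = lambda v, dist: sum(p * v[t] for t, p in dist.items())
            # Player 2 (MDP nondeterminism) maximises in both bounds;
            # player 1 (abstraction nondeterminism) minimises for the
            # lower bound and maximises for the upper bound.
            lo[s] = min(max(ev(lo, d) for d in choice) for choice in game[s])
            hi[s] = max(max(ev(hi, d) for d in choice) for choice in game[s])
    return lo, hi

lo, hi = bounds_max_reach(game, target)
print(lo["s0"], hi["s0"])  # → 0.5 0.9
```

Here the gap between `lo["s0"] = 0.5` and `hi["s0"] = 0.9` is exactly the quantitative measure of abstraction quality mentioned above: a wide interval signals that the abstraction is too coarse and should be refined, while matching bounds mean the abstraction is precise for this property.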