
Techniques to solve computationally hard problems in automata theory

Richard Mayr (University of Edinburgh, currently on sabbatical at Oxford)
Consider nondeterministic finite automata (NFA) and nondeterministic Büchi automata (NBA), which describe regular languages and omega-regular languages, respectively. Central problems for these automata are computationally hard (PSPACE-complete), e.g., checking language inclusion, equivalence, and universality, as well as finding automata of minimal size for a given language.
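To make the hardness concrete, here is a minimal Python sketch (not part of the talk; the dict-based NFA encoding and the function name are illustrative assumptions) of the textbook universality check: determinize the NFA on the fly via the subset construction and search for a reachable macrostate that contains no accepting state. Since exponentially many macrostates may have to be explored, this naive approach quickly becomes infeasible, which is what the techniques surveyed below try to avoid.

    from collections import deque

    def is_universal_naive(alphabet, delta, init, accepting):
        """Universality via explicit subset construction (illustrative sketch).
        delta maps (state, letter) to a set of successor states."""
        accepting = frozenset(accepting)
        start = frozenset(init)
        seen = {start}
        queue = deque([start])
        while queue:
            macro = queue.popleft()
            # A reachable macrostate with no accepting state rejects some word,
            # so the automaton is not universal.
            if not (macro & accepting):
                return False
            for a in alphabet:
                succ = frozenset(q for s in macro for q in delta.get((s, a), ()))
                if succ not in seen:   # may enumerate exponentially many macrostates
                    seen.add(succ)
                    queue.append(succ)
        return True

    # Hypothetical example: one-letter NFA with transition 0 -a-> {0, 1},
    # initial state 0, accepting state 1. It rejects the empty word, so it
    # is not universal.
    print(is_universal_naive({"a"}, {(0, "a"): {0, 1}}, {0}, {1}))   # False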

In spite of the high worst-case complexity, recent algorithms and software tools can solve many instances of nontrivial size. Here we give an overview of these techniques. They include antichain techniques exploiting logical subsumption, the construction of congruence bases, and automata minimization methods based on transition pruning and state-space quotienting w.r.t. generalized simulation preorders. In particular, multipebble simulations and the more practical lookahead simulations can compute, in polynomial time, very good under-approximations of the PSPACE-complete (trace-)language inclusion relations. A collection of current papers and tools is available at www.languageinclusion.org.
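As a flavour of the antichain idea, the following Python sketch (a simplified illustration under the same assumed NFA encoding as above, not the algorithm of the referenced tools) checks NFA universality while storing only subset-minimal macrostates: if S is a subset of T, every word rejected from T is also rejected from S, so the larger macrostate T is subsumed and can be discarded.

    def is_universal_antichain(alphabet, delta, init, accepting):
        """Universality check with antichain-style subsumption (illustrative sketch).
        delta maps (state, letter) to a set of successor states."""
        accepting = frozenset(accepting)

        def post(macro, letter):
            return frozenset(q for s in macro for q in delta.get((s, letter), ()))

        def insert(antichain, macro):
            # Subsumption: keep only subset-minimal macrostates.
            if any(m <= macro for m in antichain):
                return antichain, False              # macro is subsumed, discard it
            return [m for m in antichain if not (macro <= m)] + [macro], True

        antichain = [frozenset(init)]
        frontier = [frozenset(init)]
        while frontier:
            macro = frontier.pop()
            if not (macro & accepting):
                return False                         # a rejecting macrostate is reachable
            for a in alphabet:
                succ = post(macro, a)
                antichain, added = insert(antichain, succ)
                if added:
                    frontier.append(succ)
        return True

Compared with the naive subset construction above, the search is the same except that subsumed macrostates are never stored or expanded; simulation preorders can be used to strengthen the subsumption test further, which is the idea behind combining simulation with antichains in the references below.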

Selected references:

Lorenzo Clemente and Richard Mayr. Advanced Automata Minimization. POPL 2013. http://arxiv.org/abs/1210.6624

Richard Mayr, Parosh Aziz Abdulla, Yu-Fang Chen, Lorenzo Clemente, Lukáš Holík, Chih-Duo Hong, and Tomáš Vojnar. Advanced Ramsey-based Büchi Automata Inclusion Testing. CONCUR 2011, volume 6901 of LNCS, 2011.

Filippo Bonchi and Damien Pous. Checking NFA equivalence with bisimulations up to congruence. POPL 2013.

Parosh Aziz Abdulla, Yu-Fang Chen, Lukáš Holík, Richard Mayr, and Tomáš Vojnar. When Simulation Meets Antichains (On Checking Language Inclusion of NFAs). In Proc. of 16th International Conference on Tools and Algorithms for the Construction and Analysis of Systems (TACAS'10), Paphos, Cyprus, volume 6015 of LNCS, pages 158–174, 2010.
