FSCD 2017 accepted papers

Jasmin Christian Blanchette, Mathias Fleury and Dmitriy Traytel. Nested Multisets, Hereditary Multisets, and Syntactic Ordinals in Isabelle/HOL
Abstract: We present a collection of formalized results pertaining to (finite) nested multisets, developed using the Isabelle/HOL proof assistant. The nested multiset order is a generalization of the multiset order that can be used to prove termination of processes. Hereditary multisets, a variant of nested multisets, offer a convenient representation of ordinals below $\epsilon_0$. In Isabelle, both nested and hereditary multisets can be comfortably defined as inductive datatypes. Our formal library also provides, somewhat nonstandardly, multisets with negative multiplicities and ordinals with negative coefficients. We present some applications of the library to formalizations of Goodstein's theorem and the decidability of unary PCF (Programming Computable Functions).
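As a rough illustration of the hereditary-multiset encoding of ordinals mentioned above, here is a hypothetical Haskell sketch (names ours; the paper's actual development is an inductive datatype in Isabelle/HOL):

```haskell
-- Hypothetical sketch: a hereditary multiset is a finite multiset
-- whose elements are again hereditary multisets (a plain list
-- stands in for the multiset here).
newtype HMSet = HMSet [HMSet] deriving (Eq, Show)

-- Reading  HMSet [a1, ..., an]  as the Cantor normal form
-- omega^a1 + ... + omega^an (with a1 >= ... >= an) yields exactly
-- the ordinals below epsilon_0.
zero, one, omega :: HMSet
zero  = HMSet []       -- the ordinal 0
one   = HMSet [zero]   -- omega^0 = 1
omega = HMSet [one]    -- omega^1 = omega
```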
Benedetto Intrigila, Giulio Manzonetto and Andrew Polonsky. Refutation of Sallé's Longstanding Conjecture
Abstract: The lambda-calculus possesses a strong notion of extensionality, called "the omega-rule", which has been the subject of many investigations. It is a longstanding open problem whether the equivalence obtained by closing the theory of Böhm trees under the omega-rule is strictly included in Morris's original observational theory, as conjectured by Sallé in the eighties.

In a recent work, Breuvart et al. have shown that Morris's theory satisfies the omega-rule. In this paper we demonstrate that the two aforementioned theories actually coincide, thus disproving Sallé's conjecture.
Andrew Pitts and Ian Orton. Models of Type Theory Based on Moore Paths
Abstract: This paper introduces a new family of models of intensional Martin-Löf type theory. We use constructive ordered algebra in toposes. Identity types in the models are given by a notion of Moore path. By considering a particular gros topos, we show that there is such a model that is non-truncated, i.e. contains non-trivial structure at all dimensions; in other words, in this model a type in a sequence of iterated identity types can contain more than one element, no matter how far along the sequence one goes. Although inspired by existing non-truncated models of type theory based on simplicial and on cubical sets, the notion of model presented here is notable for avoiding any form of Kan filling condition in the semantics of types.
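For orientation: a (topological) Moore path from $x$ to $y$ in a space $X$ is a pair $(r, f)$ with $r \geq 0$ and $f : [0, r] \to X$ such that $f(0) = x$ and $f(r) = y$. Concatenation adds lengths, so composition of Moore paths is strictly associative and unital, which is what makes them a convenient basis for identity types. (This classical picture is only a guide; the paper works with a constructive ordered-algebraic analogue in a topos.)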
Łukasz Czajka. Confluence of an extension of Combinatory Logic by Boolean constants
Abstract: We show confluence of a conditional term rewriting system CL-pc1, which is an extension of Combinatory Logic by Boolean constants. This solves problem 15 from the RTA list of open problems. The proof has been fully formalized in the Coq proof assistant.
Noam Zeilberger. A sequent calculus for semi-associativity
Abstract: We introduce a sequent calculus with a simple restriction of Lambek's product rules that precisely captures the classical Tamari order, i.e., the partial order on fully-bracketed words (equivalently, binary trees) induced by a semi-associative law (equivalently, tree rotation). We establish a focusing property for this sequent calculus (a strengthening of cut-elimination), which yields the following coherence theorem: every valid entailment in the Tamari order has exactly one focused derivation. One combinatorial application of the coherence theorem is a new proof of the Tutte-Chapoton formula for the number of intervals in the Tamari lattice $Y_n$. Elsewhere, we have also used the sequent calculus and the coherence theorem to build a surprising bijection between intervals of the Tamari order and a certain natural fragment of linear lambda calculus, consisting of the $\beta$-normal planar lambda terms with no closed proper subterms.
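The Tutte-Chapoton formula mentioned above counts $\frac{2(4n+1)!}{(n+1)!(3n+2)!}$ intervals in $Y_n$. A minimal Haskell sketch (ours, not the paper's sequent calculus) of the rotation step that generates the Tamari order:

```haskell
-- Binary trees standing for fully bracketed words.  One rotation
-- step (xy)z ~> x(yz) is the semi-associative law; the Tamari
-- order is its reflexive-transitive closure.
data Tree = Leaf | Node Tree Tree deriving (Eq, Show)

rotations :: Tree -> [Tree]
rotations Leaf = []
rotations (Node l r) =
     [ Node a (Node b r) | Node a b <- [l] ]  -- rotate at the root
  ++ [ Node l' r | l' <- rotations l ]        -- rotate inside l
  ++ [ Node l r' | r' <- rotations r ]        -- rotate inside r

-- e.g. rotations (Node (Node Leaf Leaf) Leaf)
--        == [Node Leaf (Node Leaf Leaf)]
```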
Małgorzata Biernacka, Witold Charatonik and Klara Zielińska. Generalized Refocusing: from Hybrid Strategies to Abstract Machines
Abstract: We present a generalization of the refocusing procedure that provides a generic method for deriving an abstract machine from a specification of a reduction semantics satisfying simple initial conditions. The proposed generalization is applicable to a class of reduction semantics encoding hybrid strategies as well as uniform strategies handled by the original refocusing method. The resulting machine is proved to correctly trace (i.e., bisimulate in smaller steps) the input reduction semantics. The procedure and the correctness proofs have been formalized in the Coq proof assistant.
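To convey the flavour of refocusing on a toy language (an assumption-laden sketch with names of our choosing, not the paper's generalized procedure): rather than plugging a contractum back into its context and re-decomposing the whole term from the root, the derived machine resumes decomposition at the contraction site.

```haskell
-- Toy refocused evaluator for left-to-right addition.  'refocus'
-- and 'continue' together form the abstract machine obtained from
-- the reduction semantics: after contracting (m + n), evaluation
-- continues in place instead of re-traversing the whole term.
data Term = Lit Int | Add Term Term
data Ctx  = Hole | AddL Ctx Term | AddR Int Ctx

refocus :: Term -> Ctx -> Int
refocus (Lit n)   c = continue c n
refocus (Add l r) c = refocus l (AddL c r)

continue :: Ctx -> Int -> Int
continue Hole       n = n
continue (AddL c r) n = refocus r (AddR n c)
continue (AddR m c) n = continue c (m + n)  -- contract, resume in place
```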
Lutz Strassburger. Combinatorial Flows and their Normalisation
Abstract: This paper introduces combinatorial flows, which generalize combinatorial proofs by incorporating cut and substitution as methods of proof compression. We give a normalization procedure for combinatorial flows and show how syntactic proofs are translated into combinatorial flows and vice versa.
Marcelo Fiore and Philip Saville. List Objects with Algebraic Structure
Abstract: We introduce and study the notion of list object with algebraic structure. The first key aspect of our development is that the notion of list object is considered in the context of monoidal structure; the second key aspect is that we further equip list objects with algebraic structure in this setting. Within our framework, we observe that list objects give rise to free monoids and further show that this remains so in the presence of algebraic structure. Overall, thus, we provide a basic theory for describing free monoids with suitably compatible algebraic structure. This theory is accompanied with the study of two technical themes that, besides being of interest in their own right, are important for establishing applications. These themes are: parametrised initiality, central to the universal property defining list objects; and approaches to algebraic structure, in particular in the context of monoidal theories. The latter leads naturally to a notion of nsr (or near semiring) category of independent interest. With the theoretical development in place, we touch upon a variety of applications, considering Natural Numbers Objects in domain theory, giving a universal property for the monadic list transformer, providing free instances of algebraic extensions of the Haskell Monad type class, elucidating the algebraic character of the construction of opetopes in higher-dimensional algebra, and considering free models of second-order algebraic theories.
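The basic instance of the theory is familiar from functional programming: lists over a type are the free monoid on it, and any map into a monoid extends uniquely to a monoid homomorphism. A small Haskell reminder (our example) of this universal property, which list objects and their algebraic refinements generalise:

```haskell
import Data.Monoid (Sum (..))

-- Any f :: a -> m into a monoid m extends uniquely to a monoid
-- homomorphism [a] -> m sending the singleton [x] to f x.
foldMapList :: Monoid m => (a -> m) -> [a] -> m
foldMapList f = foldr (\x acc -> f x <> acc) mempty

-- e.g. getSum (foldMapList Sum [1, 2, 3]) == 6
```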
Ryota Suzuki, Koichi Fujima, Naoki Kobayashi and Takeshi Tsukada. Streett Automata Model Checking of Higher-Order Recursion Schemes
Abstract: We propose a practical algorithm for Streett automata model checking of higher-order recursion schemes (HORS), which checks whether the tree generated by a given HORS is accepted by a given Streett automaton. The Streett automata model checking of HORS is useful in the context of liveness verification of higher-order functional programs. The previous approach to Streett automata model checking converted Streett automata to parity automata and then invoked a parity tree automata model checker. We show through experiments that our direct approach outperforms the previous approach. Besides being able to directly deal with Streett automata, our algorithm is the first practical Streett or parity automata model checking algorithm that runs in time polynomial in the size of HORS, assuming that the other parameters are fixed. Previous fixed-parameter polynomial time algorithms for HORS could only deal with the class of trivial tree automata. We have confirmed through experiments that (a parity automata version of) our model checker outperforms previous parity automata model checkers for HORS.
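Recall the acceptance condition involved: a Streett automaton comes with pairs $(E_1, F_1), \dots, (E_k, F_k)$ of state sets, and a run is accepting if, along every infinite path, whenever some $E_i$ is visited infinitely often then $F_i$ is visited infinitely often as well; this is the natural shape for liveness properties such as "every request is eventually granted".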
Akitoshi Kawamura and Florian Steinberg. Polynomial running times for polynomial-time oracle machines
Abstract: This paper introduces a more restrictive notion of feasibility of functionals on Baire space than the established one from second-order complexity theory. This makes it possible to consider functions on the natural numbers as running times of oracle Turing machines, and it avoids second-order polynomials, which are notoriously difficult to handle. Furthermore, all machines that witness this stronger kind of feasibility can be clocked, and the different traditions of treating partial operators from computable analysis and second-order complexity theory are equated in a precise sense. The new notion is named "strong polynomial-time computability" and proven to be a strictly stronger requirement than polynomial-time computability. It is proven that, within the framework for complexity of operators from analysis introduced by Kawamura and Cook, the classes of strongly polynomial-time computable operators and polynomial-time computable operators coincide.
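For scale: a second-order polynomial bounds the running time of an oracle machine in terms of both the input size $n$ and the size function $L$ of the oracle, as in $P(L, n) = L(L(n) + n) + n^2$ (our illustrative example); the notion proposed here lets an ordinary function on the natural numbers play the role of such a bound instead.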
Andrej Dudenhefner and Jakob Rehof. The Complexity of Principal Inhabitation
Abstract: It is shown that in the simply typed lambda-calculus the following decision problem of principal inhabitation is Pspace-complete: Given a simple type tau, does it have a normal principal inhabitant? A normal principal inhabitant for tau is a lambda-term N in beta-normal form such that tau is the principal type of N. While a Ben-Yelles style algorithm was presented by Broda and Damas in 1999 for the problem of counting normal principal inhabitants (thereby answering a question posed by Hindley), the question of complexity of principal inhabitation was left open. Because the counting algorithm relies on checking principality of explicitly enumerated terms (which may be exponentially large), it does not induce a Pspace upper bound. Moreover, the standard construction of the Pspace lower bound for inhabitation in simple types does not carry over immediately, since the types used in the reduction are not guaranteed to be principally inhabited. We present a polynomial space bounded algorithm based on a characterization of principal inhabitation using path derivation systems over subformulae of the input type, which does not require candidate inhabitants to be explicitly constructed. The lower bound is shown by reducing a Pspace-hard restriction of simple type inhabitation to principal inhabitation.
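A small example (ours) of the distinction at stake: $\lambda x.\lambda y.x$ is a normal principal inhabitant of $a \to b \to a$, whereas $a \to a \to a$, although inhabited, has no normal principal inhabitant: its only normal inhabitants are $\lambda x.\lambda y.x$ and $\lambda x.\lambda y.y$, whose principal types are $a \to b \to a$ and $a \to b \to b$ respectively.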
Pablo Barenbaum and Eduardo Bonelli. Optimality and the Linear Substitution Calculus
Abstract: We lift the theory of optimal reduction to a decomposition of the lambda calculus known as the \emph{Linear Substitution Calculus} (\lsc). \lsc\ decomposes $\beta$-reduction into finer steps that manipulate substitutions in two distinctive ways: it uses \emph{context rules} that allow substitutions to act ``at a distance'' and rewrites modulo a set of \emph{equations} that allow substitutions to ``float'' in a term. We propose a notion of redex family obtained by adapting L\'evy labels to support these two distinctive features. This is followed by a proof of the finite family developments theorem (FFD). We then apply FFD to prove an optimal reduction theorem for \lsc. Moreover, we apply FFD to deduce additional novel properties of \lsc, namely an algorithm for standardisation by selection and normalisation of a linear call-by-need reduction strategy. All results are proved in the axiomatic setting of Glauert and Khasidashvili's \emph{Deterministic Residual Structures}.
Pierre Vial and Delia Kesner. Types as Resources for Classical Natural Deduction
Abstract: We define resource-aware typing systems for the λμ-calculus based on non-idempotent intersection and union types. The non-idempotent approach provides very simple combinatorial arguments, based on decreasing measures of type derivations, to characterize head and strongly normalizing terms.
Benedikt Ahrens and Peter Lumsdaine. Displayed categories
Abstract: We introduce and develop the notion of displayed categories.

A displayed category over a category C is equivalent to "a category D and functor F : D -> C", but instead of having a single collection of "objects of D" with a map to the objects of C, the objects are given as a family indexed by objects of C, and similarly for the morphisms. This encapsulates a common way of building categories in practice, by starting with an existing category and adding extra data/properties to the objects and morphisms.

The interest of this seemingly trivial reformulation is that various properties of functors are more naturally defined as properties of the corresponding displayed categories. Fibrations, for example, when defined as certain functors, use equality on objects in their definition. When defined instead as a property of displayed categories, no reference to equality on objects is required. Moreover, almost all examples of fibrations in nature are, in fact, categories whose standard construction can be seen as going via displayed categories.

We therefore propose displayed categories as a basis for the development of fibrations in the type-theoretic setting, and similarly for various other notions whose classical definitions involve equality on objects.

Besides giving a conceptual clarification of such issues, displayed categories also provide a powerful tool in computer-formalisation, unifying and abstracting common constructions and proof-techniques of category theory. As such, most of the material of this article has been formalised in Coq over the UniMath library, with the aim of providing a practical library for use in further developments.
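In outline (our notation, not necessarily the paper's): a displayed category $D$ over $C$ assigns to each object $c$ of $C$ a family $D_c$ of objects over $c$, and to each morphism $f : a \to b$ of $C$ and objects $x \in D_a$, $y \in D_b$ a family $D_f(x, y)$ of morphisms over $f$, together with identities over identities and composition over composition, satisfying the category laws over those of $C$. The associated total category has pairs $(c, x \in D_c)$ as objects, recovering the usual "category $D$ with functor $F : D \to C$".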
Yohji Akama. The confluent terminating context-free substitutive rewriting system for the lambda-calculus with surjective pairing and terminal type
Abstract: For the lambda-calculus with surjective pairing and terminal type, Curien and Di Cosmo introduced a “Knuth-Bendix completion” of the naive rewriting system. Their system is a confluent (CR) context-free substitutive rewriting system. They left the strong normalization (SN) of their rewriting system open. By Girard's reducibility method, with a restricted reducibility theorem, we prove SN of the following three systems: (1) their rewriting system, (2) its extension by sum types and eta-like reductions, and (3) its extensions by polymorphism. We compare their system to context-sensitive expansions and the syntactic Karoubi envelope.
Colin Riba and Pierre Pradic. A Curry-Howard Approach to Church’s Synthesis
Abstract: Church's synthesis problem asks whether there exists a finite-state stream transducer satisfying a given input-output specification. For specifications written in Monadic Second-Order Logic over infinite words, Church's synthesis can theoretically be solved algorithmically using automata and games. We revisit Church's synthesis via the Curry-Howard correspondence by introducing SMSO, a non-classical subtheory of MSO, which is shown to be sound and complete w.r.t. synthesis thanks to an automata-based realizability model.
Simon Castellan, Pierre Clairambault and Glynn Winskel. Observably Deterministic Concurrent Strategies and Intensional Full Abstraction for Parallel-or
Abstract: Although Plotkin's parallel-or is inherently deterministic, it has a non-deterministic interpretation in games based on (prime) event structures -- in which an event has a unique causal history -- because they do not directly support disjunctive causality. General event structures can express disjunctive causality and have a more permissive notion of determinism, but do not support hiding. We show that (structures equivalent to) deterministic general event structures do support hiding, and construct a new category of games based on them with a deterministic interpretation of maPCF, an affine variant of PCF extended with parallel-or. We then exploit this deterministic interpretation to give a relaxed notion of determinism (observable determinism) on the plain event structures model. Putting this together with our previously introduced concurrent notions of well-bracketing and innocence, we obtain an intensionally fully abstract model of maPCF.
Stefano Guerrini and Marco Solieri. Is the optimal implementation inefficient? Elementarily not
Abstract: Sharing graphs are a local and asynchronous implementation of lambda-calculus beta-reduction (or linear logic proof-net cut-elimination) that avoids useless duplications. Empirical benchmarks suggest that they are one of the most efficient mechanisms when one wants to fully exploit the higher-order features of lambda-calculus. However, we still lack theoretically solid grounds to dispel uncertainties about the adoption of sharing graphs. Aiming to analyse in detail the worst-case overhead cost of sharing operators, we restrict to the case of elementary and light linear logic, two subsystems of multiplicative exponential linear logic with bounded computational complexity. In these two cases, the bookkeeping component is unnecessary, and sharing graphs simplify to the so-called "abstract algorithm". By a modular cost comparison over a syntactical simulation, we prove that possible inefficiencies of the shared reduction are quadratically bounded with respect to the cost of the naive implementation. This result generalises and strengthens a previous complexity result, and implies that the price of sharing is negligible compared to the benefits obtainable on reductions requiring a large amount of duplication.
Max Kanovich, Stepan Kuznetsov, Glyn Morrill and Andre Scedrov. A polynomial-time algorithm for the Lambek calculus with brackets of bounded order
Abstract: Lambek calculus is a logical foundation of categorial grammar, a linguistic paradigm of grammar as logic and parsing as deduction. Pentus (2010) gave a polynomial-time algorithm for determining provability of bounded depth formulas in L*, the Lambek calculus with empty antecedents allowed. Pentus' algorithm is based on tabularisation of proof nets. Lambek calculus with brackets is a conservative extension of Lambek calculus with bracket modalities, suitable for the modeling of syntactical domains. In this paper we give an algorithm for provability in Lb*, the Lambek calculus with brackets allowing empty antecedents. Our algorithm runs in polynomial time when both the formula depth and the bracket nesting depth are bounded. Our algorithm combines a Pentus-style tabularisation of proof nets, together with an automata-theoretic treatment of bracketing.
Paolo Pistone. Dinaturality between syntax and semantics
Abstract: We present two results which put dinaturality in between syntax (typability) and semantics (beta-eta-stable sets), and we combine them to obtain a completeness proof. First, we prove that closed dinatural lambda-terms are simply typable, that is, we prove the converse of the well-known fact that simply typable closed terms are dinatural. The argument exposes a syntactical aspect of dinaturality, as lambda-terms are type-checked by computing their associated dinaturality equation. Second, we prove that a closed lambda-term belonging to all beta-eta-stable interpretations of a simple type must be dinatural, that is, we prove dinaturality by semantic means. To do this, we show that such terms satisfy a suitable version of binary parametricity and we derive dinaturality from it. By combining the two results we provide a new proof of the completeness of the beta-eta-stable semantics with respect to simple types (this fact was already established in 1998 by Farkh and Nour by a direct argument). However, the technique developed here suggests that dinaturality might be applied to prove completeness also for other, less manageable, semantics (like saturated families or reducibility candidates) for which a direct argument is not yet known.
Benjamin Lichtman and Jan Hoffmann. Arrays and References in Resource Aware ML
Abstract: This article introduces a technique to accurately perform static prediction of resource usage for ML-like functional programs with references and arrays. Previous research successfully integrated the potential method of amortized analysis with a standard type system to automatically derive parametric resource bounds. The analysis is naturally compositional and the resource consumption of functions can be abstracted using potential-annotated types. The soundness theorem of the analysis guarantees that the derived bounds are correct with respect to the resource usage defined by a cost semantics. Type inference can be efficiently automated using off-the-shelf LP solvers, even if the derived bounds are polynomials. However, side effects and aliasing of heap references make it notoriously difficult to derive bounds that depend on mutable structures, such as arrays and references. As a result, existing automatic amortized analysis systems cannot derive bounds for programs whose resource consumption depends on data in such structures. This article extends the potential method to handle mutable structures with minimal changes to the type rules while preserving the stated advantages of amortized analysis. To do so, we introduce a potential-annotated memory typing, which gathers all unique locations reachable from a reference. Apart from the design of the system, the main contribution is the proof of soundness for the extended analysis system.
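The potential method the article builds on can be seen in miniature in the classic two-list queue (our example, not RaML syntax): assigning each element of the back list one unit of potential pays for the eventual reversal, making both operations amortized constant-time.

```haskell
-- Two-list queue: 'snoc' deposits one unit of potential per element
-- of the back list; 'pop' spends that stored potential when it
-- reverses the back list, so both operations are amortized O(1).
data Queue a = Queue [a] [a]

emptyQ :: Queue a
emptyQ = Queue [] []

snoc :: Queue a -> a -> Queue a
snoc (Queue front back) x = Queue front (x : back)

pop :: Queue a -> Maybe (a, Queue a)
pop (Queue (x : front) back) = Just (x, Queue front back)
pop (Queue [] back) =
  case reverse back of        -- paid for by the stored potential
    []          -> Nothing
    (x : front) -> Just (x, Queue front [])
```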
Patrick Bahr. Böhm Reduction in Infinitary Term Graph Rewriting Systems
Abstract: In analogy to infinitary term rewriting, we investigate so-called Böhm extensions of infinitary term graph rewriting systems, i.e. the extension of term graph rewriting systems by additional rules that contract meaningless term graphs to a fresh constant ⊥. Our main result is that partial order convergence in a term graph rewriting system can be truthfully and faithfully simulated by metric convergence in the Böhm extension of the system. To prove this result we generalise the notion of residuals and projections to the setting of infinitary term graph rewriting. As ancillary results we prove the infinitary strip lemma and the compression property, both for partial order and metric convergence.
Jonas Kaiser, Brigitte Pientka and Gert Smolka. Relating System F and $\lambda$2: A Case Study in Coq, Abella and Beluga
Abstract: We give three formalisations of a proof of the equivalence of the usual, two-sorted presentation of System F and its single-sorted pure type system (PTS) variant $\lambda$2. This is established by reducing the typability problem of F to $\lambda$2 and vice versa. A key challenge is the treatment of variable binding and contextual information. The formalisations all share the same high level proof structure using relations to connect the type systems. They do, however, differ significantly in their representation and manipulation of variables and contextual information. In Coq, we use pure de Bruijn indices and parallel substitutions. In Abella, we use higher-order abstract syntax (HOAS) and nominal constants of the ambient reasoning logic. In Beluga, we also use HOAS but within contextual modal type theory. Our contribution is twofold. First, we present and compare a collection of machine-checked solutions to a non-trivial theoretical result. Second, we propose our proof as a benchmark, complementing the POPLmark and ORBI challenges by testing how well a given proof assistant or framework handles complex contextual information involving multiple type systems.
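The flavour of the correspondence: the two-sorted judgement $\vdash \Lambda\alpha.\,\lambda x{:}\alpha.\,x : \forall\alpha.\,\alpha \to \alpha$ of System F matches the single-sorted PTS judgement $\vdash \lambda\alpha{:}{*}.\,\lambda x{:}\alpha.\,x : \Pi\alpha{:}{*}.\,\alpha \to \alpha$ of $\lambda$2, where type abstraction and quantification become ordinary abstraction and product over the sort $*$.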
Nao Hirokawa, Aart Middeldorp, Christian Sternagel and Sarah Winkler. Infinite Runs in Abstract Completion
Abstract: Completion is one of the first and most studied techniques in term rewriting and fundamental to automated reasoning with equalities. In an earlier paper we presented a new and formalized correctness proof of abstract completion for finite runs. In this paper we extend our analysis and our formalization to infinite runs, resulting in a new proof that fair infinite runs produce complete presentations of the initial equations. We further consider ordered completion -- an important extension of completion that aims to produce ground-complete presentations of the initial equations. Moreover, we revisit and extend results of Métivier concerning canonicity of rewrite systems. All proofs presented in the paper have been formalized in Isabelle/HOL.
Robin Cockett and Jean-Simon Lemay. There is only one notion of differentiation
Abstract: In an influential paper published in 2007, Marcelo Fiore introduced a creation operator to provide a notion of differentiation for linear logic and claimed that it was novel and distinct from the earlier notion of a deriving transformation. This paper analyzes the relationship between the notions of differentiation provided by a codereliction transformation and a deriving transformation. We show that for any monoidal coalgebra modality the two notions are equivalent. This shows that a creation operator, contrary to Fiore's claim, is completely equivalent to a deriving transformation.

This suggests the view that there is only one notion of differentiation but, as one varies the setting, it admits rather different presentations.
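Concretely (our rough rendering of the types involved): a deriving transformation has type $\mathsf{d}_A : {!}A \otimes A \to {!}A$ and a codereliction has type $\bar{\eta}_A : A \to {!}A$; for a monoidal coalgebra modality, with monoid structure $\nabla : {!}A \otimes {!}A \to {!}A$ and unit $u : I \to {!}A$, the two are interdefinable by $\mathsf{d} = \nabla \circ (\mathrm{id} \otimes \bar{\eta})$ and $\bar{\eta} = \mathsf{d} \circ (u \otimes \mathrm{id})$, up to the unit isomorphism.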
Takahito Aoto, Yoshihito Toyama and Yuta Kimura. Improving Rewriting Induction Approach for Proving Ground Confluence
Abstract: In (Aoto & Toyama, FSCD 2016), a method to prove ground confluence of many-sorted term rewriting systems based on rewriting induction is given. In this paper, we give several methods that add wider flexibility to the rewriting induction approach for proving ground confluence. Firstly, we give a method to deal with the case in which suitable rules are not present in the input system. Our idea is to construct additional rewrite rules that supplement or replace existing rules in order to obtain a set of rules that is adequate for applying rewriting induction. Secondly, we give a method to deal with non-orientable constructor rules. This is accomplished by extending the inference system of rewriting induction and giving a sufficient criterion for the correctness of the extended system. Thirdly, we give a method for disproving ground confluence. The presented methods are implemented in our ground confluence prover AGCP and experiments are reported. Our experiments reveal that the presented methods are effective on problems that state-of-the-art ground confluence provers cannot handle.
Daniel R. Licata, Michael Shulman and Mitchell Riley. A Fibrational Framework for Substructural and Modal Logics
Abstract: Many intuitionistic substructural and modal logics / type theories can be seen as a restriction on the allowed proofs in a standard structural logic / lambda-calculus. For example, substructural logics remove properties such as associativity, weakening, exchange, and contraction, while modal logics place restrictions on the positions in which certain variables can be used. These restrictions are usually enforced by a specific context structure (trees, lists, multisets, sets, dual zones, ...) that products, implications, and modalities manipulate. While the general technique is clear, it can be difficult to devise rules modeling a new situation, a problem we have recently run into while extending homotopy type theory to express additional mathematical applications.

In this paper, we define a general framework that abstracts the common features of many intuitionistic substructural and modal logics. The framework is a sequent calculus / normal-form type theory parametrized by a _mode theory_, which is used to describe the structure of contexts and the structural properties they obey. The framework makes use of resource annotations, where we pair the context itself, which obeys standard structural properties, with a term, drawn from the mode theory, that constrains how the context can be used. Product types, implications, and modalities are defined as instances of two general connectives, one positive and one negative, that manipulate these resource annotations. We show that specific mode theories can express non-associative, ordered, linear, affine, relevant, and cartesian products and implications; monoidal and non-monoidal comonads and adjunctions; strong and non-strong monads; n-linear variables; bunched implications; and the adjunctions that arose in our work on homotopy type theory. We prove cut (and identity) admissibility independently of the mode theory, obtaining it for all of the above logics at once. Further, we give a general equational theory on derivations / terms that, in addition to the usual beta-eta rules, characterizes when two derivations differ only by the placement of structural rules. Finally, we give an equivalent semantic presentation of these ideas, in which a mode theory corresponds to a 2-dimensional cartesian multicategory, and the framework corresponds to another such multicategory with a functor to the mode theory. The logical connectives have universal properties relative to this functor, making it into a bifibration. The sequent calculus rules and the equational theory on derivations are sound and complete for this. The resulting framework can be used both to understand existing logics / type theories and to design new ones.
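Roughly (our paraphrase): sequents in the framework take the form $x_1{:}A_1, \dots, x_n{:}A_n \vdash A\ [\alpha]$, where the annotation $\alpha$ is a mode-theory term over the variables $x_1, \dots, x_n$ describing how the context may be used; for instance, a linear product requires $\alpha$ to use each variable exactly once, while a cartesian one lets $\alpha$ duplicate and discard variables.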
Tadeusz Litak, Miriam Polzer and Ulrich Rabenstein. Negative Translations and Normal Modality
Abstract: We discuss the behaviour of variants of the standard negative translations (Kolmogorov, Gödel-Gentzen, Kuroda and Glivenko) in extensions of the propositional calculus with a unary normal modality and additional axioms. More specifically, we address the question whether negative translations as a rule embed faithfully a classical modal logic into its intuitionistic counterpart. As it turns out, even the Kolmogorov translation can go wrong with rather natural modal principles. Nevertheless, we isolate sufficient syntactic criteria ensuring adequacy of well-behaved (or, in our terminology, "regular") translations for logics axiomatized with formulas satisfying these criteria, which we call "enveloped implications". Furthermore, a large class of computationally relevant modal logics, namely logics of type inhabitation for applicative functors (a.k.a. "idioms"), turns out to validate the modal counterpart of the Double Negation Shift, thus ensuring adequacy of even the Glivenko translation. All our positive results are proved purely syntactically, using the minimal natural deduction system of Bellin, de Paiva and Ritter extended with additional axioms/combinators, hence also allowing a direct formalization in a proof assistant (in our case Coq).
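For reference, the Gödel-Gentzen translation on propositional formulas sends an atom $p$ to $\neg\neg p$, commutes with $\wedge$, $\to$ and $\bot$, and sets $(A \vee B)^N = \neg(\neg A^N \wedge \neg B^N)$; the natural modal clause, stated here as the evident choice rather than the paper's exact definition, is $(\Box A)^N = \Box A^N$.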
Daniel Hillerström, Sam Lindley, Robert Atkey and Kc Sivaramakrishnan. Continuation Passing Style for Effect Handlers
Abstract: We study Continuation Passing Style (CPS) translations for Plotkin and Pretnar's effect handlers based on Hillerström and Lindley's row-typed fine-grain call-by-value calculus of effect handlers. We begin with a first-order CPS translation into untyped lambda calculus which manages a stack of continuations and handlers as a curried sequence of arguments.

We refine the initial CPS translation first by uncurrying it to yield a properly tail-recursive translation and second by making it higher-order in order to contract administrative redexes at translation time. We prove that the higher-order CPS translation simulates effect handler reduction. We have implemented the higher-order CPS translation as a backend for the Links web programming language.
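A minimal Haskell sketch (ours; it reifies continuations in the style of a free monad rather than implementing the paper's typed CPS translation) of the mechanism being translated: an operation hands its continuation to the enclosing handler, which may invoke it any number of times.

```haskell
-- A computation either returns, or performs an operation together
-- with its (reified) continuation; operation results are fixed to
-- Int for simplicity.
data Comp op a = Return a | Perform op (Int -> Comp op a)

-- A handler supplies a return clause and an operation clause; the
-- operation clause receives the continuation k explicitly.
handle :: (a -> b) -> (op -> (Int -> b) -> b) -> Comp op a -> b
handle ret _ (Return x)     = ret x
handle ret h (Perform op k) = h op (\n -> handle ret h (k n))

-- Nondeterministic choice, handled by resuming the continuation
-- twice and collecting both outcomes:
data Choice = Choose

allResults :: Comp Choice a -> [a]
allResults = handle (\x -> [x]) (\Choose k -> k 0 ++ k 1)

-- allResults (Perform Choose Return) == [0, 1]
```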