Chapters 6 and 7 deal with expansion calculi that exploit the fact that a design specification is ground confluent. First, directed expansion restricts paramodulation to left-to-right applications of prefix extensions of equational axioms (cf. Chapter 4). Narrowing (cf. Sect. 7.2) goes a step further and confines the input of expansion rules to pure axioms. Reductive expansion provides an alternative to inductive expansion; it originates from the idea of proving inductive validity by proving consistency (cf. Sect. 3.4) and reducing consistency to ground confluence (cf. Sects. 7.4 and 7.5).
Variables of a Horn clause that occur in its premise but not in (the left-hand side of) its conclusion are called fresh variables. They are usually banished as soon as one turns to a reduction calculus. This restriction cannot be maintained when arbitrary declarative programs are to be treated: if one follows the decomposition principle (cf. Sect. 2.6), then fresh variables are created automatically. They are usually forbidden because they violate the customary condition that there be a Noetherian reduction ordering, i.e., one ensuring that the reduction calculus admits only finite derivations. We shall see in Sect. 6.2 that other conditions on a reduction ordering can be weakened so that the Noetherian property is often preserved even if fresh variables are permitted.
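To make the notion concrete, the following sketch (ours, not from the text; the term encoding and the example clause are illustrative assumptions) computes the fresh variables of a Horn clause:

```python
# Illustrative sketch: terms are nested tuples whose strings with an
# uppercase first letter play the role of variables; a Horn clause is
# given by the left-hand side of its conclusion and its premise atoms.

def is_var(t):
    return isinstance(t, str) and t[:1].isupper()

def variables(term):
    """Collect all variables occurring in a term."""
    if is_var(term):
        return {term}
    if isinstance(term, tuple):
        return set().union(*map(variables, term)) if term else set()
    return set()

def fresh_variables(conclusion_lhs, premise):
    """Variables occurring in the premise but not in the left-hand
    side of the conclusion."""
    prem = set().union(*map(variables, premise)) if premise else set()
    return prem - variables(conclusion_lhs)

# A clause in the style produced by decomposition, e.g.
#   div(X, Y) = Q  <=  mul(Q, Y) = R  &  sub(X, R) = S  &  lt(S, Y)
# Q, R and S are fresh: they occur in the premise but not in the
# left-hand side div(X, Y) of the conclusion.
lhs = ("div", "X", "Y")
premise = [("eq", ("mul", "Q", "Y"), "R"),
           ("eq", ("sub", "X", "R"), "S"),
           ("lt", "S", "Y")]
print(sorted(fresh_variables(lhs, premise)))  # ['Q', 'R', 'S']
```

The example clause illustrates how decomposing a defined function (here a hypothetical division) into auxiliary calls introduces premise-only variables automatically.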
This chapter is about setting up flat algebraic specifications. This involves the introduction of more COLD-K constructs and the formulation of various methodological guidelines. At the end of the previous chapter we had to conclude that we almost succeeded in specifying the natural numbers, the only problem being that the expressive power needed to state the minimality of Nat was lacking. This expressive power will be available once we have introduced the inductive predicate definitions below. We shall complete the example of the natural numbers and investigate various technical aspects of inductive definitions, which unfortunately are quite non-trivial. In addition to inductive predicate definitions, we shall also have inductive function definitions. We address issues like ‘proof obligations’ for inductive definitions, consistency and completeness. Finally, we give a number of complete examples of flat algebraic specifications: queues, stacks, bags and symbolic expressions.
Inductive predicate definitions
An inductive predicate definition defines a predicate as the least predicate satisfying some assertion (provided that this predicate exists). Before turning our attention to the syntactic machinery available in COLD-K for expressing this, we ought to explain this notion of ‘least’. Therefore we shall formulate what we mean by one predicate being ‘less than or equal to’ another predicate.
Definition. A predicate r is less than or equal to a predicate q if for each argument x we have that r(x) implies q(x).
We illustrate this definition by means of two unary predicates p1 and p2. We assume the sort Nat with its operations zero and succ as specified before.
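The Definition can be sketched as follows (our own illustration, not COLD-K: Nat is modelled by Python's non-negative integers, and the quantification over all arguments is only checked on a finite prefix, so this is a semi-check rather than a proof):

```python
# Two unary predicates on the naturals, compared with the pointwise
# ordering of the Definition: r <= q iff r(x) implies q(x) for each x.

def p1(n):
    """Holds exactly for the even naturals: zero, succ(succ(zero)), ..."""
    return n % 2 == 0

def p2(n):
    """Holds for every natural number."""
    return True

def less_or_equal(r, q, bound=1000):
    """r is less than or equal to q if r(x) implies q(x) for each
    argument x (here tested only for x = 0, ..., bound - 1)."""
    return all((not r(x)) or q(x) for x in range(bound))

print(less_or_equal(p1, p2))  # True:  p1(x) always implies p2(x)
print(less_or_equal(p2, p1))  # False: p2(1) holds, but p1(1) does not
```

With respect to this ordering, an inductive predicate definition singles out the least predicate satisfying the defining assertion.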
The cut calculus for Horn clauses is simple, but rather inefficient as the basis of a theorem prover. To prove a goal γ via this calculus means to derive γ from axioms (those of the specification and congruence axioms for equality predicates) using CUT and SUB (cf. Sect. 1.2). In contrast, the inference rules resolution (cf. [Rob65]) and paramodulation (cf. [RW69]) allow us to start out from γ and apply axioms for transforming γ into the empty goal ∅. The actual purpose of resolution and paramodulation is to compute goal solutions (cf. Sect. 1.2): If γ can be transformed into ∅, then γ is solvable. The derivation process involves constructing a solution f, and ∅ indicates the validity of γ[f].
A single derivation step from γ to δ via resolution or paramodulation proves the clause γ[g]⇐δ for some g. Since γ[g] is the conclusion of a Horn clause which, viewed as a logic program, is expanded (into γ), we call such derivations expansions. More precisely, the rules are input resolution and paramodulation, where one of the two clauses to be transformed stems from an “input” set of axioms or, in the case of inductive expansion (cf. Chapter 5), arbitrary lemmas or induction hypotheses.
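The shape of such a step can be rendered as a toy sketch (ours, with several simplifications the book does not make: terms are nested tuples, uppercase-initial strings are variables, equations are wrapped in an "eq" atom, the axiom is treated as a unit clause, and the occurs check is omitted):

```python
# One input-resolution step: unify an atom of the goal with the head of
# an input clause, replace it by the clause's body, and apply the
# computed unifier to the whole goal.

def is_var(t):
    return isinstance(t, str) and t[:1].isupper()

def subst(s, t):
    """Apply substitution s (a dict) to term t."""
    if is_var(t):
        return subst(s, s[t]) if t in s else t
    if isinstance(t, tuple):
        return tuple(subst(s, u) for u in t)
    return t

def unify(a, b, s=None):
    """Most general unifier of a and b extending s, or None
    (occurs check omitted for brevity)."""
    s = dict(s or {})
    a, b = subst(s, a), subst(s, b)
    if a == b:
        return s
    if is_var(a):
        s[a] = b
        return s
    if is_var(b):
        s[b] = a
        return s
    if isinstance(a, tuple) and isinstance(b, tuple) and len(a) == len(b):
        for u, v in zip(a, b):
            s = unify(u, v, s)
            if s is None:
                return None
        return s
    return None

def resolve(goal, i, clause):
    """Resolve the i-th atom of the goal against clause (head, body)."""
    head, body = clause
    s = unify(goal[i], head)
    if s is None:
        return None
    new_goal = list(goal[:i]) + list(body) + list(goal[i + 1:])
    return [subst(s, atom) for atom in new_goal], s

# Axiom (empty premise):  add(succ(X), Y) = succ(add(X, Y))
axiom = (("eq", ("add", ("succ", "X"), "Y"), ("succ", ("add", "X", "Y"))), ())
# Goal: solve  add(succ(zero), zero) = Z  for Z.
goal = [("eq", ("add", ("succ", "zero"), "zero"), "Z")]
new_goal, s = resolve(goal, 0, axiom)
print(new_goal)       # []  -- the empty goal: the step succeeded
print(subst(s, "Z"))  # ('succ', ('add', 'zero', 'zero'))
```

Since the axiom's premise is empty, a single step already yields the empty goal ∅, and the unifier delivers the solution for Z, illustrating how goal solutions are computed along the way.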
While input resolution is always “solution complete”, input paramodulation has this property only if the input set includes all functionally-reflexive axioms, i.e., equations of the form Fx≡Fx (cf., e.g., [Höl89]).
EXPANDER is a proof support system for reasoning about data type specifications and declarative programs. EXPANDER applies the rules of inductive expansion (cf. Chapter 5) to correctness conditions that are given as single Gentzen clauses or sets of guarded Horn clauses (cf. Chapter 2). The system provides a kernel for special-purpose theorem provers, which are tailored to restricted application areas and implement specific proof plans, strategies or tactics. It is written in the functional language SML/NJ.
EXPANDER executes single inference steps. Each proof is a sequence of goal sets, and the user has full control over the proof process: one may backtrack within the sequence, interactively modify the underlying specification, and add lemmas or induction orderings suggested by the subgoals obtained so far. When a proof is finished, the system can generate the theorems actually proved and, if necessary, the remaining subconjectures.
We first describe the kind of specifications that can be processed, then present the commands currently provided and, finally, document the implementation. The latter serves to illustrate the suitability of functional languages for encoding deductive methods.
The specifications
Specifications to be processed by EXPANDER are generated by the following context-free grammar in extended Backus-Naur form, i.e., [_], * and | denote the usual operators for building regular expressions (option, repetition and alternative, respectively). Keywords are enclosed in “…”.
Part I introduced a collection of notations and techniques for algebraic specifications. These notations and techniques are relatively close to usual mathematics. As already shown by the examples of Part I, algebraic specifications suffice to describe a wide range of data types (Booleans, numbers, sets, bags, sequences, tuples, maps, stacks, queues, etc.), and they can even be used to describe the syntax and semantics of languages, the rules and strategies of games, and many more non-trivial aspects of complex systems. Of course there are some differences between the notations and techniques of Part I and usual mathematics, like the restriction to first-order predicate logic with inductive definitions, the special way of treating partial functions and undefinedness and, most of all, the modularisation constructs. The latter difference reveals that COLD-K has its roots in software engineering and systems engineering, rather than in general purpose mathematics.
There is one more phenomenon which is characteristic for many branches of software engineering and systems engineering: special provisions for describing state-based systems.
How do we benefit from ground confluent specifications? Most of the advantages follow from Thm. 6.5: directed expansions yield all ground AX-solutions if and only if (SIG, AX) is ground confluent. Sects. 7.1 and 7.2 deal with refinements of directed expansion: strategic expansion and narrowing. Sect. 7.3 presents syntactic criteria for a set of terms to be a set of constructors (cf. Sect. 2.3). The results obtained in Sects. 7.2 and 7.3 provide the failure rule and the clash rule, which check goals for unsolvability and thus help to shorten every kind of expansion proof (see the final remarks of Sect. 5.4).
Sect. 7.4 deals with the proof of a set CS of inductive theorems by showing that (SIG,AX∪CS) is consistent w.r.t. (SIG,AX) (cf. Sect. 3.4). Using consequences of the basic equivalence between consistency and inductive validity (Lemma 7.9) we come up with reductive expansion, which combines goal reduction and subreductive expansion (cf. Sect. 6.4) into a method for proving inductive theorems. While inductive expansion is always sound, the correctness of reductive expansion depends on ground confluence and strong termination of (SIG,AX). Under these conditions, an inductive expansion can always be turned into a reductive expansion (Thm. 7.18). Conversely, a reductive expansion can be transformed in such a way that most of its “boundary conditions” hold true automatically (Thm. 7.19).
The chapter will close with a deduction-oriented concept for specification refinements, or algebraic implementations.
This book is about formal specification and design techniques, including both algebraic specifications and state-based specifications.
The construction and maintenance of complex software systems is a difficult task, and although many software projects are started with great expectations and enthusiasm, it is too often the case that they fail to achieve their goals within the planned time and with the given resources. The software often contains errors; attempts to eliminate the errors give rise to new errors, and so on. Moreover, the extension and adaptation of the software to new tasks turns out to be a difficult and tedious undertaking, one that seems to resist scientific methods.
This unsatisfactory situation can be improved by introducing precise specifications of the software and its constituent parts. When a piece of software P has a precise specification, S say, then ‘P satisfies S’ is a clear statement that can be verified by reasoning or falsified by testing; users of P can read S and rely on it, and the designer of P has a clearly formulated task. When no precise specifications are available, there are hardly any clear statements at all, for what could one say: ‘it works’, or more often ‘it almost works’? Without precise specifications it becomes very difficult to analyse the consequences of modifying P into P', for example, and to make any clear statements about that modification. Therefore it is worthwhile during the software development process to invest in constructing precise specifications of well-chosen parts of the software system under construction. Writing precise specifications turns out to be a considerable task in itself.
The conception, construction, maintenance and usage of computer-based systems are difficult tasks requiring special care, skills, methods and tools. Program correctness is a serious issue and in addition to that, the size of the programs gives rise to problems of complexity management. Computers are powerful machines which can execute millions of instructions per second and manipulate millions of memory cells. The freedom offered by the machine to its programmer is large; often it is too large, in the sense that the machine does not enforce order and structure upon the programs. Computer-based systems are artificial systems and therefore there are no natural system partitionings and interface definitions. All structure is man-made and all interfaces must be agreed upon and communicated to all parties involved. The description and communication of system structures and interfaces turns out to be a non-trivial task and ‘specification languages’ have become an active area of research and development in computer science. When discussing ‘language’ we must distinguish explicitly between syntactic objects and semantic objects. Wittgenstein has expressed this idea as follows:
Der Satz stellt das Bestehen und Nichtbestehen der Sachverhalte dar,
i.e. the proposition represents the existence or non-existence of certain states of affairs. The propositions are syntactic objects and in this text we shall call them specifications. To describe a state of affairs concerning the natural world and concerning human interaction, natural language is the tool par excellence; to describe a state of affairs concerning computer-based systems, special languages are required in addition to that. The situation is typical: special restricted domains require special languages and this is also the case for the domain of computer-based systems.