We shall give only an outline of the proof of this theorem because the proof does not present any difficulty in principle and is rather long.
Kurt Gödel [Göd67b]
The climactic step in Gödel's proof of the incompleteness theorem is the construction of a sentence asserting its own unprovability. Since provability is a metatheoretic notion, the construction of such a sentence within Z2 requires that some part of the metatheory of Z2 be represented within Z2 itself. The main result of this chapter is an outline of the proof of the representability of the metatheory of Z2 (as described in Chapter 2) within Z2. The formalization of the representability theorem is perhaps the single most substantial and difficult part of our mechanical verification. The machinery of derived inference rules developed in Chapter 3 is heavily exploited in the representability proofs. The representability theorem is used in the construction of the undecidable sentence in Chapter 5.
The contents of this and the succeeding chapter are quite complicated since they define and use a seemingly unmanageable variety of encodings. One encoding represents the syntax of Z2 expressions using only numbers and the pairing operation of Lisp, and another encoding represents these latter Lisp data structures as sets in Z2. The representability theorem then shows the correspondence (under the encoding) between the metatheoretic Lisp computations and certain kinds of proofs in Z2.
A Brief Overview
We saw in Chapter 2 how the metatheory of Z2 could be formalized in terms of Lisp functions such as Collect-Free, Subst, Prf, and Proves.
It is well known that the development of mathematics in the direction of greater precision has led to the formalization of extensive mathematical domains, in the sense that proofs can be carried out according to a few mechanical rules.
Kurt Gödel [Göd67b]
Thus began Gödel's 1931 paper [Göd67b] in which he demonstrated the existence of formally undecidable sentences in a wide class of formal systems. The first part of this book describes a formalization and proof of this theorem that was carried out according to a “few mechanical rules.” The proof here is not a direct mechanization of any particular previously published version of the incompleteness proof. The statement of the theorem differs from Gödel's original statement, which involved the notion of ω-consistency. The statement here involves only the weaker notion of consistency; this form of the theorem was first proved by Rosser [Ros36]. The theorem we establish asserts the incompleteness of Cohen's Z2 [Coh66]. The first-order logic of Z2 is taken from that of Shoenfield [Sho67]. Various metatheorems about Z2 are formalized and proved using the Boyer–Moore theorem prover.
This chapter presents a complete description of the formal statement of the incompleteness theorem. The main part of this description is the definition of the metatheory of Z2 in terms of a Lisp representation for the formulas and proofs of Z2, and a Lisp predicate that checks the validity of Z2 proofs represented in this manner. The representation of the symbols (variables, functions, predicates) and expressions (terms, atomic formulas, negation, disjunction, quantification) is first described. Various operations are defined for checking the well-formedness of expressions along with the important operation of substitution.
A: Oh, it can't be proved, but nevertheless it's true.
B: Now just a minute: How can you say it's true if it can't be proved?
A: Oh, there are certain things that are true even though they can't be proved.
B: That's not true!
A: Yes, it is; Gödel proved that there are certain things that are true but that cannot be proved.
B: That's not true!
A: It certainly is!
B: It couldn't be, and even if it were true, it could never be proved!
Goodwin Sammel quoted by Raymond Smullyan [Smu83]
Electronic computers are mostly used to automate clerical tasks, but it is well known that they can also be programmed to play chess, compose music, and prove mathematical theorems. Since logical, mathematical reasoning is one of the purer forms of human, intellectual thought, the automation of such reasoning by means of electronic computers is a basic and challenging scientific problem. This book is about how computers can be used to construct and check mathematical proofs. This use of computers relies on their symbolic and deductive capabilities and is radically different from their use as fast numerical calculators. The impact of computers as intellectual, rather than clerical, tools is likely to be quite profound. This book tries to demonstrate in a concrete way that the capabilities of computing machines in this regard are quite powerful. It describes proofs of some significant theorems of mathematical logic that were completely checked by means of a computer.
This chapter explains the concept of a structure as a collection of variables, this collection including nested structures if desired.
A structure can be handled in much the same way as a variable; you can copy a structure, assign to a structure, take the address of a structure with &, access the members of a structure. You may declare arrays of structures. You can nominate a structure as a parameter of a function or write a function that returns a structure.
This chapter introduces structures by analogy with arrays. The operators for accessing members are defined and their use explained. Concepts are illustrated by an example of a library list in which to search for a book if you know its title or can remember only part of its title.
Unions and bitfields are introduced (a union is a structure in which members share storage space).
Having described structures and unions it is possible to define, fully, the syntax terms type and declaration. The allowable syntax of declaration differs according to context, so a separate diagram is drawn for each context.
Finally the chapter explains the idea of stacks and gives an example of their use in converting simple algebraic expressions to reverse Polish notation.
INTRODUCING STRUCTURES
Much of information handling is about updating and sorting lists of names and addresses. With each name and address may come other information: an amount of money owing, a list of diseases survived, a code indicating the subject's purchasing power or likelihood of signing an order.
This chapter explains the shortcomings of arrays having pre-determined length and the consequent need for dynamic storage. The use of library functions for allocating and freeing memory is then explained.
The concept of a linked list is introduced with particular reference to stacks. The concept is then extended to cover rings and binary trees.
Examples in this chapter include a program for demonstrating ring structures and a program for finding the shortest route through a road network. Finally there is a sorting program based on the monkey puzzle technique.
MEMORY ALLOCATION
The trouble with arrays is that the processor has to be told what space to allocate to each of them before execution starts. There are many applications for which it is impossible to know the detailed memory requirements in advance. What we need is a scheme in which the processor allocates memory during execution and on demand. A program written to use such a scheme fails only if demands on total memory exceed supply, not if one of many individual arrays exceeds its bounds.
The scheme is called ‘dynamic storage.’ For practical purposes it is based on structures. When the program needs a new structure it creates one from a ‘heap’ of unstructured memory. When a program has finished with a structure it frees the memory from which that structure was built, tossing it back on the heap.
This is probably the most important chapter in the book; the art of C is handling pointers. Pointers are closely associated with arrays, and arrays with strings.
The chapter begins by explaining the concept of a pointer and defines two operators, * and &, with which to declare and manipulate pointers.
Because C works on the principle of ‘call by value’ you cannot return values from functions by altering the values stored in their parameters. But you can use pointers as parameters and make functions alter the contents of the objects they point to. This concept may appear tricky at first, but glorious when you can handle it confidently. The chapter spends some time on this concept.
When you add 2 to a pointer into an array, the pointer then points to the element two further along, regardless of the length of element. This is a property of pointer arithmetic, the subject next described in this chapter.
Most pointers point to objects, but you can make them point to functions as well. The chapter shows the correspondence between pointers to arrays and pointers to functions. You may care to skip this topic on first reading; likewise the next which analyses the structure of complex declarations. Complex declarations are easy to understand once you have felt the need to set up a data structure in which pointers point to other pointers.
This chapter defines most of the basic components of C. Their syntax is defined using a pictorial notation. Characters, names and constants (the simple building blocks) are defined first. Important principles of the language are next explained; these include the concept of scalar ‘types’, the precedence and associativity of ‘operators’, the concepts of ‘coercion’ and ‘promotion’ in expressions of mixed type.
The operators are summarized on a single page for ease of reference.
The syntax of expressions and statements is defined in this chapter. Declarations are discussed, but their syntax is not defined because it involves the concept of pointers and dynamic storage. These topics are left to later chapters.
NOTATION
For a precise definition of the syntax of ANSI C, see the definitions in ANSI X3.159. These are expressed in BNF (Backus–Naur Form).
To appreciate the syntactical form of an entity the practical programmer needs something different; BNF is not a self-evident notation. Some books employ railway track diagrams, potentially easier to comprehend than BNF, but the tracks grow too complicated for defining structures in C. So I have devised a pictorial notation from which a programmer should be able to appreciate syntactical forms at a glance. The notation is fairly rigorous but needs a little help from notes here and there.
The original C programming language was devised by Dennis Ritchie. The first book on C, by Kernighan and Ritchie, came out in 1978 and remained the most authoritative and best book on the subject until their second edition, describing ANSI standard C, appeared in 1988. In all that time, and since, the availability and use of C has increased exponentially. It is now one of the most widely used programming languages, not only for writing computer systems but also for developing applications.
There are many books on C but not so many on ANSI standard C which is the version described here.
This book attempts three things:
to serve as a text book for introductory courses on C aimed both at those who already know a computer language and at those entirely new to computing
to summarize and present the syntax and grammar of C by diagrams and tables, making this a useful reference book on C
to illustrate a few essential programming techniques such as symbol state tables, linked lists, binary trees, doubly linked rings, manipulation of strings, parsing of algebraic expressions.
For a formal appreciation of C – its power, its advantages and disadvantages – see the references given in the Bibliography. As an informal appreciation: all those I know who program in C find the language likeable and enjoy its power. Programming C is like driving a fast and powerful car. Having learned to handle the car safely you would not willingly return to the family saloon.