Previous chapters have shown that formal dialogue models of argumentation, along with tools from AI such as the Carneades Argumentation System and the abstract argumentation framework, apply very well to modeling burden of proof and presumption in the well-organized, rule-directed framework of a legal trial. The objection posed in Chapter 1 was that the legal concepts of burden of proof and presumption have been illicitly transferred from the legal setting to public policy discussions and other arenas where the argumentation is not structured in the same way it is in a legal setting. Chapter 8 takes up this challenge by arguing that the formal dialectical framework of Chapter 4 can be usefully applied to modeling burden of proof and presumption in these other settings. This argument does not, of course, claim that burden of proof and presumption work in these other settings in every respect as they work in law. It means only that the methods used to reason about evidence in the common law system present an outline of reasonable but defeasible argumentation with some important features, represented in the dialogue models of burden of proof and presumption presented in the previous chapters of this book, and that these features can be adapted and helpfully applied to settings of argumentation outside law.
Of course, there are many different settings in which argumentation is used as a means of proving a hypothesis, settling a conflict of opinions based on the evidence brought forward, or arriving at a rational decision in a deliberation about what to do in a situation requiring a choice, as indicated in Chapter 7. For the purposes of this book, however, two settings are of main interest in relation to the issue of transferability of the legal notions of presumption and burden of proof. One is organized disputation of the kind represented by a forensic debate or, to cite a specific example, a presidential debate that has a moderator and is televised to a large audience.
A presumption is a device used in the law of evidence to enable a proposition to be taken into account as a piece of evidence in a case even though the argument supporting that proposition is not strong enough for it to meet a required burden of proof. From this definition of what presumption is, we can already see that presumption is linked to burden of proof in evidential reasoning in law. Burden of proof sets a standard for what is to be considered a proof in evidential reasoning in law. It is a device used to make it possible for a trial to arrive at a decision for one side or another in a contested case, even though all the facts of the case may not be known, and for various reasons may never be known. For example, in a criminal case, there may have been no witnesses to the crime, and the crime may have happened a long time ago. Most of the existing evidence may have been lost or destroyed. Therefore, evidential reasoning in law has to be able to move forward to a conclusion under conditions of uncertainty, lack of knowledge and even inconsistency. Typically, for example, in a trial there will be witnesses for one side, but there will also be conflicting testimony on the other side brought in by witnesses who say the opposite thing. What these conditions imply is that in a trial it is rarely if ever possible to prove or disprove the ultimate conclusion beyond all doubt. Hence, the device of having a burden of proof is necessary for the trial to reach a conclusion for one side or the other.
Presumption is not a new notion in legal reasoning. It was a device used in the ancient Jewish law code of the Talmud, and in ancient Roman law. A rough idea of how presumptions work can be given by citing some of the more common examples. According to the presumption of death, a person who has not been heard from for a fixed period of time, varying with the jurisdiction (typically five years in common law), may be presumed dead if there is no other evidence-based explanation of his or her disappearance. Later in this book we will examine an example of another interesting kind of presumption called the presumption of mailing, which presumes that a properly addressed and stamped letter sent by the Postal Service was received by the person to whom it was addressed.
Most of the literature on burden of proof in argumentation studies and AI has concentrated so far on the persuasion type of dialogue. This concentration is natural enough, because the bulk of this literature has concentrated on burden of proof in legal argumentation. The most significant exception is deliberation dialogue, where some recent work has begun to tentatively investigate burden of proof in that setting. The problem now posed is whether burden of proof operates in deliberation dialogue in the same way that it operates in persuasion dialogue, or whether there are essential differences in this regard between the two types of dialogue.
This chapter analyzes four examples of deliberation dialogue where burden of proof poses a problem. Based on analysis of the argumentation in these examples, a working hypothesis is put forward. It is that burden of proof only becomes relevant when deliberation dialogue shifts, at the beginning of the argumentation stage, to a persuasion dialogue. The hypothesis is that the shift can be classified as embedding one type of dialogue into another, meaning that the goal of the first type of dialogue continues to be supported once the transition to the second type of dialogue has been made (Walton and Krabbe, 1995, 102). In other instances, it is well known that a shift can be illicit, where the advent of the second dialogue interferes with the fulfillment of the goal of the first one. It has also been shown that such shifts can be associated with fallacies, as well as other logical and communicative problems (Walton, 2007, chapter 6).
This book analyzes the uses of emotive language and redefinitions from pragmatic, dialectical, epistemic and rhetorical perspectives, investigating the relationship between emotions, persuasion and meaning, and focusing on the implicit dimension of the use of a word and its dialectical effects. It offers a method for evaluating the persuasive and manipulative uses of emotive language in ordinary and political discourse. Through the analysis of political speeches (including President Obama's Nobel Peace Prize address) and legal arguments, the book offers a systematic study of emotive language in argumentation, rhetoric, communication, political science and public speaking.
Some of our earliest experiences of the conclusive force of an argument come from school mathematics: faced with a mathematical proof, we cannot deny the conclusion once the premises have been accepted. Behind such arguments lies a more general pattern of 'demonstrative arguments' that is studied in the science of logic. Logical reasoning is applied at all levels, from everyday life to advanced sciences, and a remarkable level of complexity is achieved in everyday logical reasoning, even if the principles behind it remain intuitive. Jan von Plato provides an accessible but rigorous introduction to an important aspect of contemporary logic: its deductive machinery. He shows that when the forms of logical reasoning are analysed, it turns out that a limited set of first principles can represent any logical argument. His book will be valuable for students of logic, mathematics and computer science.
This book on modal logic is especially designed for philosophy students. It provides an accessible yet technically sound treatment of modal logic and its philosophical applications. Every effort is made to simplify the presentation by using diagrams instead of more complex mathematical apparatus. These and other innovations provide philosophers with easy access to a rich variety of topics in modal logic, including a full coverage of quantified modal logic, non-rigid designators, definite descriptions, and the de-re de-dicto distinction. Discussion of philosophical issues concerning the development of modal logic is woven into the text. The book uses natural deduction systems, which are widely regarded as the easiest to teach and use. It also includes a diagram technique that extends the method of truth trees to modal logic. This provides a foundation for a novel method for showing completeness that is easy to extend to quantifiers. This second edition contains a new chapter on logics of conditionals, an updated and expanded bibliography, and is updated throughout.
Argumentation, which can be abstractly defined as the interaction of different arguments for and against some conclusion, is an important skill to learn for everyday life, law, science, politics and business. The best way to learn it is to try it out on real instances of arguments found in everyday conversational exchanges and legal argumentation. The introductory chapter of this book gives a clear general idea of what the methods of argumentation are and how they work as tools that can be used to analyze arguments. Each subsequent chapter then applies these methods to a leading problem of argumentation. Today the field of computing has embraced argumentation as a paradigm for research in artificial intelligence and multi-agent systems. Another purpose of this book is to present and refine tools and techniques from computing as components of the methods that can be handily used by scholars in other fields.
What do the rules of logic say about the meanings of the symbols they govern? In this book, James W. Garson examines the inferential behaviour of logical connectives (such as 'and', 'or', 'not' and 'if … then'), whose behaviour is defined by strict rules, and proves definitive results concerning exactly what those rules express about connective truth conditions. He explores the ways in which, depending on circumstances, a system of rules may provide no interpretation of a connective at all, or the interpretation we ordinarily expect for it, or an unfamiliar or novel interpretation. He also shows how the novel interpretations thus generated may be used to help analyse philosophical problems such as vagueness and the open future. His book will be valuable for graduates and specialists in logic, philosophy of logic, and philosophy of language.
Elementary Logic explains what logic is, how it is done, and why it can be exciting. The book covers the central part of logic that all students have to learn: propositional logic. It aims to provide a crystal-clear introduction to what is often regarded as the most technically difficult area in philosophy. The book opens with an explanation of what logic is and how it is constructed. Subsequent chapters take the reader step-by-step through all aspects of elementary logic. Throughout, ideas are explained simply and directly, with the chapters packed with overviews, illustrative examples, and summaries. Each chapter builds on previous explanation and example, with the final chapters presenting more advanced methods. After a discussion of meta-logic and logical systems, the book closes with an exploration of how paradoxes can exist in the world of logic. Elementary Logic's clarity and engagement make it ideal for any reader studying logic for the first time.
A full history of axiomatic theories of truth would have to go back very far. Many topics and ideas found in what follows have been foreshadowed. For instance, even theories structurally very similar to axiomatic compositional theories of truth can be found in Ockham's Summa Logicae, even though Ockham, like many other philosophers, paid lip service to the correspondence theory of truth.
Relating historical to more recent accounts of truth is often difficult as it is seldom clear whether certain sentences of a particular account should be understood as definitions, descriptions, consequences of a theory, or as axioms.
I think it is safe to claim that the modern discussion of formal axiomatic theories of truth began with Tarski's The Concept of Truth in Formalized Languages (the reader might want to consult Künne (2003) on the development leading up to Tarski). Tarski proved some fundamental results about axiomatic theories, although he did not himself adopt an axiomatic approach. Famously, Tarski proposed a definition of truth for certain languages in another, more comprehensive language, called the metalanguage. There were, and still are, good motives for aiming at a definition rather than a mere axiomatization of truth: if one is wondering whether truth should be considered a legitimate notion at all, a definition might be useful in dispelling doubts about its legitimacy. Tarski wrote his paper when most members of the Vienna Circle and the Warsaw School suspected truth to be a concept that should be avoided in good philosophy (see Woleński and Simons 1989 for a historical account).
Before delving into the formal details and logical analysis of axiomatic truth theories, I would have preferred to discuss further philosophical issues and the motivations for the technical development. But without being able to refer to the logical machinery, I find it hard to do so. Hence I will now tackle the formal part of my project and postpone the treatment of the philosophical issues until the last part.
Peano arithmetic
In discussing axiomatic systems, I will occasionally distinguish between formal systems and theories.
A formal system is a collection of axioms and rules for generating theorems. Almost all the systems I am going to discuss are formulated in classical logic. In most cases it does not matter exactly which logical calculus is used. In some cases, however, it will be necessary to specify the exact logical rules, and in these cases I will use a sequent calculus, as described in many standard textbooks (Troelstra and Schwichtenberg 2000, for instance).
A theory is a set of formulae closed under first-order logical consequence. Thus a theory may be generated by many different formal systems. However, in many cases, when the difference does not matter, I will not clearly distinguish between theories and the systems that generate them.
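The distinction can be put in a single formula: the theory generated by a formal system S is its closure under first-order consequence,

```latex
\mathrm{Th}(S) \;=\; \{\varphi \;:\; S \vdash \varphi\}
```

so two formal systems with different axioms or rules may nevertheless have the same closure, that is, generate the same theory.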
Tarski's resolution of the liar paradox by distinguishing an object language from a metalanguage underlies much of modern logic. All the theories in the previous part rely on this approach, even though in those theories the two languages are very similar.
In Tarski's terminology a language is more like what one would call a deductive system or a theory in modern terminology. The axioms and rules of the object language do not matter for Tarski's purpose; what are crucial are the axioms and rules of the metalanguage.
For instance, in typed theories like TB or CT the object language is the language L of arithmetic with PA as the associated system and the metalanguage is the same language expanded merely by an additional predicate symbol T for truth. In each case the object theory is a subtheory of the metatheory and the metatheory is formulated in a language – here LT – properly extending the object language. In the hierarchical theories the theory RTγ is a metalanguage for RT<γ.
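As a reminder of the shape of these typed theories (a standard presentation, with the arithmetization of syntax suppressed), TB extends PA by the Tarski biconditionals for sentences of L, while CT extends PA by compositional clauses, for instance:

```latex
% TB: one axiom for each sentence \varphi of the arithmetical language L
T\ulcorner\varphi\urcorner \;\leftrightarrow\; \varphi

% CT: compositional axioms; the quantifiers range over (codes of)
% sentences of L, e.g. for negation and conjunction:
\forall\varphi\,\bigl(T(\neg\varphi)\leftrightarrow\neg T\varphi\bigr),
\qquad
\forall\varphi\,\forall\psi\,\bigl(T(\varphi\wedge\psi)\leftrightarrow T\varphi\wedge T\psi\bigr)
```

In both cases the truth predicate T applies only to sentences of the object language L, which is what makes the theories typed.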
Despite the success of Tarski's resolution of the paradoxes in this way, many philosophers have been dissatisfied with the Tarskian approach and have developed alternative proposals that try to minimize or close the gap between object language and metalanguage.
The Kripke–Feferman theory is based on Strong Kleene logic as an evaluation schema. Kripke (1975) formulated his semantic construction in such a way that other evaluation schemata can be used as well. In the definition of Λ on p. 194, the relation ⊨SK of being valid in a model under Strong Kleene logic can be replaced with other notions of validity in a model. The relation replacing ⊨SK must satisfy a certain condition; otherwise the corresponding operation may lack fixed points. Here I will not go into general results on admissible evaluation schemata; rather, I shall focus on axiomatic theories akin to KF but based on two alternative evaluation schemata.
First, I consider the standard Weak Kleene logic with only truth-value gaps and no gluts. In Strong Kleene logic a disjunction of two sentences is true if at least one disjunct is true, even when the other disjunct lacks a truth value. In Weak Kleene logic a sentence is evaluated as neither true nor false if one of its components lacks a truth value. The truth tables for Weak Kleene logic are thus easily described: whenever there is a truth-value gap among the entries of a line the value of the complex sentence will also be a gap. All other lines coincide with the lines of the truth tables of classical logic. One can easily develop Kripke's semantic construction for Weak Kleene logic.
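The contrast for disjunction can be made concrete with a small sketch (my own illustration, not part of the text), using Python's None to stand in for a truth-value gap:

```python
# Three-valued disjunction under the Strong and Weak Kleene schemes.
# Truth values: True, False, and None (a truth-value gap).

def strong_or(a, b):
    """Strong Kleene: a disjunction is true as soon as one disjunct
    is true, even if the other disjunct lacks a truth value."""
    if a is True or b is True:
        return True
    if a is False and b is False:
        return False
    return None  # a gap among the disjuncts and no true disjunct


def weak_or(a, b):
    """Weak Kleene: any gap among the components infects the whole
    sentence; otherwise the classical truth table applies."""
    if a is None or b is None:
        return None
    return a or b
```

For example, strong_or(True, None) is True, whereas weak_or(True, None) is None, which is exactly the difference between the two schemata described above; on gap-free inputs the two functions agree with the classical truth table.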