The birth of conventionalism was inextricably linked to the emergence of the notion of implicit definition. As we saw in chapter 2, Poincaré justifies his construal of the axioms of geometry as conventions in terms of his proposal that they be viewed as disguised definitions rather than necessary truths. Although use of implicit definitions is not confined to conventionalists – Hilbert, for one, made extensive use of implicit definition in his Foundations of Geometry and later works without committing himself to conventionalism – the link between axioms and definitions, and thus between axioms and conventions, recurs in the literature. The logical positivists in particular were enthralled by the far-reaching implications of the construal of axioms as definitions: if we are as free to lay down axioms as we are to stipulate the meanings of terms in garden-variety definitions, conventionalism would appear to be vindicated. And if it works for geometry, why not seek to ground mathematical truth in general in definition, and thus in convention? Why not let definition serve as the basis for the entire sphere of a priori knowledge? In this chapter, I examine the notion of implicit definition and its putative connection to convention. In line with the approach taken in the other chapters, I argue for an account of implicit definition that does not rest on the idea that truth can be postulated ‘by convention’.
This book recounts the hitherto untold story of conventionalism. The profound impact conventionalism has had on seminal developments in both the science and the philosophy of the twentieth century is revealed through analysis of the writings of Poincaré, Duhem, Carnap, Wittgenstein, and Quine on the subject, and by examining the debate over conventionalism in the context of the theory of relativity and the foundations of mathematics. I trace the evolution of conventionalism from Poincaré's modest but precise initial conception through a number of extravagant extrapolations, all of which, I show, eventually collapsed under the weight of the problems they generated. My focus, however, is not history but analysis. The literature is replete with ambiguity as to what the meaning of ‘convention’ is, misunderstandings about the aims of conventionalism, and conflation of conventionalism with other philosophical positions, such as instrumentalism and relativism. The most serious confusion pertains to the notion of truth by convention typically associated with conventionalism. A central theme of this book is that conventionalism does not purport to base truth on convention, but rather, seeks to forestall the conflation of truth and convention.
Much of twentieth-century philosophy was characterized by engagement in determining the limits of meaning and countering the tendency to ascribe meaning to meaningless expressions. Conventionalism, correctly understood, is motivated by a desire to mitigate deceptive ascription of truth. To the conventionalist, the very idea of truth by convention is as incongruous as that of meaningful nonsense.
Quine concludes one of his better-known essays on logical truth in characteristically poetic style:
The lore of our fathers is a fabric of sentences …. It is a pale gray lore, black with fact and white with convention. But I have found no substantial reasons for concluding that there are any quite black threads in it, or any white ones.
([1960] 1966, p. 125)
It is tempting to press the metaphor further and inquire whether, at the end of the day, viewing the web of belief in this grayish light is itself a form of conventionalism. In a way, I do address this question here, though it is not my primary focus. Rather, this chapter examines the role of the web metaphor in Quine's various arguments against conventionalism, particularly his early critique of conventionalism in “Truth by Convention.” More generally, it attempts to relate the metaphor to the development of Quine's philosophy of language, from his (approving) lectures on Carnap's Logical Syntax of Language to his thesis of the indeterminacy of translation. More generally still, the metaphor merits examination in the context of other philosophical attempts to identify the conventional elements in truth (or alleged truth), from Poincaré onward. Finally, and most significantly, the chapter shows that Quine eventually deconstructed his own metaphor, thereby undermining the image usually thought of as epitomizing his philosophy of language.
The cluster of problems surrounding the notion of convention and its counterpart, the notion of truth, have always been at the very heart of philosophical inquiry. This book examines a relatively recent round in this ongoing discussion, beginning with Poincaré and ending with Quine and the later Wittgenstein. It is only during this period that the notion of convention comes to be associated with an ‘ism,’ a distinct philosophical position. I will focus on the philosophy of science and mathematics, setting aside other realms of philosophy, such as ethics and political theory, in which questions about the role of convention also figure prominently. Although a wide spectrum of positions fall under the rubric “conventionalism,” all explore the scope and limits of epistemic discretion. On the prevailing conception, conventionalism has been taken to extend the scope of discretion to the very stipulation of truth. The thrust of the present study is a critique of this reading.
The various chapters of this book are largely self-contained, but when brought to bear on one another, they provide not only a new understanding of conventionalism, but a reframing of central themes of twentieth-century philosophy.
My debts to teachers, colleagues, students, and others who have written on the aforementioned questions are, of course, numerous. I would like to mention, in particular, Yehuda Elkana, Hilary Putnam, and the late Frank Manuel, who introduced me to the history and philosophy of science; my late physics teacher Ruth Stern, who imparted to her students a feel for the beauty of physics; and my late friends Amos Funkenstein and Mara Beller, who passed away at the peak of their creative careers.
Whereas the philosophers discussed in previous chapters take a relatively unequivocal stand for or against certain conventionalist arguments, Wittgenstein's later philosophy is baffling: it seems both to affirm conventionalism explicitly and to attack it persistently. This tension is manifest, in particular, in Wittgenstein's critique of traditional notions of necessary truth, an issue as pivotal to Wittgenstein's own thought as it is to conventionalism. Hence in deciding whether Wittgenstein's later philosophy should be deemed a variant of conventionalism, the problem is not to determine whether his ideas on the nature of logical and mathematical truth fit a particular label or reflect the positions usually associated with conventionalism. The question, rather, is whether his ambivalent, if not conflicting, attitudes toward conventionalism can be reconciled.
Let me be more specific. In one of his lectures, Wittgenstein remarks: “One talks of mathematical discoveries. I shall try again and again to show that what is called a mathematical discovery had much better be called a mathematical invention” (1976, LFM, p. 22). This distinction between discovery and invention seems to imply a contrast between objective truths, over which we have no control, and those created via stipulation, which are up to us – that is, conventions. Further, the notion of grammatical rules, and that of rules constituting practices such as counting and measuring, which pervade Wittgenstein's later philosophy, also point to a conventionalist account of necessary truth.
(1) It is in principle impossible ever to give an example of an unknowable fact. (2) While universalizations do enable us to assert truths about nonsurveyable totalities, such totalities nevertheless serve to demarcate us as cognitively finite beings. (3) For general facts regarding an open-ended group will – when contingent and not law-inherent – open the door to facts that are beyond the cognitive grasp of finite knowledge. (4) Nevertheless, as Kant insightfully saw it, the realm of knowledge – of ascertainable fact – while indeed limited, is nevertheless unbounded.
Limits of Knowledge
The cognitive beings that will concern us here are language-dependent finite intelligences. These by their very nature are bound to be imperfect knowers. For the factual information at their disposal by way of propositional knowledge that something or other is the case will – unlike practical how-to knowledge – have to be verbally formulated. And language-encompassed textuality is – as we have seen – outdistanced by the facts themselves. Just what is one to make of the numerical disparity between facts and truths, between what is knowable in theory and what our finite intelligences can actually manage to know? Just what does this disproportion portend?
It means that our knowledge of fact is incomplete – and inevitably so! – because we finite intelligences lack the means for reality's comprehensive characterization. Reality in all its blooming buzzing complexity is too rich for faithful representation by the recursive and enumerable resources of our language.
(1) The issue of importance is obviously a crucial factor for the utilization of information. (2) This, however, is something subject to decidedly different degrees, ranging from the pedestrianly routine to the incontestably first-rate. (3) The phenomenon of quality retardation functions so that as our information grows, the rate of growth at increasingly higher quality is ever diminished. (4) Higher quality elites are not just increasingly exclusive but slower growing as well. (5) The quality of information is reflected in the structural constitution of texts and in the taxonomies that such structural divisions reflect. (6) In approximation, at least, importance can also be assessed in terms of citations that reflect the utilization of texts in the literature of their subject.
The Centrality of Importance
The previous discussion has conceived of knowledge in terms of top quality, first-rate information. But this is something of an oversimplification. For in cognitive matters, our questions and answers, problems and findings, come in various sizes – some virtually trivial, others portentous, and many somewhere in between. And the difference matters a great deal here. For it is the biggies that figure prominently in textbooks and histories, while the smallies get a footnote at best and blank omission for the most part. Prizes, recognition, and career advancement reward the big, while indifference befalls the small.
When you can measure what you are speaking about, and express it in numbers, you know something about it; but when you cannot measure it, when you cannot express it in numbers, your knowledge is of a meager and unsatisfactory kind: it may be the beginning of knowledge, but you have scarcely, in your thoughts, advanced to the stage of science.
– William Thomson, Lord Kelvin (1824–1907), British physicist
This book develops the theory of knowledge from a quantitative perspective that serves to throw light on the scope and limits of human knowledge. It seeks to provide theorists of knowledge in philosophy, information theory, cognitive studies, communication theory, and cognate disciplines with the conceptual tools required for a quantitative treatment of the products of inquiry.
Kelvin's dictum takes things too far. I have never thought for a moment that if you cannot say it with numbers, then it just is not worth saying. But all the same, I do firmly believe that where you cannot put numbers to work, you will understand the matter better and more clearly for being able to explain why this is so. So it seems well worthwhile to see what a quantitative approach to knowledge can do for us.
The discipline represented by the domain of inquiry to which the present book is addressed does not as yet exist. Epistemetrics is not yet a scholarly specialty.
(1) Cognitive progress is subject to Kant's Principle of Question Propagation, to the effect that new knowledge always brings new questions in its wake. (2) However, the increase of mere information does not yield a corresponding increase in knowledge: knowledge is not proportional to the volume of information, but only to its logarithm. This key epistemological principle traces back at least to Edward Gibbon. (3) Gibbon's Law of Logarithmic Returns, as a principle of the realm of conception, parallels the Weber-Fechner Law in the epistemics of perception. (4) The Law of Logarithmic Returns accounts for Max Planck's Thesis that scientific progress becomes ever more difficult, so that diminishing returns on effort are an unavoidable facet of inquiry.
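Theses (2)–(4) admit a brief illustrative formalization. The symbols K (knowledge), I (information volume), and the constant c are notational conveniences introduced here for the sketch; the book states the theses in prose.

```latex
% Gibbon's Law of Logarithmic Returns (illustrative formalization):
% knowledge K grows only as the logarithm of the information volume I.
\[ K(I) = c \log I, \qquad c > 0 \]
% Planck's Thesis then appears as diminishing returns: each further
% unit of information contributes ever less additional knowledge.
\[ \frac{dK}{dI} = \frac{c}{I} \longrightarrow 0 \quad \text{as } I \to \infty \]
```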
Kant's Principle of Question Propagation
New knowledge that emerges from the progress of inquiry can bear very differently on the matter of questions. Specifically, we can discover
(1) New (that is, different) answers to old questions.
(2) New questions.
(3) The inappropriateness or illegitimacy of our old questions.
With (1) we learn that the wrong answer has been given to an old question: we uncover an error of commission in our previous question-answering endeavors. With (2) we discover that there are certain questions that have not heretofore been posed at all: we uncover an error of omission in our former question-asking endeavors. Finally, with (3) we find that we have asked the wrong question altogether: we uncover an error of commission in our former question-asking endeavors, which are now seen to rest on incorrect presuppositions (and are thus generally bound up with type [1] discoveries).
In concluding, a brief survey of the principal theses may be in order. They stand as follows:
Duhem's Law of Security/Detail Complementarity. The security and detail of our knowledge stand in a relation of inverse proportionality. (Chapter 1)
Kant's Principle of Cognitive Systematization. Knowledge, in the qualitative and honorific sense of the term, is a matter of the extent to which information is coherently systematized. (Chapter 2)
Spencer's Law of Cognitive Development. Cognitive progress is accompanied by complexification and can be assessed in terms of the taxonomic complexity of the information manifold at hand. However, this complexity is not proportional to the amount of information, but to its logarithm. And this yields –
Kant's Principle of Question Propagation. The progress of knowledge-development in the course of resolving our questions always brings new questions to light. (Chapter 4)
Gibbon's Law of Logarithmic Returns. The quantity of knowledge inherent in a body of information is proportional not to the size of this body, but merely to the logarithm thereof. (Chapter 4)
Adams's Thesis of Exponential Growth. Throughout recent history the body of scientific information has been growing exponentially. But, in view of Gibbon's Law, this means that cognitive progress in terms of actual knowledge has been growing at a rate that is merely linear and thereby stable. (Chapter 5)
Quality/Quantity Alignment. The lower levels of informative quality that define the lesser degrees of “knowledge” are in volumetric alignment with the λ-power (0 < λ ≤ 1) of the total amount of information at hand. (Chapter 6)
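Two of these theses admit a short worked illustration. The symbols I_0, r, c, and the sample value λ = 1/2 below are hypothetical parameters introduced here for the arithmetic; they are not the book's notation.

```latex
% Adams's Thesis: information grows exponentially over time t.
\[ I(t) = I_0 e^{rt} \]
% Substituting into Gibbon's Law, K = c \log I:
\[ K(t) = c \log\bigl(I_0 e^{rt}\bigr) = c \log I_0 + c\,r\,t \]
% Knowledge thus grows only linearly in t: exponential growth of
% information yields a merely constant rate of cognitive progress.
%
% Quality/Quantity Alignment: information at a given quality grade
% is the \lambda-power of the total. For example, with I = 10^6 items
% and \lambda = 1/2:
\[ Q_{\lambda}(I) = I^{\lambda}, \qquad Q_{1/2}\bigl(10^{6}\bigr) = 10^{3} \]
```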
(1) Knowledge is not just a matter of information as such, but of information that is coherently and cohesively systematized. (2) This view of knowledge as properly systematized information – in effect, information as structured in an idealized expository treatise – goes back to Immanuel Kant. (3) Cognitive systematization is hierarchical in structure because a systemic organization of the exposition of the information at issue into successively subordinate units becomes paramount here. Viewed in this light, structure will of course reflect significance, with larger units dominating over subordinate ones.
Distinguishing Knowledge and Information
The interplay between knowledge and information is pivotal for the present deliberations. Actual information (in contrast with misinformation) requires little more than truth. But knowledge is something far more demanding: it calls for information that is organized, purified, systematized. It makes no sense to say “It is known that p, but it may possibly not be so” (or “… there are considerations that lead to doubt about it”). From the cognitive point of view, knowledge is money in the bank: it must fit together coherently. The very concept of knowledge is such that what is known must be systemically consolidated. The matter of quality will also play a crucial role. For items of information are not created equal. Some are minute and trivial, others large and portentous. So there is little point to merely doing a nose count here. Only information that is scrutinized, verified, coordinated, and systematized can plausibly qualify to be regarded as knowledge.