CONCEPTS OF PROBABILITY IN CLASSICAL STATISTICAL PHYSICS
The definition of probability
Around the middle of the last century, the mechanical theory of heat had won ground over caloric and other competing theories. The idea that heat consists of the motion of molecules was not new; it can be found already in Daniel Bernoulli in 1738. But in the 1840s the principle of conservation of energy was finally established. It became the foundation of the science of thermodynamics and was referred to as its first law. The second law of thermodynamics is the famous entropy principle, a name given by Rudolf Clausius. In its strict form, it professes the necessary equalization of all temperature differences: the gloomy finale of the world stage is a heat death of the universe. Such a thermodynamical prediction loses its edge in statistical physics, which says that temperature equalization is not necessary but only extremely probable. In order to illustrate the statistical character of the second law, Maxwell imagined in the late 1860s a small demon, monitoring molecular motions and thereby being able to work against temperature equalization. These thermodynamic and probabilistic issues within physics also had a more general cultural significance. The prospect of heat death resonated with a pessimistic sentiment that saw degeneration as the essential direction of things. Its counterpoint was the emergence of Darwin's evolutionary theory, with which some creators of the kinetic theory, Boltzmann especially, felt close sympathies.
The concept of probability plays a vital role in many sciences, in the theory and practice of scientific inference, and in philosophy and the modern world view more generally. Recent years have brought a number of articles and books dealing with the emergence of probabilistic thinking in the last century and the earlier part of this century. Several studies of the history of statistics appeared in the 1980s. One also finds accounts of the development of statistical physics, of quantum theory and of fundamental questions such as determinism and indeterminism in modern physics. But nothing comparable exists on modern probability, the mathematical discipline that in some way or other is at the basis of any related studies. With the main focus on the shift from classical to modern probability in mathematics, I have attempted to combine in this book a historical account of scientific development with foundational and philosophical discussion.
Classical probability formed a chapter in applied mathematics. Its bearing on the larger questions of science and philosophy was limited. The shift to modern probability started around the turn of the century. By the late 1930s probability theory had become an autonomous part of mathematics. The developments of these early decades have been overshadowed by the nearly universal acceptance of modern axiomatic and measure theoretic probability as embodied in the classic work of Andrei Kolmogorov of 1933, Grundbegriffe der Wahrscheinlichkeitsrechnung.
Nicole Oresme lived from approximately 1325 to 1382. He was a philosopher, mathematician and churchman. We shall here be interested in a very particular aspect of his work: the incommensurability of celestial motions. In many ways, though, it was at the center of his achievements. As background for the discussion of Oresme's mathematical results, let us review the elements of Ptolemaic astronomical models. Ptolemy, the greatest of the applied scientists of antiquity, in his astronomy assumed the Earth to be immobile, with the planets, the Sun and Moon orbiting around it in a motion consisting of several (up to three) uniform circular motions. There is a great circle, the deferent, on which is attached the center of another circle, the epicycle. On this circle, finally, is located the mobile object. Spatial coordinates are determined against the ‘sphere of the fixed stars,’ which of course rotates once a day around the Earth. Different combinations of sense and speed of rotation of the circles are able to account for phenomena such as the retrograde motion of a planet.
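The deferent-and-epicycle construction described above can be sketched numerically. The following is a minimal illustration, not a reconstruction of any particular Ptolemaic model: radii, angular speeds and the function name are chosen for exposition only, with complex numbers encoding points in the orbital plane and the Earth at the origin.

```python
import cmath

def ptolemaic_position(t, r_deferent, omega_deferent, r_epicycle, omega_epicycle):
    """Planet position at time t in a deferent-epicycle model.

    The epicycle's center moves uniformly on the deferent (the great
    circle); the planet moves uniformly on the epicycle attached to
    that center. Each motion is a uniform circular rotation, so the
    position is a sum of two rotating complex vectors.
    """
    center = r_deferent * cmath.exp(1j * omega_deferent * t)
    return center + r_epicycle * cmath.exp(1j * omega_epicycle * t)

# At t = 0 both rotations start on the positive real axis, so the
# planet sits at distance r_deferent + r_epicycle from the Earth.
p0 = ptolemaic_position(0.0, 10.0, 1.0, 2.0, 5.0)
```

Choosing the two angular speeds with opposite sense (or very different magnitudes) makes the apparent angular motion of the planet reverse for part of the cycle, which is how such combinations account for retrograde motion.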
THE QUESTION OF THE PERIODICITY OF THE UNIVERSE
Starting with the Greeks, who are said to have invented the geometrical representation of the motions of celestial bodies, there has been a debate about the character of such geometric models. The crucial issue was whether the models pertained directly to reality, or whether they were to be taken merely as instruments for prediction.
SUBJECTIVE OR OBJECTIVE PROBABILITY: A PHILOSOPHICAL DEBATE
In the mechanical world view of last century's physics, the future course of events was thought to be determined by the present according to the mechanical principles governing all change. If there was any ignorance, it was located entirely in the ignorant person's mind. It follows that probability stands only as a kind of index of the degree of ignorance. Laplace is, more than anyone else, responsible for this classical concept of probability. It can be found in his Essai philosophique sur les probabilités, written as a popular preface to the second (1814) edition of his extensive Théorie analytique des probabilités. There, in the classic passage on Laplacian determinism, we find him imagining ‘an intelligence which could comprehend all the forces by which nature is animated…for it, nothing would be uncertain and the future, as the past, would be present to its eyes’ (p. 4). The exactness of planetary motions was of course the practical reason for such confidence in determinism. But Laplace made a giant extrapolation from astronomy to the smallest parts of nature: ‘The curve described by a simple molecule of air or vapor is regulated in a manner just as certain as the planetary orbits; the only difference between them is that which comes from our ignorance’ (p. 6). In a deterministic universe there are no true probabilities, for complete knowledge would make all probabilities 0 or 1.
Grundbegriffe der Wahrscheinlichkeitsrechnung by Andrei Kolmogorov is the book which has become the symbol of modern probability theory, its year of appearance 1933 being seen as a turning point that made earlier studies redundant. In mathematics, it is fairly common to take a field of study as given, as being defined by a set of commonly accepted postulates. Kolmogorov's presentation of probability in terms of measure theory serves well to illustrate this supposedly ahistorical character of mathematical research: With some knowledge of set theory, one can take the book, and learn and start doing probability theory. In such an approach, the concepts and the structure of probability theory appear fixed, whereas the experience of those who built up modern probability must have been very different. There were many kinds of approaches to the foundations of the subject. The idea of a measure theoretic foundation was almost as old as measure theory itself, and it had been repeatedly presented and used in the literature. Therefore the mere idea was not the reason for the acceptance of Kolmogorov's measure theoretic approach, but rather what he achieved by the use of measure theoretic probabilities. The change brought about by Kolmogorov was a big step, but not the kind of dramatic revelation some later comments suggest. The two essential mathematical novelties of Grundbegriffe were the theory of conditional probabilities when the condition has probability 0, and the general theory of random or stochastic processes.
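For orientation, the measure-theoretic notion the Grundbegriffe codified can be stated in modern notation; the formulation below is the standard textbook one, not a quotation from Kolmogorov's text.

```latex
% A probability space is a triple $(\Omega, \mathcal{F}, P)$, where
% $\mathcal{F}$ is a $\sigma$-algebra of subsets of $\Omega$ and
% $P \colon \mathcal{F} \to [0,1]$ satisfies $P(\Omega) = 1$ together
% with countable additivity:
P\Bigl(\,\bigcup_{i=1}^{\infty} A_i\Bigr) \;=\; \sum_{i=1}^{\infty} P(A_i)
\qquad \text{for pairwise disjoint } A_1, A_2, \ldots \in \mathcal{F}.
```

It is against this compact axiomatic frame that the two novelties mentioned above stand out: conditioning on events of probability 0 and the general theory of stochastic processes both require the full measure-theoretic apparatus, not just the axioms themselves.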
FIRST STEPS IN MEASURE THEORETIC PROBABILITY. AXIOMATIZATION
Gyldén's problem in continued fractions
Measure theory originated at the end of the last century from problems encountered mainly in mathematical analysis, the theory of trigonometric series, and integration theory. Measure was at first a generalization of geometric measure in Euclidean space. Current measure theory arose as an abstraction, making the concepts independent of real numbers and real spaces. This abstract kind of measure theory was first given in Fréchet (1915).
In Borel (1898) a generalization of length on the real line was proposed which is now called the Borel measure. The definition is repeated in Borel's first paper on probability (1905b): First measurable sets are defined as consisting of closed intervals, finite or denumerable unions of closed intervals, and complements relative to a given measurable set. The Borel measure of an interval [a, b] is b – a, that of a denumerable set of pairwise disjoint closed intervals the sum of the lengths of the intervals, and the measure of a complement E – F the measure of E minus that of F. If an arbitrary set E is contained in a measurable set A of measure α and contains a measurable set B of measure β, its measure m is less than or equal to α and greater than or equal to β.
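The clauses of Borel's definition can be made concrete for the finite case. The sketch below is illustrative only (the function names are ours, and Borel's definition of course extends to denumerable unions, which a finite computation cannot capture):

```python
def measure_of_disjoint_union(intervals):
    """Borel measure of a finite union of pairwise disjoint closed
    intervals [a, b]: the sum of the lengths b - a."""
    return sum(b - a for (a, b) in intervals)

def measure_of_complement(m_E, m_F):
    """Measure of a relative complement E - F, for measurable F
    contained in measurable E: the measure of E minus that of F."""
    return m_E - m_F

# The union [0, 1] ∪ [2, 2.5] has measure 1 + 0.5 = 1.5;
# removing a measurable subset of measure 0.25 from a set of
# measure 1.5 leaves measure 1.25.
m = measure_of_disjoint_union([(0.0, 1.0), (2.0, 2.5)])
```

The final clause of the paragraph, squeezing an arbitrary set E between measurable sets of measures β and α, is exactly the inner/outer estimate that later becomes the definition of measurability in the Lebesgue style.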
Sheer curiosity led me to read more and more of the old literature on probability. These explorations resulted in a number of papers during the 1980s, and at a certain point it occurred to me that I should try to write down the results of my efforts in a systematic way. No one had pursued the background of modern probability in any detail, so that I felt free to let my own particular interests act as my guide. As a result, the emphasis here is on foundational questions.
A historical–philosophical colleague may find the references to secondary literature few in number. This is due to the fact that literature on the development of modern probability is truly scarce, and also to my insistence on consulting all the primary sources in the first place. I also wanted the bibliography of primary sources to be useful yet of manageable length. There should still be enough indications to allow the uninitiated to begin reading the secondary literature.
I am indebted to several institutions for support during the period of writing the book. Most important, a fellowship of the Academy of Finland has secured the continuity of my researches. In 1982, I had the good fortune of being invited to join the project ‘Probabilistic Revolution’ at the Zentrum für interdisziplinäre Forschung at Universität Bielefeld.
The two main interpretive ideas about probability for the times under discussion are the frequentist and the subjectivist. Frequentist probability has had a remarkable role in the development of modern probability. It was the object of von Mises' theory, and it was more or less tacitly assumed as the interpretation of probability by the main proponents of mathematical probability in the 1920s. No attention was paid to the idea of subjective probabilities. The most one can find are Borel's philosophical essay (1924), which is a review of Keynes' Treatise on Probability of 1921, Borel's papers on game theory, and Lévy's discussion of subjective probability in his book of 1925. Borel's main contribution to probability (1909a), though, was from much earlier times.
The idea of subjective probabilities was further undermined by the developments in physics: Classical statistical physics already contained a commitment to statistical probability at least, and quantum mechanics brought a new kind of fundamental indeterminism into the description of nature's basic processes in 1926. An epistemic notion of probability must at that time have seemed like a thing from the past, from the shadows of the Laplacian doctrine of mechanical determinism.
In the 1920s philosophical thinking, or at least the part of it sensitive to scientific developments, was transformed through the rise of logical empiricism. Heisenberg's quantum mechanics of 1925 is one example of a scientific theory that shows the mutual effect of philosophical ideas and theory construction.
Richard von Mises was an applied mathematician. He first specialized in mechanics, hydrodynamics especially. By applied, he really meant it: A book of 1918, for example, dealt with the ‘elements of technical hydromechanics.’ Another related specialty was the theory of flight, much in vogue early on in the century. His work on probability starts properly around 1918, and from the same time are his first writings on foundational problems in science: on foundations of probability in 1919, and on classical mechanics in 1920. Von Mises' philosophical book Wahrscheinlichkeit, Statistik und Wahrheit of 1928 was the third volume in the Vienna Circle series ‘Schriften zur wissenschaftlichen Weltauffassung’, edited by Philipp Frank and Moritz Schlick. The year 1931 marked the publication of von Mises' big book on probability theory, Wahrscheinlichkeitsrechnung, whose exact title adds, ‘and its application in statistics and theoretical physics.’ The posthumous Mathematical Theory of Probability and Statistics is based on lectures from the early 1950s.
Von Mises was a declared positivist, identifying himself with the philosophy of the Berlin group, the Vienna Circle, and the Unity of Science Movement. His Kleines Lehrbuch des Positivismus (1939) appeared in an English version in 1951 as Positivism: A Study in Human Understanding. It attempts to give a broad presentation of the logical empiricist world view, from foundations of knowledge and the sciences to morals and society.
In this paper the Hausdorff dimension of systems of real linear forms which are simultaneously small for infinitely many integer vectors is determined. A system of real linear forms,
where a_i, x_{ij} ∈ ℝ, 1 ≤ i ≤ m, 1 ≤ j ≤ n, will be denoted more concisely as
where a ∈ ℝ^m, X ∈ ℝ^mn, and ℝ^mn is identified with M_{m×n}(ℝ), the set of real m × n matrices. The supremum norm of any vector v in k-dimensional Euclidean space ℝ^k will be denoted by |v|. The distance of a point a from a set B will be denoted by dist(a, B) = inf {|a − b| : b ∈ B}.
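The two pieces of notation just introduced are easily made concrete. A minimal numerical illustration (function names ours; the infimum over an arbitrary set B is here replaced by a minimum over a finite sample of points):

```python
def sup_norm(v):
    """Supremum norm |v| = max_i |v_i| of a vector in R^k."""
    return max(abs(x) for x in v)

def dist(a, B):
    """dist(a, B) = inf {|a - b| : b in B}, computed here for a
    finite collection B of sample points, using the sup norm."""
    return min(sup_norm([ai - bi for ai, bi in zip(a, b)]) for b in B)

# |(3, -4, 1)| = 4 in the sup norm; the point (0, 0) is at sup-norm
# distance 1 from {(1, 1), (2, 0)}.
n = sup_norm([3, -4, 1])
d = dist([0, 0], [[1, 1], [2, 0]])
```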
We consider the fluid motion induced when a circular cylinder performs small-amplitude oscillations about an axis parallel to a generator to which it is rigidly attached, as in Fig. 1(a). In common with other fluid flows dominated by oscillatory motion, a time-independent, or steady, streaming develops, and this is the focus of our attention. In particular we relate our results, qualitatively, to the observations that have been made in experiments.
The analytic paracommutators in the periodic case have been studied. Their boundedness, compactness, the Schatten–von Neumann properties and the cut-off phenomena have been established. These results have been applied to a class of operators on the Bergman spaces that have cut-off at every p ∈ (0, ∞).
Norms with moduli of smoothness of power type are constructed on spaces with the Radon-Nikodym property that admit pointwise Lipschitz bump functions with pointwise moduli of smoothness of power type. It is shown that no norms with pointwise moduli of rotundity of power type can exist on nonsuperreflexive spaces. A new smoothness characterization of spaces isomorphic to Hilbert spaces is given.
In the case of F-isotropic groups for a global field F, Moore [Mo] computed the metaplectic kernel using crucially his theorem of uniqueness of reciprocity laws. For F-anisotropic G, a variant of Moore's theorem is, therefore, needed to compute the metaplectic kernel. Such a variant was announced by G. Prasad [GP1] (in 1986) and here we give the details.
Given a commutative semigroup (S, +) with identity 0 and u × v matrices A and B with nonnegative integers as entries, we show that if C = A − B satisfies Rado's columns condition over ℤ, then any central set in S contains solutions to the system of equations. In particular, the system of equations is then partition regular. Restricting our attention to the multiplicative semigroup of positive integers (so that coefficients become exponents) we show that the columns condition over ℤ is also necessary for the existence of solutions in any central set (while the distinct notion of the columns condition over ℚ is necessary and sufficient for partition regularity over ℕ\{1}).