This book is an attempt to provide an introduction to some parts, more or less important, of a subfield of elementary and analytic number theory, namely the field of arithmetical functions. There have been countless contributions to this field, but a general theory of arithmetical functions does not exist, as yet. Interesting questions which may be asked about arithmetical functions or “sequences” are, for example,
(1) the size of such functions,
(2) the behaviour in the mean,
(3) the local behaviour,
(4) algebraic properties of spaces of arithmetical functions,
(5) the approximability of arithmetical functions by “simpler” ones.
In this book, we are mainly concerned with questions (2), (4) and (5). In particular, we aim to present elementary and analytic results on mean-values of arithmetical functions, and to provide some insight into the connections between arithmetical functions, elements of functional analysis, and the theory of almost-periodic functions.
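In the usual notation (recalled here only for orientation), the mean-value of an arithmetical function f is

\[
M(f) \;=\; \lim_{x\to\infty}\ \frac{1}{x}\sum_{n\le x} f(n),
\]

provided this limit exists; the mean-value results referred to above give conditions on f under which the limit exists and can be evaluated.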
Of course, standard methods of number theory, such as the use of convolution arguments, Tauberian theorems, or detailed, skilful estimates of sums over arithmetical functions, are used and given in our book. But we also concentrate on some methods which are not so common in analytic number theory and which, perhaps for precisely this reason, have not been refined to the same extent as those mentioned above. With respect to applications and connections with functional analysis, our book may be considered, in part, as providing special, detailed examples of well-developed theories.
ABSTRACT. This chapter deals with multiplicative arithmetical functions f, and with relations between the values of these functions at prime powers and the almost-periodic behaviour of f. More exactly, we prove that the convergence of four series, summing the values of f at primes, respectively at prime powers [with appropriate weights], implies that f is in ℬq, and (if in addition the mean-value M(f) is assumed to be non-zero) vice versa. For this part of the proof we use an approach due to H. Delange and H. Daboussi [1976] in the special case where q = 2; the general case is reduced to this special case using the properties of spaces of almost-periodic functions obtained in Chapter VI. Finally, Daboussi's characterization of multiplicative functions in Aq with non-empty spectrum is deduced.
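For orientation only: the four series alluded to are, in one commonly stated form of the Elliott–Daboussi conditions (the cut-off 3/2 and the weights below are merely an illustrative choice and need not coincide with the chapter's exact statement),

\[
\sum_{\substack{p\\ |f(p)|\le 3/2}} \frac{f(p)-1}{p},\qquad
\sum_{\substack{p\\ |f(p)|\le 3/2}} \frac{|f(p)-1|^{2}}{p},\qquad
\sum_{\substack{p\\ |f(p)|> 3/2}} \frac{|f(p)|^{q}}{p},\qquad
\sum_{p}\;\sum_{k\ge 2} \frac{|f(p^{k})|^{q}}{p^{k}},
\]

the first two series controlling the values of f at “typical” primes, the third the exceptional primes, and the fourth the higher prime powers.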
INTRODUCTION
As shown in the preceding chapter, q-almost-even and q-almost-periodic functions have nice and interesting properties; for example, there are mean-value results for these functions (see VI.7), results concerning the existence of limit distributions, and some results on the global behaviour of power series with almost-even coefficients. These results seem to provide sufficient motivation for the search for a, hopefully, rather simple characterization of functions belonging to the spaces Aq ⊃ Dq ⊃ ℬq of almost-periodic functions defined in VI.1. Of course, in number theory we look for functions having some distinguishing arithmetical properties, and the most common of these properties are additivity and multiplicativity.
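As a rough reminder (the authoritative definitions are those of VI.1; the following is only the standard sketch), these spaces arise as closures with respect to the semi-norm

\[
\|f\|_q \;=\; \Bigl(\limsup_{x\to\infty}\ \frac{1}{x}\sum_{n\le x}\,|f(n)|^{q}\Bigr)^{1/q},
\]

ℬq, Dq and Aq being, respectively, the closures of the even functions, of the periodic functions, and of the linear combinations of the exponentials n ↦ exp(2πiαn) with real α.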
Probability was connected to quantum theory right from the start, in 1900, through the derivation of Planck's radiation law. But not much attention has been paid to the concept of probability in quantized radiation, or in the ‘quantum jumps’ from one energy level to another in an atom, and related problems. A whole chapter or book, instead of a section, could be written on the background of quantum theory in statistical mechanics and spectral analysis with this aspect in mind. Probabilistic considerations entered very many of the most important papers dealing with radiation between 1900 and 1925. It also became clear in time that Planck's law could not be derived from classical physics; Einstein admitted this around 1908. Later he said, in a work of 1917, that it is a weakness of the theory of photon emission that ‘it leaves to “chance” the time and direction of the elementary processes,’ thereby, according to Pais (1982, p. 412), making explicit ‘that something was amiss with classical causality.’ It remains somewhat open how the origins and shifting interpretations of probability in the old quantum theory affected the acceptance of the probabilistic interpretation of the new quantum mechanics of 1925–1926, and of the indeterminism that found its confirmation in Heisenberg's uncertainty relation in 1927.
CONCEPTS OF PROBABILITY IN CLASSICAL STATISTICAL PHYSICS
The definition of probability
Around the middle of the last century, the mechanical theory of heat had gained ground over caloric and other competing theories. The idea that heat consists of the motion of molecules was not new; it can be found already in Daniel Bernoulli in 1738. But in the 1840s the principle of conservation of energy was finally established. It became the foundation of the science of thermodynamics and was referred to as its first law. The second law of thermodynamics is the famous entropy principle, a name given by Rudolph Clausius. In its strict form, it asserts the necessary equalization of all temperature differences: the gloomy finale of the world stage is a heat death of the universe. Such a thermodynamical prediction loses its edge in statistical physics, which says that temperature equalization is not necessary but only extremely probable. In order to illustrate the statistical character of the second law, Maxwell imagined in the late 1860s a small demon, monitoring molecular motions and thereby being able to work against temperature equalization. These thermodynamic and probabilistic issues within physics also had a more general cultural significance. The prospect of heat death resonated with a pessimistic sentiment that saw degeneration as the essential direction of things. Its counterpoint was the emergence of Darwin's evolutionary theory, with which some creators of the kinetic theory, Boltzmann especially, felt close sympathies.
The concept of probability plays a vital role in many sciences, in the theory and practice of scientific inference, and in philosophy and the modern world view more generally. Recent years have brought a number of articles and books dealing with the emergence of probabilistic thinking in the last century and the earlier part of this century. Several studies of the history of statistics appeared in the 1980s. One also finds accounts of the development of statistical physics, of quantum theory and of fundamental questions such as determinism and indeterminism in modern physics. But nothing comparable exists on modern probability, the mathematical discipline that in some way or other is at the basis of any related studies. With the main focus on the shift from classical to modern probability in mathematics, I have attempted to combine in this book a historical account of scientific development with foundational and philosophical discussion.
Classical probability formed a chapter in applied mathematics. Its bearing on the larger questions of science and philosophy was limited. The shift to modern probability started around the turn of the century. By the late 1930s probability theory had become an autonomous part of mathematics. The developments of these early decades have been overshadowed by the nearly universal acceptance of modern axiomatic and measure-theoretic probability as embodied in the classic work of Andrei Kolmogorov of 1933, Grundbegriffe der Wahrscheinlichkeitsrechnung.
Nicole Oresme lived from approximately 1325 to 1382. He was a philosopher, mathematician and churchman. We shall here be interested in a very particular aspect of his work: the incommensurability of celestial motions. In many ways, though, it was at the center of his achievements. As background for the discussion of Oresme's mathematical results, let us review the elements of Ptolemaic astronomical models. Ptolemy, the greatest of the applied scientists of antiquity, assumed in his astronomy that the Earth is immobile, with the planets, the Sun and the Moon orbiting around it in a motion compounded of several (up to three) uniform circular motions. There is a great circle, the deferent, to which is attached the center of another, smaller circle, the epicycle. On this smaller circle, finally, is located the mobile object. Spatial coordinates are determined against the ‘sphere of the fixed stars,’ which of course rotates once a day around the Earth. Different combinations of sense and speed of rotation of the circles are able to account for phenomena such as the retrograde motion of a planet.
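Schematically, and only by way of illustration (the parametrization below is a modern shorthand, not anything found in Ptolemy or Oresme): a deferent of radius R traversed with angular speed ω, carrying an epicycle of radius r traversed with angular speed Ω, places the planet at the point

\[
z(t) \;=\; R\,e^{i\omega t} \;+\; r\,e^{i\Omega t}
\]

of the plane of motion. Suitable choices of the signs and relative sizes of ω and Ω produce the retrograde loops mentioned above, and the configuration of the two circles returns exactly to an earlier state only if the two angular speeds are commensurable, that is, if ω/Ω is a rational number. Oresme's question about the incommensurability of the celestial motions concerns precisely such ratios.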
THE QUESTION OF THE PERIODICITY OF THE UNIVERSE
Starting with the Greeks, who are said to have invented the geometrical representation of the motions of celestial bodies, there has been a debate about the character of such geometric models. The crucial issue was whether the models pertained directly to reality or whether they were to be taken merely as instruments for prediction.
SUBJECTIVE OR OBJECTIVE PROBABILITY: A PHILOSOPHICAL DEBATE
In the mechanical world view of last century's physics, the future course of events was thought to be determined by the present according to the mechanical principles governing all change. If there was any ignorance, it was located entirely in the ignorant person's mind; probability, it follows, stands only as a kind of index of the degree of ignorance. Laplace is, more than anyone else, responsible for this classical concept of probability. It can be found in his Essai philosophique sur les probabilités, written as a popular preface to the second (1814) edition of his extensive Théorie analytique des probabilités. There, in the classic passage on Laplacian determinism, we find him imagining ‘an intelligence which could comprehend all the forces by which nature is animated…for it, nothing would be uncertain and the future, as the past, would be present to its eyes’ (p. 4). The exactness of planetary motions was of course the practical reason for such confidence in determinism. But Laplace made a giant extrapolation from astronomy to the smallest parts of nature: ‘The curve described by a simple molecule of air or vapor is regulated in a manner just as certain as the planetary orbits; the only difference between them is that which comes from our ignorance’ (p. 6). In a deterministic universe there are no true probabilities, for complete knowledge would make every probability equal to 0 or 1.