This chapter follows on from the previous chapter on quantum statistical mechanics, specialising to systems of identical particles. Using Gibbs's prescription on generic states from Chapter 3, the occupation number representation is introduced. The constraints that irreducible representations of the permutation group impose on statistics are discussed, and these group-theoretic considerations are used to justify the use of Gentile’s parastatistics. Fermions and bosons are introduced as special cases of Gentile’s statistics, corresponding to the trivial representation of the permutation group for bosons and the sign representation for fermions. Basic applications to fermions and bosons are given, including the Fermi–Dirac and Bose–Einstein statistics. A detailed exposé of why photons are said to have zero chemical potential is also presented.
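The Fermi–Dirac and Bose–Einstein distributions named above are easy to illustrate numerically. A minimal sketch (the function names, inverse temperature, and energies are our own choices, not the chapter's):

```python
import math

def fermi_dirac(eps, mu, beta):
    # Mean occupation of a single-particle level of energy eps (fermions)
    return 1.0 / (math.exp(beta * (eps - mu)) + 1.0)

def bose_einstein(eps, mu, beta):
    # Mean occupation for bosons; well defined only for eps > mu
    return 1.0 / (math.exp(beta * (eps - mu)) - 1.0)

beta, mu = 2.0, 0.0
n_f = fermi_dirac(0.5, mu, beta)   # bounded by 1 (Pauli exclusion)
n_b = bose_einstein(0.5, mu, beta) # unbounded as eps -> mu
```

At equal energy above the chemical potential, the bosonic occupation always exceeds the fermionic one, reflecting the absence of an exclusion principle.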
The foundations of modern probability theory are briefly presented and discussed for both discrete and continuous stochastic variables. Without attempting a rigorous mathematical construction – but citing several excellent handbooks on the matter – the axiomatic theory of Kolmogorov and the concepts of joint, conditional, and marginal probability are introduced, along with the operations of union and intersection of generic random events. Finally, Bayes’ formula is put forward with some examples. This will be the cornerstone of the statistical inference methods reported in Chapters 5 and 6.
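Bayes' formula lends itself to a short worked example. A toy computation (the test-accuracy and prevalence numbers are purely illustrative, not taken from the chapter):

```python
# Bayes' formula: P(A|B) = P(B|A) P(A) / P(B),
# with the marginal P(B) obtained by summing over the two hypotheses.
# Toy scenario: a test with 99% sensitivity and a 5% false-positive
# rate, for a condition with 1% prevalence.
p_a = 0.01                # prior P(A)
p_b_given_a = 0.99        # likelihood P(B|A)
p_b_given_not_a = 0.05    # false-positive rate P(B|not A)

p_b = p_b_given_a * p_a + p_b_given_not_a * (1.0 - p_a)  # marginal P(B)
p_a_given_b = p_b_given_a * p_a / p_b                    # posterior P(A|B)
```

Despite the accurate test, the posterior is only about 1/6: with a rare condition, most positives are false positives.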
Our last chapter is devoted to entropy. On this pretext we first present Shannon’s information theory, including the derivation of his entropy and the statements and proofs of the source coding theorem and of the noisy-channel coding theorem. Then we consider dynamical systems and the production of entropy in chaotic systems, termed Kolmogorov–Sinai entropy. For non-experts, or readers who require a memory jog, we give a short recap of statistical mechanics – just enough to tie up some knots left untied in Chapter 4, where we developed large deviations theory for independent variables. Here we generalize to correlated variables and make one application to statistical mechanics. In particular, we find that entropy is a large deviations function, apart from constants. We end with a lightning-fast introduction to configurational entropy in disordered complex systems – just to give a tiny glimpse of … what we do for a living!
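The Shannon entropy at the heart of the chapter takes one line to compute for a discrete distribution. A minimal sketch (the example distributions are ours):

```python
import math

def shannon_entropy(p):
    # H(p) = -sum_i p_i log2 p_i, in bits; zero-probability
    # outcomes contribute nothing to the sum
    return -sum(pi * math.log2(pi) for pi in p if pi > 0)

h_fair = shannon_entropy([0.5, 0.5])    # maximal for two outcomes: 1 bit
h_biased = shannon_entropy([0.9, 0.1])  # less uncertainty, lower entropy
```

A deterministic outcome carries zero entropy, and the uniform distribution maximises it, in line with the source coding theorem's interpretation of entropy as the optimal average code length.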
This chapter discusses the notions of state of matter and phase of matter. It looks at two categories of ‘anomalous behaviour’ in thermodynamics: pressure plateaus in the isotherms of real gases, and the appearance of a magnetic state in ferromagnets. The former lends itself to a thermodynamic analysis with the van der Waals equation of state. A full analysis is carried out, and the pressure plateau is interpreted as stemming from the coexistence of two different phases at different densities. Various laws, such as the latent heat law and Clapeyron’s law, are derived from thermodynamic theory as well. In the case of magnetism, there is no equation of state playing a role analogous to the van der Waals equation; statistical mechanics is required to understand the physics at play. This is done by looking at the paradigmatic Ising model. The mean-field approach to this model is presented, and the existence of a ferromagnetic phase, breaking the underlying symmetry of the system, is observed.
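The mean-field treatment of the Ising model reduces to the self-consistency condition m = tanh(βJzm), where z is the coordination number. A minimal numerical sketch of its solution by fixed-point iteration (parameter values and seed are our own):

```python
import math

def mean_field_magnetisation(beta_jz, iters=1000):
    # Solve m = tanh(beta*J*z*m) by fixed-point iteration, starting
    # from a small positive seed to select the up-magnetised branch
    m = 0.1
    for _ in range(iters):
        m = math.tanh(beta_jz * m)
    return m

m_high_t = mean_field_magnetisation(0.5)  # beta*J*z < 1: only m = 0
m_low_t = mean_field_magnetisation(2.0)   # beta*J*z > 1: spontaneous m > 0
```

The change of behaviour at βJz = 1 is the mean-field critical point: below it the paramagnetic solution m = 0 is the only one, above it a symmetry-breaking ferromagnetic solution appears.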
This chapter lays the foundations of probability theory, which plays a central role in statistical mechanics. The exposition starts with Kolmogorov’s axioms of probability theory and develops the vocabulary through example cases. Some time is spent on sigma-algebras and the role they play in probability theory, in particular in properly defining random variables on the reals. The popular notion that ‘the probability for a real variable to take on any single value is zero’ is critically analysed and contextualised: there are situations in statistical mechanics where some real-valued mechanical variables do acquire a non-zero probability of taking on a single value. Moments and cumulants are introduced, as well as the method of generating functions, preparing the ground for these efficacious tools of statistical mechanics. Finally, Jaynes’s least-biased distribution principle is introduced in order to obtain a priori probabilities given some constraints imposed on the system.
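The generating-function method mentioned above can be illustrated on a discrete variable: derivatives of the probability generating function at s = 1 yield (factorial) moments. A small sketch with a fair die (our example, evaluated by a numerical derivative):

```python
def pgf(p, s):
    # Probability generating function G(s) = sum_k p_k s^k
    # of a discrete random variable with P(X = k) = p_k
    return sum(pk * s**k for k, pk in enumerate(p))

p_die = [0.0] + [1.0 / 6.0] * 6  # fair six-sided die, values 1..6

# G(1) = 1 (normalisation); G'(1) = E[X], here estimated by a
# central finite difference
h = 1e-6
mean = (pgf(p_die, 1.0 + h) - pgf(p_die, 1.0 - h)) / (2 * h)
```

For the fair die the derivative recovers the familiar E[X] = 3.5; higher derivatives at s = 1 give the higher factorial moments, from which cumulants follow.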
Here we analyse another kind of memoryless discrete process: branching processes, otherwise termed “chain reactions” in more physical language. Before that, we carefully deepen and generalize our command of the very useful tool of generating functions. This is soon applied to the study of population dynamics, predicting whether a population will certainly go extinct – and how fast – or whether it can be self-sustaining.
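The central result of the theory is that the extinction probability q of a branching process is the smallest fixed point of the offspring probability generating function, q = G(q). A minimal sketch of its computation by iteration (the offspring distributions are our own toy examples):

```python
def extinction_probability(p_offspring, iters=200):
    # q is the smallest solution of q = G(q), where G is the offspring
    # probability generating function; iterating q_{n+1} = G(q_n)
    # from q_0 = 0 converges to it monotonically.
    def g(s):
        return sum(pk * s**k for k, pk in enumerate(p_offspring))
    q = 0.0
    for _ in range(iters):
        q = g(q)
    return q

# Subcritical (mean offspring 0.5 < 1): extinction is certain, q = 1
q_sub = extinction_probability([0.5, 0.5])
# Supercritical (mean offspring 1.25 > 1): q < 1, survival is possible
q_super = extinction_probability([0.25, 0.25, 0.5])
```

For the supercritical example G(s) = 1/4 + s/4 + s²/2, whose fixed points are s = 1/2 and s = 1; the iteration picks out the smaller root, q = 1/2.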
In this chapter we study the first example of a correlated memoryless phenomenon: the famous “drunkard’s walk”, formally termed the random walk. We begin with a very simple case, in a homogeneous and isotropic space on a discrete hypercubic lattice. Then we add traps here and there. Eventually we make a foray into the continuous regime, with the Fokker–Planck diffusion equation (which, as we shall see, is what physicists call a Schrödinger equation in imaginary time) and the Langevin stochastic differential equation.
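The hallmark of the simple random walk is diffusive scaling: the mean squared displacement grows linearly with the number of steps. A minimal one-dimensional simulation (the step counts and seed are our own choices):

```python
import random

random.seed(0)

def walk_displacement(n_steps):
    # One realisation of a symmetric random walk on the integer lattice
    x = 0
    for _ in range(n_steps):
        x += random.choice((-1, 1))
    return x

# Estimate <x^2> over many independent walks; diffusive scaling
# predicts <x^2> = n_steps for unit steps
n_steps, n_walks = 100, 2000
msd = sum(walk_displacement(n_steps) ** 2 for _ in range(n_walks)) / n_walks
```

The estimate fluctuates around n_steps = 100, the discrete counterpart of the linear-in-time spreading described by the Fokker–Planck diffusion equation.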
This chapter builds upon the previous chapters, applying the method of combining probability theory with Hamiltonian mechanics. To do so, one needs to build a meaningful sample space over states – in this case, quantum states. A substantial part of the chapter discusses how to construct the quantum states out of which one can build a sample space carrying a probability measure. Vector states and density operators are introduced, and various worked examples are given. Once the quantum sample space is identified, equilibrium quantum statistical mechanics is formulated. The ‘particle in a box’ problem turns out to be analytically intractable unless we take a certain limit, called the semi-classical limit; heuristics as to what this limit means are proposed. Finally, the von Neumann (quantum) entropy is introduced and analogies with thermodynamics are drawn. An application to the heat capacity of solids is presented. As a complement, the chapter also introduces a classical ‘ring-polymer’ analogue of quantum statistical mechanics, establishing the formal equivalence between a one-particle quantum canonical system and an N-particle classical canonical system.
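The von Neumann entropy S = −Tr(ρ ln ρ) depends only on the eigenvalues of the density operator, which makes it simple to evaluate. A minimal sketch (the example spectra are our own):

```python
import math

def von_neumann_entropy(eigenvalues):
    # S = -Tr(rho ln rho) = -sum_i lam_i ln lam_i over the eigenvalues
    # of the density operator; zero eigenvalues contribute nothing
    return -sum(lam * math.log(lam) for lam in eigenvalues if lam > 0)

s_pure = von_neumann_entropy([1.0, 0.0])   # pure state: S = 0
s_mixed = von_neumann_entropy([0.5, 0.5])  # maximally mixed qubit: S = ln 2
```

A pure state (one unit eigenvalue) has zero entropy, while the maximally mixed state saturates the bound S = ln d for a d-dimensional system, mirroring the thermodynamic entropy of equal a priori probabilities.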
Here we return to Markov processes with discrete states, but this time in continuous time. We first consider, study, and solve specific examples such as Poisson processes, divergent birth processes, and birth-and-death processes. We derive the master equations for their probability distributions, and derive and discuss important solutions. In particular, we deepen Feller’s theory of divergent birth processes. Finally, we formally study the general stationary Markov process, writing down the forward and the backward Kolmogorov master equations.
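The Poisson process is the simplest of these examples: events separated by independent exponential waiting times, so that the number of events in a window of length t has mean λt. A minimal simulation (rate, window, and seed are our own choices):

```python
import random

random.seed(1)

def poisson_count(rate, t_max):
    # Count events of a Poisson process of given rate in [0, t_max]
    # by accumulating i.i.d. exponential waiting times
    t, n = 0.0, 0
    while True:
        t += random.expovariate(rate)
        if t > t_max:
            return n
        n += 1

# The sample mean of the counts should approach rate * t_max = 30
rate, t_max, n_runs = 3.0, 10.0, 2000
mean_count = sum(poisson_count(rate, t_max) for _ in range(n_runs)) / n_runs
```

This waiting-time construction is equivalent to solving the master equation dP_n/dt = λ(P_{n−1} − P_n), whose solution is the Poisson distribution with mean λt.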
The analysis of experimental scalar data is tackled here. Starting from the basic analysis of a large number of well-behaved data, which eventually display Gaussian distributions, we move on to Bayesian inference and face the case of few (or no) data, sometimes badly behaved. We first present methods to analyze data whose ideal distribution is known, and then show methods to make predictions even when our ignorance of the data distribution is total. Finally, various resampling methods are provided to deal with time-correlated measurements, biased estimators, anomalous data, and the under- or over-estimation of statistical errors.
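As a concrete instance of the resampling methods mentioned above, here is a minimal bootstrap estimate of the standard error of a sample mean (the data and seed are invented for illustration; this sketches only the simplest resampling scheme, not the chapter's full toolbox):

```python
import random

random.seed(2)

data = [4.1, 3.9, 4.3, 4.0, 4.2, 3.8, 4.1, 4.0]  # toy measurements

def bootstrap_error(sample, n_resamples=2000):
    # Standard error of the mean, estimated by drawing resamples
    # of the same size with replacement and taking the spread
    # of their means
    means = []
    for _ in range(n_resamples):
        resample = [random.choice(sample) for _ in sample]
        means.append(sum(resample) / len(resample))
    centre = sum(means) / len(means)
    var = sum((m - centre) ** 2 for m in means) / len(means)
    return var ** 0.5

err = bootstrap_error(data)
```

For these well-behaved data the bootstrap error agrees with the textbook formula s/√n; its value lies in handling estimators and distributions for which no closed formula exists.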
This short chapter aims at motivating the interest in statistical mechanics. It starts with a brief description of the historical context within which the theory developed, and ponders its status, or lack thereof, in the public eye. A first, original parallel between statistics and mechanics is drawn in the context of error propagation analysis, which can also be treated within statistical mechanics. As to the situations statistical mechanics can be applied to, two categories are distinguished: experimental/protocol error, or an observational state underdetermining the mechanical state of the system. The rest of the chapter puts the emphasis on the latter category, and explains how statistical mechanics plays the role of a ‘Rosetta Stone’ translating between different modes of description of the same system, thereby giving tools to infer relations between observational variables – for which we usually do not have any fundamental theory – from the physics of the underlying constituents, presumed to be governed by Hamiltonian classical or quantum mechanics.