In this comprehensive volume, the authors introduce some of the most important recent developments at the intersection of probability theory and mathematical physics, including the Gaussian free field, Gaussian multiplicative chaos and Liouville quantum gravity. This is the first book to present these topics using a unified approach and language, drawing on a large array of multi-disciplinary techniques. These range from the combinatorial (discrete Gaussian free field, random planar maps) to the geometric (culminating in the path integral formulation of Liouville conformal field theory on the Riemann sphere) via the complex analytic (based on the couplings between Schramm–Loewner evolution and the Gaussian free field). The arguments, currently scattered over a vast literature, have been streamlined, and the exposition has been very carefully thought out to present the theory as far as possible in a reader-friendly, pedagogical yet rigorous way, suitable for graduate students as well as researchers.
Play of Chance and Purpose emphasizes learning probability, statistics, and stochasticity by developing intuition and fostering imagination as a pedagogical approach. This book is meant for undergraduate and graduate students of basic sciences, applied sciences, engineering, and social sciences as an introduction to fundamental as well as advanced topics. The text has evolved out of the author's experience of teaching courses on probability, statistics, and stochastic processes at both the undergraduate and graduate levels in India and the United States. Readers will get the opportunity to work through several examples drawn from real-life applications and to pursue projects and case-study analyses as capstone exercises in each chapter. Many projects involve the development of visual simulations of complex stochastic processes. This deepens learners' comprehension of the subject and trains them to apply what they have learned to previously unseen problems in science and engineering.
The chapter is fully dedicated to the theory of large deviations. To carry out the proof of the large deviations theorem and the actual computation of various large deviations distributions, a detailed appendix is devoted to the saddle-point method for computing certain fundamental integrals that recur in the theory. Legendre transforms stem naturally from large deviations theory, and we discuss their properties “in-line” for non-experts.
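As a minimal illustration of that connection (a standard formula, not a quotation from the chapter): for i.i.d. variables with cumulant generating function \(\lambda(k)\), the rate function is the Legendre–Fenchel transform
\[
I(x) \;=\; \sup_{k}\bigl\{\,kx - \lambda(k)\,\bigr\},
\qquad
\lambda(k) \;=\; \ln \mathbb{E}\!\left[e^{kX}\right],
\]
so that \(\Pr\!\bigl(\tfrac{1}{N}\sum_{i=1}^{N}X_i \approx x\bigr) \asymp e^{-N I(x)}\) for large \(N\).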
This is a rich chapter in which we delve into the study of the (weak and strong) laws of large numbers and of the central limit theorem. The latter is first considered for sums of independent stochastic variables whose distributions have finite variance, and then for variables with diverging variance. Several appendices report both basic mathematical tools and lengthy details of computation. Among the former, the rules for changes of variable in probability are presented; Fourier and Laplace transforms are introduced, together with their role as generating functions of moments and cumulants; and the different kinds of convergence of stochastic variables are considered and exemplified.
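A minimal numerical sketch of the finite-variance case (illustration only, not the book's code):

import numpy as np

# Central limit theorem, finite-variance case: the standardised sum of N
# i.i.d. uniform variables approaches a standard Gaussian as N grows.
rng = np.random.default_rng(0)
N, samples = 100, 50_000
x = rng.uniform(-1.0, 1.0, size=(samples, N))     # each term has variance 1/3
z = x.sum(axis=1) / np.sqrt(N / 3.0)              # standardised sum
print("mean ~ 0        :", z.mean())
print("var  ~ 1        :", z.var())
print("P(|Z|<1) ~ 0.683:", np.mean(np.abs(z) < 1.0))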
Analysis of experimental data with several degrees of freedom is addressed, starting from the Gaussian case and building on the least-squares method, whose theory is detailed at the end of the chapter for both independent and correlated data. The multi-dimensional versions of the reweighting method for data with unknown distribution, and of the bootstrap and jackknife resampling methods, are presented. How possible correlations among multivariate data affect these methods is discussed and dealt with.
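A minimal sketch of a least-squares fit with correlated data (illustration only; the function name and numbers are ours): the χ² estimator weights the residuals with the inverse covariance matrix of the data.

import numpy as np

# Generalised least squares for the linear model y ~ A p with data covariance C,
# minimising chi^2 = (y - A p)^T C^{-1} (y - A p).
def gls_fit(A, y, C):
    Cinv = np.linalg.inv(C)
    cov_p = np.linalg.inv(A.T @ Cinv @ A)     # covariance of the fitted parameters
    p_hat = cov_p @ (A.T @ Cinv @ y)          # best-fit parameters
    chi2 = (y - A @ p_hat) @ Cinv @ (y - A @ p_hat)
    return p_hat, cov_p, chi2

# Toy straight-line fit, y = p0 + p1 x, with correlated errors.
x = np.array([0.0, 1.0, 2.0, 3.0])
A = np.column_stack([np.ones_like(x), x])
C = 0.10 * np.eye(4) + 0.05                   # constant off-diagonal correlation
y = np.array([0.1, 1.1, 1.9, 3.2])
print(gls_fit(A, y, C))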
The foundations of modern probability theory are briefly presented and discussed for both discrete and continuous stochastic variables. Without daring to give a rigorous mathematical construction – but citing several extremely well-written handbooks on the matter – we introduce the axiomatic theory of Kolmogorov and the concepts of joint, conditional, and marginal probability, along with the operations of union and intersection of generic random events. Eventually, Bayes’ formula is put forward with some examples. This will be the cornerstone of the statistical inference methods reported in Chapters 5 and 6.
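For concreteness, a standard worked example of the kind alluded to (the numbers are illustrative, not taken from the chapter): Bayes’ formula
\[
P(A\mid B)=\frac{P(B\mid A)\,P(A)}{P(B\mid A)\,P(A)+P(B\mid \bar A)\,P(\bar A)}
\]
applied to a diagnostic test with prevalence \(P(A)=0.01\), sensitivity \(P(B\mid A)=0.99\) and false-positive rate \(P(B\mid \bar A)=0.05\) gives \(P(A\mid B)=0.0099/(0.0099+0.0495)\approx 0.17\): a single positive result is far from conclusive.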
Our last chapter is devoted to entropy. With this excuse we first present Shannon’s information theory, including the derivation of his entropy and the statements and proofs of the source coding theorem and of the noisy-channel coding theorem. Then we consider dynamical systems and the production of entropy in chaotic systems, quantified by the Kolmogorov–Sinai entropy. For non-experts, or readers who require a memory jog, we give a short recap of statistical mechanics. That is just enough to tie up some knots left untied in Chapter 4, where we developed large deviations theory for independent variables. Here we generalize to correlated variables and make one application to statistical mechanics. In particular, we find out that entropy is a large deviations function, apart from constants. We end with a lightning-fast introduction to configurational entropy in disordered complex systems. Just to give a tiny glimpse of … what we do for a living!
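A minimal sketch of the quantity at the heart of Shannon’s theory (illustration only, not the book's code):

import numpy as np

# Shannon entropy of a discrete distribution, H(p) = -sum_i p_i log2 p_i, in bits.
def shannon_entropy(p):
    p = np.asarray(p, dtype=float)
    p = p[p > 0]                          # the convention 0 * log 0 = 0
    return -np.sum(p * np.log2(p))

print(shannon_entropy([0.5, 0.5]))        # 1 bit: a fair coin
print(shannon_entropy([0.9, 0.1]))        # ~0.47 bits: a biased coin
print(shannon_entropy([0.25] * 4))        # 2 bits: a fair four-sided die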
Here we face the analysis of another kind of memoryless discrete process: branching processes, otherwise termed “chain reactions” with a more physical inspiration. Before that, we carefully deepen and generalize our knowledge of the very useful tool of generating functions. This will soon be applied to the study of the dynamics of a population, predicting whether it will certainly go extinct – and how fast – or whether it will be self-sustaining.
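A minimal sketch of how the generating function settles the extinction question (illustration only; the offspring law and function name are ours): the extinction probability q is the smallest non-negative fixed point of the offspring generating function, q = G(q).

import numpy as np

# Galton-Watson branching process with Poisson(m) offspring: G(s) = exp(m (s - 1)).
# Iterating q <- G(q) from q = 0 converges to the smallest fixed point,
# i.e. to the extinction probability.
def extinction_probability(m, iterations=500):
    q = 0.0
    for _ in range(iterations):
        q = np.exp(m * (q - 1.0))
    return q

print(extinction_probability(0.8))   # subcritical: q -> 1, extinction is certain
print(extinction_probability(1.5))   # supercritical: q ~ 0.417, survival possible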
In this chapter we study the first example of a correlated memoryless phenomenon: the famous “drunkard’s walk”, formally termed the random walk. We begin with a very simple case, in a homogeneous and isotropic space on a discrete hypercubic lattice. Then we add traps here and there. Eventually we make a foray into the continuous regime, with the Fokker–Planck diffusion equation (which, as we shall see, is what physicists call a Schrödinger equation in imaginary time) and the Langevin stochastic differential equation.
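A minimal simulation of the simplest case (illustration only, not the book's code): a symmetric walk on the one-dimensional integer lattice, whose mean displacement stays at zero while the mean squared displacement grows linearly with the number of steps.

import numpy as np

# Symmetric random walk on Z: at each step jump +1 or -1 with equal probability.
rng = np.random.default_rng(1)
steps, walkers = 1000, 5000
jumps = rng.choice([-1, 1], size=(walkers, steps))
positions = jumps.cumsum(axis=1)
print("mean displacement  :", positions[:, -1].mean())           # ~ 0
print("mean squared displ.:", (positions[:, -1] ** 2).mean())    # ~ steps (diffusive)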
Here we return to Markov processes with discrete states, but this time in continuous time. We first consider, study, and solve specific examples such as Poisson processes, divergent birth processes, and birth-and-death processes. We write down the master equations for their probability distributions, and derive and discuss important solutions. In particular, we deepen Feller’s theory of divergent birth processes. In the end we formally study the general case of stationary Markov processes, writing down the forward and the backward Kolmogorov master equations.
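A minimal sketch of how such a process can be simulated directly from its rates (illustration only; the Gillespie-style sampling and the parameter values are ours):

import numpy as np

# Linear birth-and-death process: with n individuals the total birth rate is
# lam * n and the total death rate is mu * n; waiting times between events are
# exponential, and each event is a birth or a death chosen according to its rate.
def birth_death(n0, lam, mu, t_max, seed=0):
    rng = np.random.default_rng(seed)
    t, n = 0.0, n0
    while t < t_max and n > 0:
        total_rate = (lam + mu) * n
        t += rng.exponential(1.0 / total_rate)
        n += 1 if rng.random() < lam / (lam + mu) else -1
    return n

print(birth_death(n0=10, lam=1.0, mu=0.9, t_max=5.0))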
Analysis of experimental scalar data is tackled here. Starting from the basic analysis of a large number of well-behaved data, which eventually display Gaussian distributions, we move on to Bayesian inference and face the cases of few (or no) data, sometimes badly behaved. We first present methods to analyze data whose ideal distribution is known, and then we show methods to make predictions even when our ignorance about the data distribution is total. Eventually, various resampling methods are provided to deal with time-correlated measurements, biased estimators, anomalous data, and under- or over-estimation of statistical errors.
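A minimal sketch of one such resampling idea (illustration only; function name and numbers are ours): the non-parametric bootstrap estimates the error on the mean by resampling the data with replacement.

import numpy as np

# Bootstrap estimate of the statistical error on the mean of a scalar data set.
def bootstrap_error(data, n_resamples=10_000, seed=0):
    rng = np.random.default_rng(seed)
    data = np.asarray(data, dtype=float)
    means = np.array([rng.choice(data, size=data.size, replace=True).mean()
                      for _ in range(n_resamples)])
    return means.std(ddof=1)

data = np.random.default_rng(2).normal(5.0, 2.0, size=50)
print("naive error    :", data.std(ddof=1) / np.sqrt(data.size))
print("bootstrap error:", bootstrap_error(data))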
As we realize that random walks, chain reactions, and recurrent events are all Markov chains, i.e., correlated processes without memory, in this chapter we derive a general theory, including the classification and properties of individual states and of whole chains. In particular, we focus on the building blocks of the theory, i.e., irreducible chains, presenting and proving a number of fundamental and useful theorems. We end up deriving the balance equation for the limit probability and the approach to the limit at long times, developing and applying the Perron–Frobenius theory for non-negative matrices and the spectral decomposition of non-Hermitian matrices. Among the applications of the theory, we highlight the ranking of Web pages by search engines.
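A minimal sketch of the balance equation in action (illustration only; the transition matrix is ours): for an irreducible, aperiodic chain the stationary distribution can be reached by simply iterating pi <- pi P, the power iteration behind PageRank-style ranking.

import numpy as np

# Stationary distribution of a small Markov chain via power iteration.
P = np.array([[0.1, 0.6, 0.3],        # row-stochastic transition matrix
              [0.4, 0.2, 0.4],
              [0.5, 0.3, 0.2]])
pi = np.ones(3) / 3.0
for _ in range(1000):
    pi = pi @ P
print("stationary distribution:", pi)
print("balance check pi = pi P:", pi @ P)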
Yet another memoryless correlated discrete process is considered: recurrent events. These are classified in different ways, and a whole theory is developed to describe the possible behaviors. Special attention is devoted to the proof of the limit probability theorem, whose lengthy details are reported in an appendix so as not to scare readers. The theory of recurrent events is particularly useful because many of its properties and mathematical theorems can be straightforwardly translated into the more general theory of Markov chains.
This chapter is devoted to correlations. We take up the central limit theorem once again, first with a couple of specific examples solved with considerable – but instructive – effort: Markov chains and recurrent events. Then, we generalize the machinery of generating functions to multivariate, correlated systems of stochastic variables, until we are able to prove the central limit theorem and the large deviations theorem for correlated events. We go back to the Markov chain central limit example to show how the theorem massively simplifies things. Eventually, we show how correlations and the lack of a Gaussian central limit are linked to phase transitions in statistical physics.