In a first course on probability one typically works with a sequence of random variables X1, X2, … For stochastic processes, instead of indexing the random variables by the positive integers, we index them by t ∈ [0, ∞) and we think of Xt as being the value at time t. The random variable could be the location of a particle on the real line, the strength of a signal, the price of a stock, and many other possibilities as well.
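As a concrete illustration (not from the text), one can simulate a single sample path t ↦ Xt of a process such as Brownian motion by discretizing [0, 1] and summing independent Gaussian increments; the grid size and time horizon below are arbitrary choices for the sketch.

```python
import numpy as np

rng = np.random.default_rng(0)

def sample_path(n_steps=1000, t_max=1.0):
    """Simulate one path t -> X_t of Brownian motion on [0, t_max]
    by cumulative sums of independent N(0, dt) increments."""
    dt = t_max / n_steps
    increments = rng.normal(0.0, np.sqrt(dt), size=n_steps)
    # X_0 = 0, then X_{t_{k+1}} = X_{t_k} + (Gaussian increment)
    path = np.concatenate([[0.0], np.cumsum(increments)])
    times = np.linspace(0.0, t_max, n_steps + 1)
    return times, path

times, path = sample_path()
```

Fixing ω (here, the seed of the random generator) fixes the whole path, while fixing t and varying ω gives the random variable Xt.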
We will also work with increasing families of σ-fields {ℱt}, known as filtrations. The σ-field ℱt is supposed to represent what we know up to time t.
Processes and σ-fields
Let (Ω, ℱ, ℙ) be a probability space. A real-valued stochastic process (or simply a process) is a map X from [0, ∞) × Ω to the reals. We write Xt = Xt(ω) = X(t, ω). We will impose stronger measurability conditions shortly, but for now we require that the random variable Xt be measurable with respect to ℱ for each t ≥ 0.
A collection of σ-fields ℱt such that ℱt ⊂ ℱ for each t and ℱs ⊂ ℱt if s ≤ t is called a filtration. Define ℱt+ = ∩ε>0 ℱt+ε. A filtration is right continuous if ℱt+ = ℱt for all t ≥ 0. The σ-field ℱt+ is supposed to represent what one knows if one looks ahead an infinitesimal amount. Most of the filtrations we will come across will be right continuous, but see Exercise 1.1.
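The idea that ℱt "represents what we know up to time t" can be made concrete in a finite toy setting (a discrete illustration of the author's continuous-time definition, not part of the text): take Ω to be all sequences of three coin flips, and represent the σ-field generated by the first t flips as the partition of Ω whose atoms agree on those flips. The partitions refine as t grows, mirroring ℱs ⊂ ℱt for s ≤ t.

```python
from itertools import product

# Ω = all sequences of 3 coin flips.
omega = list(product("HT", repeat=3))

def atoms(t):
    """Partition of Ω generating ℱ_t: atoms group the outcomes
    that agree on the first t flips (t = 0, 1, 2, 3)."""
    blocks = {}
    for w in omega:
        blocks.setdefault(w[:t], []).append(w)
    return list(blocks.values())

# ℱ_0 is trivial (one atom); each additional observed flip
# splits every atom in two, so ℱ_0 ⊂ ℱ_1 ⊂ ℱ_2 ⊂ ℱ_3.
partition_sizes = [len(atoms(t)) for t in range(4)]
```

Checking that each atom of atoms(t+1) sits inside some atom of atoms(t) verifies the nesting ℱt ⊂ ℱt+1 in this finite model; in continuous time the same monotonicity is what the definition of a filtration demands.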