Stochastic processes with jumps and random measures are important as drivers in applications like financial mathematics and signal processing. This 2002 text develops stochastic integration theory for both integrators (semimartingales) and random measures from a common point of view. Using some novel predictable controlling devices, the author furnishes the theory of stochastic differential equations driven by them, as well as their stability and numerical approximation theories. Highlights include the Dominated Convergence Theorem and Egoroff's Theorem, as well as comprehensive analogues of results from ordinary integration theory, for instance previsible envelopes and an algorithm computing stochastic integrals of càglàd integrands pathwise. Full proofs are given for all results, and motivation is stressed throughout. A large appendix contains most of the analysis that readers will need as a prerequisite. This will be an invaluable reference for graduate students and researchers in mathematics, physics, electrical engineering and finance who need to use stochastic differential equations.
This self-contained book is a graduate-level introduction for mathematicians and for physicists interested in the mathematical foundations of the field, and can be used as a textbook for a two-semester course on mathematical statistical mechanics. It assumes only basic knowledge of classical physics and, on the mathematics side, a good working knowledge of graduate-level probability theory. The book starts with a concise introduction to statistical mechanics, proceeds to disordered lattice spin systems, and concludes with a presentation of the latest developments in the mathematical understanding of mean-field spin glass models. In particular, progress towards a rigorous understanding of the replica symmetry-breaking solutions of the Sherrington-Kirkpatrick spin glass models, due to Guerra, Aizenman-Sims-Starr and Talagrand, is reviewed in some detail.
This book contains a rigorous mathematical treatment of the geometrical aspects of sets of both integral and fractional Hausdorff dimension. Questions of local density and the existence of tangents of such sets are studied, as well as the dimensional properties of their projections in various directions. In the case of sets of integral dimension the dramatic differences between regular 'curve-like' sets and irregular 'dust-like' sets are exhibited. The theory is related by duality to Kakeya sets (sets of zero area containing lines in every direction). The final chapter includes diverse examples of sets to which the general theory is applicable: discussions of curves of fractional dimension, self-similar sets, strange attractors, and examples from number theory, convexity and so on. There is an emphasis on the basic tools of the subject such as the Vitali covering lemma, net measures and Fourier transform methods.
The notion of 'stopping times' is a useful one in probability theory; it can be applied to both classical problems and fresh ones. This book presents this technique in the context of directed sets, stochastic processes indexed by directed sets, and many applications in probability, analysis and ergodic theory. Martingales and related processes are considered from several points of view. The book opens with a discussion of pointwise and stochastic convergence of processes, with concise proofs arising from the method of stochastic convergence. Later, the rewording of Vitali covering conditions in terms of stopping times clarifies connections with the theory of stochastic processes. Solutions are presented here for nearly all the open problems in the Krickeberg convergence theory for martingales and submartingales indexed by directed sets. Another theme of the book is the unification of martingale and ergodic theorems.
This book was first published in 2003. Derived from extensive teaching experience in Paris, this book presents around 100 exercises in probability. The exercises cover measure theory and probability, independence and conditioning, Gaussian variables, distributional computations, convergence of random variables, and random processes. For each exercise the authors have provided detailed solutions as well as references for preliminary and further reading. There are also many insightful notes to motivate the student and set the exercises in context. Students will find these exercises extremely useful for easing the transition between simple and complex probabilistic frameworks. Indeed, many of the exercises here will lead the student on to frontier research topics in probability. Along the way, attention is drawn to a number of traps into which students of probability often fall. This book is ideal for independent study or as the companion to a course in advanced probability theory.
This text is designed both for students of probability and stochastic processes, and for students of functional analysis. For the reader not familiar with functional analysis a detailed introduction to necessary notions and facts is provided. However, this is not a straight textbook in functional analysis; rather, it presents some chosen parts of functional analysis that can help understand ideas from probability and stochastic processes. The subjects range from basic Hilbert and Banach spaces, through weak topologies and Banach algebras, to the theory of semigroups of bounded linear operators. Numerous standard and non-standard examples and exercises make the book suitable as a course textbook or for self-study.
This book is based on a course given at Massachusetts Institute of Technology. It is intended to be a reasonably self-contained introduction to stochastic analytic techniques that can be used in the study of certain problems. The central theme is the theory of diffusions. In order to emphasize the intuitive aspects of probabilistic techniques, diffusion theory is presented as a natural generalization of the flow generated by a vector field. Essential to the development of this idea is the introduction of martingales and the formulation of diffusion theory in terms of martingales. The book will make valuable reading for advanced students in probability theory and analysis and will be welcomed as a concise account of the subject by research workers in these fields.
In recent years the application of random matrix techniques to analytic number theory has been responsible for major advances in this area of mathematics. As a consequence it has created a new and rapidly developing area of research. The aim of this book is to provide the necessary grounding both in relevant aspects of number theory and techniques of random matrix theory, as well as to inform the reader of what progress has been made when these two apparently disparate subjects meet. This volume of proceedings is addressed to graduate students and other researchers in both pure mathematics and theoretical physics. The contributing authors, who are among the world's leading experts in this area, have taken care to write self-contained lectures on subjects chosen to produce a coherent volume.
In this fully revised second edition of Understanding Probability, the reader can learn about the world of probability in an informal way. The author demystifies the law of large numbers, betting systems, random walks, the bootstrap, rare events, the central limit theorem, the Bayesian approach and more. This second edition has wider coverage, more explanations, more examples and exercises, and a new chapter introducing Markov chains, making it a great choice for a first probability course. But its easy-going style makes it just as valuable if you want to learn about the subject on your own, and high school algebra is really all the mathematical background you need.
Fragmentation and coagulation are two natural phenomena that can be observed in many sciences and at a great variety of scales - from, for example, DNA fragmentation to formation of planets by accretion. This book, by the author of the acclaimed Lévy Processes, is the first comprehensive theoretical account of mathematical models for situations where either phenomenon occurs randomly and repeatedly as time passes. This self-contained treatment develops the models in a way that makes recent developments in the field accessible. Each chapter ends with a comments section in which important aspects not discussed in the main part of the text (often because the discussion would have been too technical and/or lengthy) are addressed and precise references are given. Written for readers with a solid background in probability, its careful exposition allows graduate students, as well as working mathematicians, to approach the material with confidence.
In this book, Professor Pinsky gives a self-contained account of the theory of positive harmonic functions for second order elliptic operators, using an integrated probabilistic and analytic approach. The book begins with a treatment of the construction and basic properties of diffusion processes. This theory then serves as a vehicle for studying positive harmonic functions. Starting with a rigorous treatment of the spectral theory of elliptic operators with nice coefficients on smooth, bounded domains, the author then develops the theory of the generalized principal eigenvalue, and the related criticality theory for elliptic operators on arbitrary domains. Martin boundary theory is considered, and the Martin boundary is explicitly calculated for several classes of operators. The book provides an array of criteria for determining whether a diffusion process is transient or recurrent. Also introduced are the theory of bounded harmonic functions, and Brownian motion on manifolds of negative curvature. Many results that form the folklore of the subject are here given a rigorous exposition, making this book a useful reference for the specialist, and an excellent guide for the graduate student.
To quote D. Voiculescu: “Around 1982, I realized that the right way to look at certain operator algebra problems was by imitating some basic probability theory. More precisely, in noncommutative probability theory a new kind of independence can be defined by replacing tensor products with free products and this can help understand the von Neumann algebras of free groups. The subject has evolved into a kind of parallel to basic probability theory, which should be called free probability theory.”
Thus, Voiculescu's first motivation to introduce free probability was the analysis of the von Neumann algebras of free groups. One of his central observations was that such groups can be equipped with tracial states (also called traces), which resemble expectations in classical probability, whereas the property of freeness, once properly stated, can be seen as a notion similar to independence in classical probability. This led him to the statement
free probability theory = noncommutative probability theory + free independence.
These two components are the basis for a probability theory for noncommutative variables in which many concepts taken from classical probability theory, such as the notions of law, convergence in law, independence, the central limit theorem, Brownian motion and entropy, can be naturally defined. For instance, the law of one self-adjoint variable is simply given by the traces of its powers (which generalizes the definition through moments of compactly supported probability measures on the real line), and the joint law of several self-adjoint noncommutative variables is defined by the collection of traces of words in these variables.
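To make the last two definitions concrete, here is a minimal sketch in standard notation (a noncommutative probability space (\mathcal{A}, \tau) with tracial state \tau; this notation is not fixed in the text above). The law of a single self-adjoint variable a \in \mathcal{A} is its moment sequence
\[
m_k(a) := \tau(a^k), \qquad k \ge 0,
\]
which reduces to the classical moments m_k = \int_{\mathbb{R}} x^k \, d\mu(x) when \mu is a compactly supported probability measure on the real line. The joint law of self-adjoint variables a_1, \dots, a_p is the collection of traces of all noncommutative words in these variables,
\[
\tau\bigl(a_{i_1} a_{i_2} \cdots a_{i_n}\bigr), \qquad n \ge 1, \quad i_1, \dots, i_n \in \{1, \dots, p\}.
\]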
The study of random matrices, and in particular the properties of their eigenvalues, has emerged from the applications, first in data analysis and later as statistical models for heavy-nuclei atoms. Thus, the field of random matrices owes its existence to applications. Over the years, however, it became clear that models related to random matrices play an important role in areas of pure mathematics. Moreover, the tools used in the study of random matrices came themselves from different and seemingly unrelated branches of mathematics.
At this point in time, the topic has evolved enough that the newcomer, especially if coming from the field of probability theory, faces a formidable and somewhat confusing task in trying to access the research literature. Furthermore, the background expected of such a newcomer is diverse, and often has to be supplemented before a serious study of random matrices can begin.
We believe that many parts of the field of random matrices are now developed enough to enable one to expose the basic ideas in a systematic and coherent way. Indeed, such a treatise, geared toward theoretical physicists, has existed for some time, in the form of Mehta's superb book [Meh91]. Our goal in writing this book has been to present a rigorous introduction to the basic theory of random matrices, including free probability, that is sufficiently self-contained to be accessible to graduate students in mathematics or related sciences who have mastered probability theory at the graduate level, but have not necessarily been exposed to advanced notions of functional analysis, algebra or geometry.