Based on the long-running Probability Theory course at the Sapienza University of Rome, this book offers a fresh and in-depth approach to probability and statistics, while remaining intuitive and accessible in style. The fundamentals of probability theory are elegantly presented, supported by numerous examples and illustrations, and modern applications are later introduced, giving readers an appreciation of current research topics. The text covers distribution functions, statistical inference and data analysis, and more advanced methods, including Markov chains and Poisson processes, widely used in dynamical systems and data science research. The concluding section, 'Entropy, Probability and Statistical Mechanics', unites key concepts from the text with the authors' impressive research experience, to provide a clear illustration of these powerful statistical tools in action. Ideal for students and researchers in the quantitative sciences, this book provides an authoritative account of probability theory, written by leading researchers in the field.
The main ideas are introduced in a historical context. Beginning with phase retrieval and ending with neural networks, this chapter gives the reader a sense of the book’s broad scope.
Beginning with linear programming and ending with neural network training, this chapter features seven applications of the divide-and-concur approach to solving problems with RRR.
The reflect-reflect-relax (RRR) algorithm is derived from basic principles. Local convergence is established and the flow limit is introduced to better understand the global behavior.
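The RRR update described above can be sketched in a few lines. The toy instance below (intersecting a circle with a line, with illustrative projections and a relaxation parameter chosen by us, not taken from the book) is only meant to show the shape of the iteration x ← x + β(P_A(2P_B(x) − x) − P_B(x)); at a fixed point x*, the point P_B(x*) lies in the intersection.

```python
import numpy as np

def rrr(proj_a, proj_b, x, beta=0.5, tol=1e-10, max_iter=10000):
    """Sketch of the RRR iteration:
        x <- x + beta * (P_A(2 P_B(x) - x) - P_B(x)).
    A fixed point x* yields the solution P_B(x*) in the intersection."""
    for _ in range(max_iter):
        pb = proj_b(x)
        step = proj_a(2.0 * pb - x) - pb   # reflect through B, project to A
        x = x + beta * step
        if np.linalg.norm(step) < tol:     # step vanishes at a fixed point
            break
    return proj_b(x)

# Toy instance (our choice): A = unit circle, B = the line x0 = 0.6.
proj_a = lambda x: x / np.linalg.norm(x)   # radial projection onto the circle
proj_b = lambda x: np.array([0.6, x[1]])   # clamp first coordinate to the line
sol = rrr(proj_a, proj_b, np.array([2.0, 1.0]))
print(sol)   # a point on both the circle and the line
```

Note that the circle is a nonconvex set; part of RRR's appeal is that the same iteration is applied unchanged to nonconvex constraints, where only local convergence can be guaranteed.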
The size of the intersection of the two constraint sets tells us whether we should expect many solutions, or whether we should be surprised to find even one. The latter case implies a conspiracy and is the most interesting.
Trapping of the RRR algorithm on nonsolutions can be avoided by modifying the constraint sets and also the metric. This chapter also covers general good practice on the use of RRR.
Projections onto the two constraint sets are the elementary operations the RRR algorithm uses to find solutions in their intersection. This chapter covers all the projections that arise in this book.
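To make the notion of a projection concrete, here are hedged sketches of three standard distance-minimizing projections (the specific sets below are illustrative choices of ours, not the book's catalog): a sphere (nonconvex), a box, and a hyperplane.

```python
import numpy as np

def project_sphere(x, r=1.0):
    """Nearest point on the sphere of radius r (a nonconvex set)."""
    n = np.linalg.norm(x)
    return x * (r / n) if n > 0 else np.array([r] + [0.0] * (len(x) - 1))

def project_box(x, lo, hi):
    """Nearest point in the box [lo, hi]^n (componentwise clamp)."""
    return np.clip(x, lo, hi)

def project_hyperplane(x, a, b):
    """Nearest point on the hyperplane {y : a . y = b}."""
    return x - a * ((a @ x - b) / (a @ a))

x = np.array([3.0, 4.0])
print(project_sphere(x))                                  # [0.6 0.8]
print(project_box(x, 0.0, 1.0))                           # [1. 1.]
print(project_hyperplane(x, np.array([1.0, 1.0]), 1.0))   # [0. 1.]
```

Each of these maps a point to its nearest point in the set; for nonconvex sets such as the sphere, the nearest point need not be unique, and any tie-breaking rule will do.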
Whereas RRR has been successfully applied to a broad range of problems, the example in this chapter shows that our understanding of the algorithm is far from complete.
In this chapter, we establish the celebrated Jordan decomposition theorem, which allows us to reduce a linear mapping over the complex numbers to a canonical form in terms of its eigenspectrum. As preparation, we first recall some facts regarding the factorization of polynomials. Then we show how to reduce a linear mapping over a set of its invariant subspaces determined by a prime factorization of the characteristic polynomial of the mapping. Next we reduce a linear mapping over its generalized eigenspaces. Finally, we prove the Jordan decomposition theorem by understanding how a mapping behaves over each of its generalized eigenspaces.
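As a small computational illustration of the theorem (our own example, assuming SymPy is available; the chapter's treatment is of course more general), one can build a matrix with a known Jordan structure and recover its canonical form:

```python
import sympy as sp

# A matrix with one 2x2 Jordan block for eigenvalue 2 and one 1x1 block
# for eigenvalue 3 (an illustrative example, not one from the book).
J = sp.Matrix([[2, 1, 0],
               [0, 2, 0],
               [0, 0, 3]])
P = sp.Matrix([[1, 1, 0],
               [0, 1, 1],
               [1, 0, 1]])
M = P * J * P.inv()          # M is similar to J, so its Jordan form is J

P2, J2 = M.jordan_form()     # recover a decomposition M = P2 * J2 * P2^{-1}
print(J2)
```

The 2x2 block records that the generalized eigenspace for eigenvalue 2 is two-dimensional while its ordinary eigenspace is only one-dimensional, which is exactly the fine structure the decomposition exposes.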