This book offers a mathematical foundation for modern cryptography. It is primarily intended as an introduction for graduate students. Readers should have basic knowledge of probability theory, but familiarity with computational complexity is not required. Starting from Shannon's classic result on secret key cryptography, fundamental topics of cryptography, such as secret key agreement, authentication, secret sharing, and secure computation, are covered. Particular attention is paid to how correlated randomness can be used to construct cryptographic primitives. To evaluate the efficiency of such constructions, information-theoretic tools, such as smooth min/max entropies and the information spectrum, are developed. Thanks to this broad coverage, the book will be useful to experts as well as students in cryptography as a reference for information-theoretic concepts and tools.
This extraordinary three-volume work, written in an engaging and rigorous style by a world authority in the field, provides an accessible, comprehensive introduction to the full spectrum of mathematical and statistical techniques underpinning contemporary methods in data-driven learning and inference. This second volume, Inference, builds on the foundational topics established in Volume I to introduce students to techniques for inferring unknown variables and quantities, including Bayesian inference, Markov chain Monte Carlo methods, maximum-likelihood estimation, hidden Markov models, Bayesian networks, and reinforcement learning. A consistent structure and pedagogy are employed throughout this volume to reinforce student understanding, with over 350 end-of-chapter problems (including solutions for instructors), 180 solved examples, almost 200 figures, datasets and downloadable MATLAB code. Supported by the sister volumes Foundations and Learning, and unique in its scale and depth, this textbook sequence is ideal for early-career researchers and graduate students across many courses in signal processing, machine learning, statistical analysis, data science and inference.
Optimization problems often have symmetries. For example, the value of the cost function may not change if its input vectors are scaled, translated or rotated. It then makes sense to quotient out the symmetries. If the quotient space is a manifold, it is called a quotient manifold. This often happens when the symmetries result from invariance under a group action: the chapter first reviews conditions for this to happen. Continuing with general quotient manifolds, the chapter reviews geometric concepts (points, tangent vectors, vector fields, retractions, Riemannian metrics, gradients, connections, Hessians and acceleration) to show how to work numerically with these abstract objects through lifts. The chapter aims to show the reader what it means to optimize on a quotient manifold, and how to do so on a computer. To this end, two important sections detail how running Riemannian gradient descent and Newton's method on the quotient manifold relates to running them on the non-quotiented manifold (called the total space). The running example is the Grassmann manifold as a quotient of the Stiefel manifold; its tools are summarized in a closing section.
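To make the computational viewpoint concrete, here is a minimal sketch (an illustration added here, not code from the chapter) of Riemannian gradient descent for the dominant-subspace problem on the Grassmann manifold, working with representatives in the Stiefel total space: the gradient of the quotient cost is handled through its horizontal lift, and a QR-based retraction maps updates back to the total space. The matrix A, the step size and the cost f(X) = -trace(X^T A X) are assumptions chosen for illustration.

```python
import numpy as np

# Sketch: Riemannian gradient descent for the dominant p-dimensional subspace
# of a symmetric matrix A, viewing the Grassmann manifold Gr(n, p) as a
# quotient of the Stiefel manifold St(n, p). Points of the quotient are
# represented by orthonormal matrices X in the total space, and the gradient
# is manipulated through its horizontal lift at X.

rng = np.random.default_rng(0)
n, p = 50, 3
A = rng.standard_normal((n, n))
A = (A + A.T) / 2  # symmetric matrix for the (assumed) example cost

def cost(X):
    # f([X]) = -trace(X^T A X) is invariant under X -> X Q for orthogonal Q,
    # hence well defined on the quotient Gr(n, p).
    return -np.trace(X.T @ A @ X)

def horizontal_grad(X):
    # Euclidean gradient is -2 A X; the horizontal lift of the Riemannian
    # gradient is its projection onto the horizontal space {V : X^T V = 0}.
    G = -2 * A @ X
    return G - X @ (X.T @ G)

def retract(X, V):
    # QR-based retraction back to the Stiefel total space.
    Q, R = np.linalg.qr(X + V)
    return Q * np.sign(np.diag(R))  # fix column signs so the Q-factor is unique

X = np.linalg.qr(rng.standard_normal((n, p)))[0]  # random initial point
step = 0.4 / np.linalg.norm(A, 2)                 # conservative fixed step
for _ in range(300):
    X = retract(X, -step * horizontal_grad(X))

evals = np.linalg.eigvalsh(A)
print(cost(X), -evals[-p:].sum())  # these should approximately agree
```

Note that every operation acts on the lift X, yet the iteration is meaningful on the quotient because both the cost and the update are invariant under the group action.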
This chapter provides the classical definitions for manifolds (via charts, which have not appeared thus far), smooth maps to and from manifolds, tangent vectors and tangent spaces, and differentials of smooth maps. Special care is taken to introduce the atlas topology on manifolds and to justify the topological restrictions in the definition of a manifold. It is shown explicitly that embedded submanifolds of linear spaces, as detailed in earlier chapters, are manifolds in the general sense. Subsequent sections explain how the geometric concepts introduced for embedded submanifolds extend to general manifolds, mostly without effort. This includes tangent bundles, vector fields, retractions, local frames, Riemannian metrics, gradients, connections, Hessians, velocity and acceleration, geodesics and Taylor expansions. One section explains why the Lie bracket of two vector fields can be interpreted as a vector field (which we omitted in Chapter 5). The chapter closes with a section about submanifolds embedded in general manifolds (rather than only in linear spaces): this is useful in preparation for the next chapter.
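As a concrete instance of these definitions (an illustration added here, not drawn from the chapter), the unit sphere can be described with an atlas of two stereographic charts, and its tangent spaces admit the familiar explicit description:

```latex
% Illustration: the unit sphere S^{n-1} = { x in R^n : ||x|| = 1 } as a
% manifold, covered by two stereographic charts (from the north and south poles).
\[
  \varphi_N(x) = \frac{(x_1,\dots,x_{n-1})}{1 - x_n}
  \ \ \text{on } S^{n-1}\setminus\{e_n\},
  \qquad
  \varphi_S(x) = \frac{(x_1,\dots,x_{n-1})}{1 + x_n}
  \ \ \text{on } S^{n-1}\setminus\{-e_n\}.
\]
% The transition map is smooth on its domain, so the two charts form an atlas:
\[
  \varphi_S \circ \varphi_N^{-1}(y) = \frac{y}{\|y\|^2},
  \qquad y \in \mathbb{R}^{n-1}\setminus\{0\}.
\]
% Tangent vectors at x are velocities of curves through x; here this gives
\[
  \mathrm{T}_x S^{n-1} = \{ v \in \mathbb{R}^n : x^\top v = 0 \}.
\]
```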
To design more sophisticated optimization algorithms, we need more refined geometric tools. In particular, to define the Hessian of a cost function, we need a means to differentiate the gradient vector field. This chapter highlights why this requires care, then proceeds to define connections: the proper concept from differential geometry for this task. The proposed definition is stated somewhat differently from the usual one: an optional section details why the two are equivalent. Riemannian manifolds have a privileged connection called the Riemannian connection, which is used to define Riemannian Hessians. The same concept is used to differentiate vector fields along curves. Applied to the velocity vector field of a curve, this yields the notion of intrinsic acceleration; geodesics are the curves with zero intrinsic acceleration. The tools built in this chapter naturally lead to second-order Taylor expansions of cost functions along curves. These then motivate the definition of second-order retractions. Two optional closing sections further consider the important special case of Hessians on Riemannian submanifolds, and an intuitive way to build second-order retractions by projection.
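The following is a small, self-contained sketch (an example added here, not the chapter's code) of how these objects interact on the unit sphere viewed as a Riemannian submanifold of R^n: the Riemannian gradient is the tangent projection of the Euclidean gradient, the Riemannian Hessian is obtained by differentiating that projected gradient field and projecting again, and the resulting second-order Taylor expansion along a geodesic can be checked numerically. The cost f(x) = (1/2) x^T A x and the matrix A are assumptions made for the demonstration.

```python
import numpy as np

# Sketch: Riemannian gradient, Hessian and a second-order Taylor check on the
# unit sphere S^{n-1}, treated as a Riemannian submanifold of R^n.

rng = np.random.default_rng(1)
n = 8
A = rng.standard_normal((n, n))
A = (A + A.T) / 2  # symmetric matrix for the (assumed) example cost

def proj(x, u):
    # orthogonal projection onto T_x S^{n-1} = { v : x^T v = 0 }
    return u - x * (x @ u)

def f(x):
    return 0.5 * x @ A @ x

def grad(x):
    # Riemannian gradient = tangent projection of the Euclidean gradient A x
    return proj(x, A @ x)

def hess(x, v):
    # Differentiating the projected gradient field and projecting again gives,
    # on the sphere:  Hess f(x)[v] = proj_x(A v) - (x^T A x) v
    return proj(x, A @ v) - (x @ (A @ x)) * v

# Check the second-order Taylor expansion along the geodesic
# gamma(t) = x cos(t||v||) + (v/||v||) sin(t||v||), which has zero acceleration.
x = rng.standard_normal(n); x /= np.linalg.norm(x)
v = proj(x, rng.standard_normal(n))
nv = np.linalg.norm(v)
t = 1e-3
gamma = x * np.cos(t * nv) + (v / nv) * np.sin(t * nv)
model = f(x) + t * (grad(x) @ v) + 0.5 * t**2 * (hess(x, v) @ v)
print(abs(f(gamma) - model))  # residual should be O(t^3), i.e., tiny
```

The numerical check mirrors the chapter's logic: once a connection (here, the Riemannian connection of the submanifold) supplies the Hessian and the notion of intrinsic acceleration, the second-order Taylor expansion along geodesics follows, and second-order retractions are exactly those that reproduce it.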