Noise is one of the principal factors limiting the performance of communication systems, so it is important to understand its statistical properties. Noise at the input of a receiver degrades the performance of a communication system: the received signal consists of the desired signal plus noise. Because receivers filter the received signal, it is important to be able to characterize the noise at the output of a linear system (i.e., a filter).
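As a minimal numerical sketch of that last point (the filter taps and variance below are illustrative, not from the text): for white noise of variance $\sigma^2$ passed through an FIR filter with impulse response $h$, the output variance is $\sigma^2 \sum_k h[k]^2$.

```python
import numpy as np

# Illustrative sketch: white Gaussian noise with variance sigma2 passed
# through an FIR filter h. For white input, the output variance equals
# sigma2 * sum(h**2) -- a basic characterization of noise out of a
# linear system.
rng = np.random.default_rng(0)
sigma2 = 2.0
n = rng.normal(0.0, np.sqrt(sigma2), size=1_000_000)  # white noise samples

h = np.array([0.25, 0.5, 0.25])      # simple low-pass FIR filter (hypothetical)
y = np.convolve(n, h, mode="valid")  # filtered noise

predicted = sigma2 * np.sum(h**2)    # theoretical output variance
print(predicted, y.var())            # the two should agree closely
```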
This chapter details how to work on several manifolds of practical interest, focusing on embedded submanifolds of linear spaces. It provides two tables which point to Manopt implementations of those manifolds and to the various places in the book where it is explained how to work with products of manifolds. The manifolds detailed in this chapter include Euclidean spaces, unit spheres, the Stiefel manifold (orthonormal matrices), the orthogonal group and the associated group of rotations, the manifold of matrices with a given size and rank, and hyperbolic space in the hyperboloid model. It further discusses geometric tools for optimization on a manifold defined by (regular) constraints $h(x) = 0$ in general. That last section notably makes it possible to connect concepts from Riemannian optimization with classical concepts from constrained optimization in linear spaces, namely, Lagrange multipliers and KKT conditions under the linear independence constraint qualification (LICQ).
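A hypothetical sketch of the simplest case mentioned above, the unit sphere: it is an embedded submanifold defined by the regular constraint $h(x) = x^\top x - 1 = 0$, and the tangent space at $x$ is the null space of $\mathrm{D}h(x)$, namely $\{v : x^\top v = 0\}$. Orthogonal projection onto that tangent space is the basic tool.

```python
import numpy as np

def proj_sphere(x, v):
    """Orthogonal projection of v onto the tangent space of the unit
    sphere at x, i.e. onto {v : x^T v = 0}."""
    return v - (x @ v) * x

rng = np.random.default_rng(1)
x = rng.normal(size=5)
x /= np.linalg.norm(x)          # a point on the unit sphere
v = rng.normal(size=5)          # an arbitrary ambient vector
t = proj_sphere(x, v)
print(x @ t)                    # ~0: t is tangent at x
```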
The main purpose of this chapter is to define and analyze Riemannian gradient descent methods. This family of algorithms aims to minimize real-valued functions (called cost functions) on manifolds. They apply to general manifolds, hence in particular also to embedded submanifolds of linear spaces. The previous chapter provides all necessary geometric tools for that setting. The initial technical steps involve constructing first-order Taylor expansions of the cost function along smooth curves, and identifying necessary optimality conditions (at a solution, the Riemannian gradient must vanish). Then, the chapter presents the algorithm and proposes a worst-case iteration complexity analysis. The main conclusion is that, under a Lipschitz-type assumption on the gradient of the cost function composed with the retraction, the algorithm finds a point with gradient norm smaller than $\varepsilon$ in at most a multiple of $\varepsilon^{-2}$ iterations. The chapter ends with three optional sections: they discuss local convergence rates, detail how to compute gradients in practice, and describe how to check that a gradient is correctly implemented.
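A minimal sketch of the method on a concrete problem (the cost function, step size, and iteration count below are illustrative choices, not from the text): minimizing $f(x) = x^\top A x$ on the unit sphere, whose minimizers are unit eigenvectors for the smallest eigenvalue of $A$. Each iteration follows the negative Riemannian gradient and returns to the manifold with a retraction.

```python
import numpy as np

A = np.diag([1.0, 2.0, 5.0])            # hypothetical symmetric cost matrix

def grad(x):
    # Riemannian gradient of f(x) = x^T A x on the sphere:
    # project the Euclidean gradient 2Ax onto the tangent space at x.
    g = 2 * A @ x
    return g - (x @ g) * x

def retract(x, v):
    # Metric-projection retraction: step in the embedding space, renormalize.
    y = x + v
    return y / np.linalg.norm(y)

rng = np.random.default_rng(2)
x = rng.normal(size=3)
x /= np.linalg.norm(x)                  # random point on the sphere
for _ in range(500):
    x = retract(x, -0.1 * grad(x))      # fixed step size (assumption)
print(x @ A @ x)                        # approaches the smallest eigenvalue, 1.0
```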
These modulation techniques are widely used in practice. We quantify the trade-off between data rate and energy for these techniques and compare performance with the capacity limits discussed in Chapter 1. We begin by discussing MPSK, where the information determines the phase of a sinusoidal signal. Second, we discuss PAM and QAM, in which the amplitude is varied in one and two dimensions, respectively, depending on the data. Third, we discuss orthogonal modulation, in which the bandwidth efficiency is very low but the required energy is also very low.
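As an illustrative sketch of the first of these (the function below is hypothetical, not from the text): an M-PSK constellation places the $M$ symbols at equally spaced phases on a circle, so every symbol has the same energy and the data selects only the phase of the transmitted sinusoid.

```python
import numpy as np

def mpsk(M, Es=1.0):
    """Return the M-PSK constellation as complex baseband symbols with
    symbol energy Es, at phases 2*pi*k/M for k = 0, ..., M-1."""
    k = np.arange(M)
    return np.sqrt(Es) * np.exp(2j * np.pi * k / M)

pts = mpsk(8)
print(np.abs(pts))                          # all symbols have unit energy
print(np.angle(pts[1]) - np.angle(pts[0]))  # phase spacing 2*pi/8
```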
In this chapter we first discuss the relationship between transmitted signals and received signals. There are three effects of the propagation medium on the transmitted signals: path loss, shadowing, and multipath fading. Path loss refers to the relation between the average received power and the transmitted power as a function of distance. Shadowing refers to the situation where buildings or other objects might block the line of sight between the transmitter and receiver.
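A hypothetical sketch of how the first two effects are commonly modeled (the model form and all parameter values are illustrative assumptions, not from the text): a log-distance path-loss law in which average received power falls off as $d^{-n}$, with shadowing added as a zero-mean Gaussian term in dB (log-normal shadowing).

```python
import numpy as np

def received_power_dbm(pt_dbm, d, d0=1.0, pl0_db=40.0, n=3.5,
                       sigma_db=0.0, rng=None):
    """Received power in dBm at distance d (same units as the reference
    distance d0), with path-loss exponent n and log-normal shadowing of
    standard deviation sigma_db (all values hypothetical)."""
    shadow = 0.0 if sigma_db == 0.0 else rng.normal(0.0, sigma_db)
    return pt_dbm - pl0_db - 10 * n * np.log10(d / d0) + shadow

# Deterministic path loss only: 30 - 40 - 35*log10(100) = -80 dBm.
print(received_power_dbm(30.0, 100.0))
```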
The optimization algorithms from Chapters 4 and 6 require only rather simple tools from Riemannian geometry, all covered in Chapters 3 and 5 for embedded submanifolds, then generalized in Chapter 8. This chapter provides additional geometric tools to gain deeper insight and help develop more sophisticated algorithms. It opens with the Riemannian distance, then discusses exponential maps as retractions which generate geodesics. This is paired with a careful discussion of what it means to invert the exponential map. Then, the chapter defines parallel transport to compare tangent vectors in different tangent spaces. Later, the chapter defines transporters, which can be seen as a relaxed type of parallel transport. Before that, we take a deep dive into the notion of Lipschitz continuity for gradients and Hessians on Riemannian manifolds, aiming to connect these concepts with the Lipschitz-type regularity assumptions required to analyze gradient descent and trust regions. The chapter closes with a discussion of how to approximate Riemannian Hessians with finite differences of gradients via transporters, and with an introduction to the differentiation of tensor fields of all orders.
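A sketch of that closing idea under concrete assumptions (unit sphere, cost $f(x) = x^\top A x$, projection-based transporter; all choices illustrative): the Hessian-vector product is approximated by a finite difference of gradients, with the gradient at the perturbed point transported back to the tangent space at $x$ by orthogonal projection.

```python
import numpy as np

A = np.diag([1.0, 2.0, 5.0])                # hypothetical symmetric matrix

def proj(x, v):
    return v - (x @ v) * x                  # projector onto T_x sphere

def rgrad(x):
    return proj(x, 2 * A @ x)               # Riemannian gradient of x^T A x

def rhess_fd(x, v, t=1e-6):
    """Finite-difference approximation of the Riemannian Hessian of
    f(x) = x^T A x along the tangent vector v, using the projection
    transporter to bring the perturbed gradient back to T_x."""
    y = x + t * v
    y /= np.linalg.norm(y)                  # retraction step along v
    return proj(x, rgrad(y) - rgrad(x)) / t

rng = np.random.default_rng(3)
x = rng.normal(size=3); x /= np.linalg.norm(x)
v = proj(x, rng.normal(size=3))             # a tangent vector at x
exact = 2 * proj(x, A @ v) - 2 * (x @ A @ x) * v   # analytic Hessian
print(np.linalg.norm(rhess_fd(x, v) - exact))      # small truncation error
```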