Some QKD protocols, as I will detail in Chapter 11, produce Gaussian key elements. The reconciliation methods of the previous chapter are therefore not suited to this case. In this chapter, we build upon the previous techniques to treat the case of continuous-variable key elements or, more generally, the case of non-binary key elements.
In the first two sections, I describe two techniques to process non-binary key elements, namely sliced error correction and multistage soft decoding. Then, I conclude the chapter by giving more specific details on the reconciliation of Gaussian key elements.
Sliced error correction
Sliced error correction (SEC) is a generic reconciliation protocol that corrects strings of non-binary elements using binary reconciliation protocols as primitives [173]. The purpose of sliced error correction is to start from a list of correlated values and to give, with high probability, equal binary strings to Claude and Dominique. The underlying idea is to convert Claude's and Dominique's values into strings of bits, to apply a binary correction protocol (BCP) on each of them and to take advantage of all available information to minimize the number of exchanged reconciliation messages. It enables Claude and Dominique to reconcile a wide variety of correlated variables X and Y while relying on BCPs that are optimized to correct errors modeled by a binary symmetric channel (BSC).
An important application of sliced error correction is to correct correlated Gaussian random variables, namely X ∼ N(0, Σ) and Y = X + ε with ε ∼ N(0, σ). This important particular case is needed for QKD protocols that use a Gaussian modulation of Gaussian states, as described in Chapter 11.
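The slicing step can be sketched numerically. The parameters below (3 slices over [−3, 3], unit-variance X, noise standard deviation 0.1) are illustrative assumptions, not values taken from the text; the sketch shows why SEC runs a separate binary correction protocol per slice, since the slices face very different error rates:

```python
import numpy as np

rng = np.random.default_rng(0)

def slice_bits(values, m=3, lo=-3.0, hi=3.0):
    """Quantize each value into an m-bit bin index; bit i is 'slice' i."""
    step = (hi - lo) / 2**m
    idx = np.clip(((values - lo) / step).astype(int), 0, 2**m - 1)
    return [(idx >> i) & 1 for i in range(m)]

# Claude's values and Dominique's noisy copies (hypothetical parameters).
x = rng.normal(0.0, 1.0, 100_000)
y = x + rng.normal(0.0, 0.1, 100_000)

cx, dy = slice_bits(x), slice_bits(y)
err = [float(np.mean(cx[i] != dy[i])) for i in range(3)]
# The least significant slices disagree far more often than the most
# significant one; SEC therefore corrects slices in order, reusing
# already-corrected lower slices as side information for the higher ones.
```

Each per-slice disagreement rate behaves like a binary symmetric channel, which is exactly the error model the underlying BCPs are optimized for.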
This chapter develops more tools for working with random variables. The probability generating function is the key tool for working with sums of independent, nonnegative integer-valued random variables. When random variables are only uncorrelated, we can work with averages (normalized sums) by using the weak law of large numbers. We emphasize that the weak law makes the connection between probability theory and the everyday practice of using averages of observations to estimate probabilities of real-world measurements. The last two sections introduce conditional probability and conditional expectation. The three important tools here are the law of total probability, the law of substitution, and, for independent random variables, “dropping the conditioning.”
The foregoing concepts are developed here for discrete random variables, but they will all be extended to more general settings in later chapters.
Probability generating functions
In many problems we have a sum of independent random variables, and we would like to know the probability mass function of their sum. For example, in an optical communication system, the received signal might be Y = X + W, where X is the number of photoelectrons due to incident light on a photodetector, and W is the number of electrons due to dark current noise in the detector. An important tool for solving these kinds of problems is the probability generating function. The name derives from the fact that it can be used to compute the probability mass function.
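A small numerical sketch of this idea, with hypothetical Poisson rates for the photoelectron count X and the dark-current count W (the rates are illustrative assumptions, not from the text): since the PGF of a sum of independent variables is the product of the individual PGFs, multiplying the two PGF power series, i.e., convolving the two probability mass functions, yields the pmf of Y = X + W.

```python
import numpy as np

# Hypothetical rates for the optical-detection example:
# X ~ Poisson(lam) photoelectrons, W ~ Poisson(mu) dark-current electrons.
lam, mu = 2.0, 0.5
n = 30  # truncation point for the pmfs

def poisson_pmf(rate, n):
    """pmf exp(-rate) * rate**k / k! for k = 0, ..., n-1."""
    k = np.arange(n)
    log_fact = np.cumsum(np.log(np.maximum(k, 1)))  # log(k!)
    return np.exp(-rate + k * np.log(rate) - log_fact)

pX = poisson_pmf(lam, n)
pW = poisson_pmf(mu, n)

# Product of PGFs  <=>  convolution of pmfs.
pY = np.convolve(pX, pW)[:n]

# For Poisson summands this must agree with Poisson(lam + mu).
pY_direct = poisson_pmf(lam + mu, n)
```

The agreement of `pY` with `pY_direct` is the familiar fact that the sum of independent Poisson random variables is again Poisson, which drops out immediately from multiplying the PGFs.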
Why do electrical and computer engineers need to study probability?
Probability theory provides powerful tools to explain, model, analyze, and design technology developed by electrical and computer engineers. Here are a few applications.
Signal processing. My own interest in the subject arose when I was an undergraduate taking the required course in probability for electrical engineers. We considered the situation shown in Figure 1.1. To determine the presence of an aircraft, a known radar pulse v(t) is sent out. If there are no objects in range of the radar, the radar's amplifiers produce only a noise waveform, denoted by Xt. If there is an object in range, the reflected radar pulse plus noise is produced. The overall goal is to decide whether the received waveform is noise only or signal plus noise. To get an idea of how difficult this can be, consider the signal plus noise waveform shown at the top in Figure 1.2. Our class addressed the subproblem of designing an optimal linear system to process the received waveform so as to make the presence of the signal more obvious. We learned that the optimal transfer function is given by the matched filter. If the signal at the top in Figure 1.2 is processed by the appropriate matched filter, we get the output shown at the bottom in Figure 1.2. You will study the matched filter in Chapter 10.
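The effect can be sketched in discrete time. The pulse shape, noise level, and pulse location below are illustrative stand-ins for the v(t) and Xt of Figures 1.1 and 1.2, not the book's actual waveforms; the point is only that correlating the received waveform with the known pulse (the matched filter) concentrates the signal energy into a visible peak:

```python
import numpy as np

rng = np.random.default_rng(1)

# Illustrative known pulse: a windowed sinusoid (a stand-in for v(t)).
v = 2.0 * np.sin(2 * np.pi * np.arange(64) / 16) * np.hanning(64)

# Received waveform: white Gaussian noise with the pulse buried at
# an assumed location (samples 100..163).
y = rng.normal(0.0, 1.0, 256)
y[100:164] += v

# The matched filter's impulse response is the time-reversed pulse,
# so filtering the received waveform is correlation with v.
out = np.correlate(y, v, mode="valid")
peak = int(np.argmax(np.abs(out)))  # should land near the pulse location
```

The output peaks near sample 100, where the pulse begins, even though the pulse itself is hard to see in the raw waveform.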
In Chapters 2 and 3, the only random variables we considered specifically were discrete ones such as the Bernoulli, binomial, Poisson, and geometric. In this chapter we consider a class of random variables allowed to take a continuum of values. These random variables are called continuous random variables and are introduced in Section 4.1. Continuous random variables are important models for integrator output voltages in communication receivers, file download times on the Internet, velocity and position of an airliner on radar, etc. Expectation and moments of continuous random variables are computed in Section 4.2. Section 4.3 develops the concepts of moment generating function (Laplace transform) and characteristic function (Fourier transform). In Section 4.4 expectation of multiple random variables is considered. Applications of characteristic functions to sums of independent random variables are illustrated. In Section 4.5 the Markov inequality, the Chebyshev inequality, and the Chernoff bound illustrate simple techniques for bounding probabilities in terms of expectations.
Densities and probabilities
Introduction
Suppose that a random voltage in the range [0,1) is applied to a voltmeter with a one-digit display. Then the display output can be modeled by a discrete random variable Y taking values .0, .1, .2, …, .9 with P(Y = k/10) = 1/10 for k = 0, …, 9.
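This model is easy to check by simulation. A minimal sketch, assuming the applied voltage is uniform on [0, 1) and the display truncates to one digit (the truncation rule is an assumption consistent with the example):

```python
import numpy as np

rng = np.random.default_rng(2)
u = rng.uniform(0.0, 1.0, 100_000)  # random voltage in [0, 1)
y = np.floor(10 * u) / 10           # one-digit display reading

# Empirical frequencies of the ten display values .0, .1, ..., .9;
# each should be close to 1/10.
freqs = np.array([np.mean(y == k / 10) for k in range(10)])
```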
This book is a primary text for graduate-level courses in probability and random processes that are typically offered in electrical and computer engineering departments. The text starts from first principles and contains more than enough material for a two-semester sequence. The level of the text varies from advanced undergraduate to graduate as the material progresses. The principal prerequisite is the usual undergraduate electrical and computer engineering course on signals and systems, e.g., Haykin and Van Veen or Oppenheim and Willsky (see the Bibliography at the end of the book). However, later chapters that deal with random vectors assume some familiarity with linear algebra; e.g., determinants and matrix inverses.
How to use the book
A first course. In a course that assumes at most a modest background in probability, the core of the offering would include Chapters 1–5 and 7. These cover the basics of probability and discrete and continuous random variables. As the chapter dependencies graph on the preceding page indicates, there is considerable flexibility in the selection and ordering of additional material as the instructor sees fit.
A second course. In a course that assumes a solid background in the basics of probability and discrete and continuous random variables, the material in Chapters 1–5 and 7 can be reviewed quickly.