In Chapters 2 and 3, the only specific random variables we considered were discrete ones such as the Bernoulli, binomial, Poisson, and geometric. In this chapter we consider a class of random variables that can take a continuum of values. These random variables are called continuous random variables and are introduced in Section 4.1. Continuous random variables are important models for integrator output voltages in communication receivers, file download times on the Internet, velocity and position of an airliner on radar, etc. Expectation and moments of continuous random variables are computed in Section 4.2. Section 4.3 develops the concepts of moment generating function (Laplace transform) and characteristic function (Fourier transform). In Section 4.4, expectation of multiple random variables is considered, and applications of characteristic functions to sums of independent random variables are illustrated. In Section 4.5 the Markov inequality, the Chebyshev inequality, and the Chernoff bound illustrate simple techniques for bounding probabilities in terms of expectations.
Densities and probabilities
Introduction
Suppose that a random voltage uniformly distributed over the range [0,1) is applied to a voltmeter with a one-digit display. The display truncates the voltage to one decimal place, so its output can be modeled by a discrete random variable Y taking the values .0, .1, .2, …, .9. Since the display reads k/10 exactly when the voltage falls in the interval [k/10, (k+1)/10), which has length 1/10, we have P(Y = k/10) = 1/10 for k = 0, …, 9.
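As a quick illustration (not part of the text), a short simulation can check that truncating a uniform voltage to one digit yields an approximately uniform display; the sample size N, the random seed, and the variable names below are arbitrary choices for this sketch.

    import numpy as np

    rng = np.random.default_rng(0)
    N = 100_000                      # number of simulated voltage readings (arbitrary)

    V = rng.uniform(0.0, 1.0, N)     # random voltage uniformly distributed on [0, 1)
    Y = np.floor(10 * V) / 10        # one-digit display: truncate to one decimal place

    # Empirical estimate of P(Y = k/10) for k = 0, ..., 9; each should be close to 1/10.
    for k in range(10):
        print(f"P(Y = {k/10:.1f}) is approximately {np.mean(np.isclose(Y, k/10)):.3f}")

Running this prints ten relative frequencies, each near 0.1, in agreement with the probabilities assigned to Y above.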