Book contents
- Frontmatter
- Contents
- Chapter dependencies
- Preface
- 1 Introduction to probability
- 2 Introduction to discrete random variables
- 3 More about discrete random variables
- 4 Continuous random variables
- 5 Cumulative distribution functions and their applications
- 6 Statistics
- 7 Bivariate random variables
- 8 Introduction to random vectors
- 9 Gaussian random vectors
- 10 Introduction to random processes
- 11 Advanced concepts in random processes
- 12 Introduction to Markov chains
- 13 Mean convergence and applications
- 14 Other modes of convergence
- 15 Self similarity and long-range dependence
- Bibliography
- Index
4 - Continuous random variables
Published online by Cambridge University Press: 05 June 2012
Summary
In Chapters 2 and 3, the only random variables we considered specifically were discrete ones such as the Bernoulli, binomial, Poisson, and geometric. In this chapter we consider a class of random variables that are allowed to take a continuum of values. These random variables, called continuous random variables, are introduced in Section 4.1. They are important models for integrator output voltages in communication receivers, file download times on the Internet, the velocity and position of an airliner on radar, and so on. Expectation and moments of continuous random variables are computed in Section 4.2. Section 4.3 develops the concepts of the moment generating function (Laplace transform) and the characteristic function (Fourier transform). Section 4.4 considers expectations of multiple random variables and illustrates applications of characteristic functions to sums of independent random variables. Section 4.5 presents the Markov inequality, the Chebyshev inequality, and the Chernoff bound, which are simple techniques for bounding probabilities in terms of expectations.
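As a preview of the probability bounds mentioned for Section 4.5, the sketch below compares the Chebyshev bound P(|X − m| ≥ a) ≤ var(X)/a² with an empirical tail probability for a uniform random variable on [0, 1). The sample size, seed, and threshold a = 0.4 are illustrative assumptions, not values from the text.

```python
import random

random.seed(1)
N = 200_000
mean, var = 0.5, 1 / 12   # mean and variance of a uniform RV on [0, 1)
a = 0.4                   # illustrative deviation threshold

# Empirical estimate of the tail probability P(|X - mean| >= a)
hits = sum(1 for _ in range(N) if abs(random.random() - mean) >= a)
empirical = hits / N

# Chebyshev upper bound: var / a^2
chebyshev = var / a**2

print(f"empirical ≈ {empirical:.3f}, Chebyshev bound = {chebyshev:.3f}")
```

For the uniform case the exact tail probability is 0.2, so the bound (about 0.52) holds but is loose, which is typical of Chebyshev's inequality: it uses only the mean and variance, not the full distribution.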
Densities and probabilities
Introduction
Suppose that a random voltage in the range [0,1) is applied to a voltmeter with a one-digit display. Then the display output can be modeled by a discrete random variable Y taking values .0, .1, .2, …, .9 with P(Y = k/10) = 1/10 for k = 0, …, 9.
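The voltmeter model above can be checked by simulation: draw many uniform voltages on [0, 1), quantize each to one digit, and verify that every display value k/10 occurs with empirical probability close to 1/10. This is a minimal sketch; the sample size and seed are illustrative assumptions.

```python
import random
from collections import Counter

random.seed(0)
N = 100_000

# V is a uniform random voltage on [0, 1); the one-digit display
# shows Y = floor(10 * V) / 10, taking values .0, .1, ..., .9.
counts = Counter(int(10 * random.random()) / 10 for _ in range(N))

for k in range(10):
    print(f"P(Y = {k / 10:.1f}) ≈ {counts[k / 10] / N:.3f}")
```

Each of the ten empirical frequencies comes out near 0.1, matching P(Y = k/10) = 1/10 for k = 0, ..., 9.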
- Publisher: Cambridge University Press. Print publication year: 2006