
Chapter 7: Conditioning by Random Variables

pp. 261-301

Authors

Vrije Universiteit, Amsterdam

Summary

In Chapter 2, conditional probabilities were introduced by conditioning upon the occurrence of an event B of nonzero probability. In applications, this event B is often of the form Y = b for a discrete random variable Y. However, when the random variable Y is continuous, the condition Y = b has probability zero for any number b. In this chapter we develop techniques for handling a condition given by the observed value of a continuous random variable. You will see that the conditional probability density function of X given Y = b for continuous random variables is analogous to the conditional probability mass function of X given Y = b for discrete random variables. The conditional distribution of X given Y = b enables us to define the natural concept of the conditional expectation of X given Y = b. This concept has an intuitive interpretation and is of great importance. In statistical applications, it is often more convenient to work with conditional expectations rather than the correlation coefficient when measuring the strength of the relationship between two dependent random variables. In applied probability problems, the computation of the expected value of a random variable X is often greatly simplified by conditioning on an appropriately chosen random variable Y: learning the value of Y provides additional information about X, which often makes the conditional expectation of X given Y = b easy to compute. Much attention will be paid to the law of conditional probability and the law of conditional expectation, which are extremely useful when solving applied probability problems and will be used, among other things, to solve stochastic optimization problems. The final section explains Bayesian inference for continuous models and gives several statistical applications.
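
As a preview of the two laws mentioned above, they can be written as follows for a continuous conditioning random variable Y with probability density f_Y(y) (standard formulations; the chapter's own notation may differ slightly):

\[
P(A) = \int_{-\infty}^{\infty} P(A \mid Y = y)\, f_Y(y)\, dy
\qquad \text{(law of conditional probability)},
\]
\[
E(X) = \int_{-\infty}^{\infty} E(X \mid Y = y)\, f_Y(y)\, dy
\qquad \text{(law of conditional expectation)}.
\]

For a discrete conditioning variable Y, the integrals are replaced by sums over the possible values of Y.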

Conditional Distributions

Suppose that the random variables X and Y are defined on the same sample space Ω with probability measure P. A basic question for dependent random variables X and Y is: if the observed value of Y is y, what distribution now describes X? We first answer this question for the discrete case.
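
For orientation, the standard definitions read as follows (the chapter's own notation may differ slightly). In the discrete case, the conditional probability mass function of X given Y = y is

\[
P(X = x \mid Y = y) = \frac{P(X = x, Y = y)}{P(Y = y)}
\qquad \text{for any } y \text{ with } P(Y = y) > 0,
\]

and in the continuous case, with joint density f(x, y) and marginal density f_Y(y), the conditional probability density function of X given Y = y is defined as

\[
f_{X \mid Y}(x \mid y) = \frac{f(x, y)}{f_Y(y)}
\qquad \text{for any } y \text{ with } f_Y(y) > 0.
\]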
