THE END OF the previous chapter mentions that simulation has greatly expanded the scope of Bayesian inference. This chapter reviews methods for generating independent samples from probability distributions. The methods discussed here form the basis for the newer methods discussed in Chapter 7 that are capable of dealing with a wide variety of distributions but do not generate independent samples.
All major statistics packages contain routines for generating random variables from such standard distributions as those summarized in Appendix A. The following examples are intended to illustrate methods of generating samples. I do not claim that the algorithms are the best that can be designed, and you should not study the methods in great detail. The goal for the chapter is to present the standard techniques of simulation and explain the kinds of questions that simulated samples can help answer.
Many of the applications discussed can be regarded as attempts to approximate a quantity such as E[g(X)], where X ∼ f(x) but the necessary integral, ∫ g(x)f(x)dx, cannot be computed analytically. This problem includes the computation of expected values (where g(X) = X) and other moments, as well as probabilities such as P(c1 ≤ X ≤ c2), for which you set g(X) = 1(c1 ≤ X ≤ c2), the indicator of that event.
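The idea can be sketched with a small Monte Carlo example. The choices below — a standard normal for f, g(X) = X², and the interval (−1, 1) — are illustrative assumptions, not from the text; the point is only that sample averages of g(x) approximate ∫ g(x)f(x)dx.

```python
import numpy as np

# Approximate E[g(X)] by the sample mean of g over independent draws from f.
# Here f is the standard normal density, an illustrative choice.
rng = np.random.default_rng(0)
n = 100_000
x = rng.standard_normal(n)

# With g(X) = X^2, E[g(X)] is the second moment; for N(0,1) it equals 1.
mean_x2 = np.mean(x**2)

# With g(X) = 1(c1 <= X <= c2), the same average approximates P(c1 <= X <= c2).
c1, c2 = -1.0, 1.0
prob = np.mean((x >= c1) & (x <= c2))

print(mean_x2, prob)
```

Because the draws are independent, the usual law-of-large-numbers and central-limit arguments describe how quickly these averages converge as n grows.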
Probability Integral Transformation Method
The most basic method of generating samples takes advantage of the ability of computers to generate values that can be regarded as drawn independently from a uniform distribution on (0,1), U(0, 1).
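A minimal sketch of the probability integral transformation: if U ∼ U(0, 1) and F is a continuous distribution function with inverse F⁻¹, then F⁻¹(U) has distribution F. The exponential distribution used below is an illustrative assumption, chosen because its inverse CDF has a closed form.

```python
import numpy as np

# Probability integral transformation: transform U(0,1) draws through the
# inverse CDF F^{-1} to obtain draws from F.
rng = np.random.default_rng(1)
rate = 2.0  # rate parameter of an Exponential distribution (illustrative)
u = rng.uniform(size=100_000)

# For Exponential(rate), F(x) = 1 - exp(-rate * x), so
# F^{-1}(u) = -log(1 - u) / rate.
x = -np.log1p(-u) / rate

print(x.mean())  # should be close to the exponential mean, 1 / rate
```

The method applies to any distribution whose inverse CDF can be evaluated; when F⁻¹ has no closed form, a numerical inverse can be substituted at some computational cost.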