THE PREVIOUS CHAPTER discussed methods that generate independent observations from standard probability distributions. But you still have the problem of what to do when faced with a nonstandard distribution, such as the posterior distribution of the parameters of the conditionally conjugate linear regression model. Although the methods previously described can, in principle, deal with nonstandard distributions, doing so presents major practical difficulties. In particular, they are not easy to implement in the multivariate case, and finding a suitable importance function for the importance sampling algorithm or a majorizing density for the AR algorithm may require a very large investment of time whenever a new nonstandard distribution is encountered.
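To make the practical difficulty concrete, the following is a minimal sketch (not from the chapter) of the AR (accept-reject) method for a simple one-dimensional target: an unnormalized Beta(3, 2) kernel with a uniform proposal. Here the majorizing constant M is easy to find analytically; the point of the passage is that for a multivariate, nonstandard posterior, finding such a majorizing density is exactly the step that becomes costly.

```python
import random

def accept_reject(f, M, n):
    """Accept-reject sampling from an unnormalized density f on [0, 1],
    using a Uniform(0, 1) proposal and a majorizing constant M >= max f."""
    draws = []
    while len(draws) < n:
        x = random.random()          # draw from the proposal
        u = random.random()
        if u * M <= f(x):            # accept with probability f(x) / M
            draws.append(x)
    return draws

random.seed(1)
f = lambda x: x**2 * (1 - x)         # unnormalized Beta(3, 2) kernel
M = 4 / 27                           # max of f, attained at x = 2/3
samples = accept_reject(f, M, n=10_000)
mean = sum(samples) / len(samples)   # Beta(3, 2) has mean 3/5 = 0.6
```

The sampler only works because M bounds f everywhere; when the target is multivariate and nonstandard, no such bound may be readily available, which is the bottleneck the text describes.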
These considerations impeded the progress of Bayesian statistics until the development of Markov chain Monte Carlo (MCMC) simulation, a method that became known and available to statisticians in the early 1990s. MCMC methods have proven extremely effective and have greatly increased the scope of Bayesian methods. Although a disadvantage of these methods is that they do not provide independent samples, they have the great advantage of flexibility: they can be implemented for a great variety of distributions without having to undertake an intensive analysis of the special features of the distribution. Note, however, that an analysis of the distribution may shed light on the best algorithm to use when more than one is available.
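The flexibility claimed for MCMC can be illustrated with a minimal random-walk Metropolis sketch (an assumption on my part, not the chapter's own example; the algorithm itself is developed in the next chapter). Note that the sampler needs only the target's log density up to an additive constant, which is why it applies to posteriors whose normalizing constants are unknown.

```python
import math
import random

def random_walk_metropolis(log_f, x0, scale, n):
    """Random-walk Metropolis: a basic MCMC sampler. It requires only
    log_f, the target's log density up to an additive constant."""
    x, chain = x0, []
    for _ in range(n):
        prop = x + random.gauss(0, scale)        # symmetric proposal
        log_r = log_f(prop) - log_f(x)           # log acceptance ratio
        if log_r >= 0 or random.random() < math.exp(log_r):
            x = prop                             # accept the proposal
        chain.append(x)                          # otherwise keep x
    return chain

random.seed(0)
log_f = lambda x: -0.5 * x * x                   # standard normal, up to a constant
chain = random_walk_metropolis(log_f, x0=0.0, scale=2.4, n=50_000)
mean = sum(chain) / len(chain)
var = sum((x - mean) ** 2 for x in chain) / len(chain)
```

Successive draws are correlated, illustrating the disadvantage noted above: the chain does not provide independent samples, so more draws are needed for a given Monte Carlo accuracy than with independent sampling.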
Because these methods rely on Markov chains, a type of stochastic process, this chapter presents some basic concepts of the theory, and the next chapter utilizes these concepts to explain MCMC methods.