Book contents
- Frontmatter
- Dedication
- Contents
- List of Illustrations
- List of Tables
- List of Contributors
- Preface
- Part I Introduction to Modeling
- Part II Parameter Estimation
- 3 Basic Parameter Estimation Techniques
- 4 Maximum Likelihood Parameter Estimation
- 5 Combining Information from Multiple Participants
- 6 Bayesian Parameter Estimation
- 7 Bayesian Parameter Estimation
- 8 Bayesian Parameter Estimation
- 9 Multilevel or Hierarchical Modeling
- Part III Model Comparison
- Part IV Models in Psychology
- Appendix A Greek Symbols
- Appendix B Mathematical Terminology
- References
- Index
6 - Bayesian Parameter Estimation
from Part II - Parameter Estimation
Published online by Cambridge University Press: 05 February 2018
Summary
The goal of this chapter is to give the reader a thorough understanding of the principles of Bayesian Parameter Estimation and its application using analytic and numerical methods. Readers interested in the broader background of Bayesian statistics may wish to consult the books by Kruschke (2011), Gelman et al. (2004), or Jaynes (2003), to mention but a few that we find particularly helpful.
What Is Bayesian Inference?
In Chapter 4 we introduced the likelihood function as a means of identifying the most likely value of a parameter in light of the observed data. We also cautioned against confusing the likelihood with a probability. Likelihoods permit relative comparisons between different parameter values – thus allowing us to maximize likelihoods in order to obtain parameter estimates – but they are not suited for estimating absolute probabilities.
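To make that distinction concrete, here is a minimal sketch (ours, not from the chapter; the data of 7 heads in 10 flips are invented for illustration). It evaluates a binomial likelihood across a grid of parameter values: the maximum identifies the best estimate, so relative comparison works, but the area under the curve is not 1, so the likelihood cannot be read as a probability distribution over the parameter.

```python
# Minimal sketch: a likelihood function over a parameter is not a probability
# distribution. Data (7 heads in 10 flips) are invented for illustration.
import numpy as np
from scipy.stats import binom

n, h = 10, 7                       # hypothetical data: 7 heads in 10 flips
theta = np.linspace(0, 1, 10001)   # grid over the coin-bias parameter
lik = binom.pmf(h, n, theta)       # likelihood of each theta given the data

# Relative comparisons are fine: the peak gives the maximum likelihood estimate.
print("Maximum-likelihood estimate:", theta[np.argmax(lik)])   # ~0.7 = h/n

# But the area under the likelihood curve is 1/(n+1), not 1, so the curve
# is not a probability distribution over theta.
area = lik.sum() * (theta[1] - theta[0])   # Riemann-sum approximation
print("Area under the likelihood curve:", area)   # ~0.0909, i.e., 1/11
```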
Some of the most intuitively obvious questions one might ask of parameter estimates therefore require an approach other than maximum likelihood estimation. Suppose we estimate a parameter M from a given data set that we believe represents the capacity of working memory; that is, how many items people can hold in mind at the same time in the face of distraction (e.g., Kane et al., 2005). We need not worry about how exactly this was done, but let's suppose the best estimate turns out to be 3.2. That punctate information by itself tells us relatively little because we know that however cleverly we designed our experiment, there would be some measurement error associated with our single estimate. What we really want to know is the range within which the “true” parameter value likely falls, given our measurement. Ideally, we want information about the probability distribution of that parameter so we can draw more educated conclusions. This requires the use of Bayesian parameter estimation, and we devote the next four chapters to an exploration of Bayesian concepts. We begin by introducing some additional basic facts about conditional probabilities.
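To preview what such a distribution buys us, the following hedged sketch treats the estimate of 3.2 as a noisy measurement of the true capacity M and computes a posterior credible interval via a conjugate normal-normal update. The measurement error (sigma) and the prior (mu0, tau0) are assumed values chosen for illustration, not quantities from the chapter.

```python
# Hedged sketch: a credible interval for the "true" capacity M, assuming a
# normal measurement model and a normal prior. All numbers other than the
# 3.2 estimate are illustrative assumptions.
from scipy.stats import norm

m_hat, sigma = 3.2, 0.5   # observed estimate; assumed measurement error (SD)
mu0, tau0 = 4.0, 2.0      # assumed weakly informative prior on M

# Conjugate normal-normal update: posterior precision is the sum of the
# prior precision and the measurement precision.
post_prec = 1 / tau0**2 + 1 / sigma**2
post_var = 1 / post_prec
post_mean = post_var * (mu0 / tau0**2 + m_hat / sigma**2)

lo, hi = norm.ppf([0.025, 0.975], loc=post_mean, scale=post_var**0.5)
print(f"Posterior mean {post_mean:.2f}, 95% credible interval [{lo:.2f}, {hi:.2f}]")
```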
Analytic Methods for Obtaining Posteriors

Our first example of Bayesian inference involves multiple flips of a slightly biased coin. Our example actually involves a natural process, but we will not reveal that process until later. Our goal is to identify exactly how biased the “coin” is, and we apply Bayes' theorem to estimate the parameter that characterizes the coin's behavior.
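As a preview of the analytic route (the chapter's actual data and prior come later; the numbers below are placeholders), the conjugate beta-binomial case shows the idea in a few lines: with a Beta(a, b) prior on the bias theta and h heads in n flips, Bayes' theorem yields the posterior Beta(a + h, b + n - h) in closed form.

```python
# Placeholder sketch of the analytic (conjugate) update for a coin's bias:
# Beta(a, b) prior + h heads in n flips  ->  Beta(a + h, b + n - h) posterior.
# The prior and data here are illustrative, not the chapter's.
from scipy.stats import beta

a, b = 1, 1      # assumed uniform Beta(1, 1) prior on the bias theta
n, h = 20, 12    # placeholder data: 12 heads in 20 flips

posterior = beta(a + h, b + n - h)
print("Posterior mean:", posterior.mean())                 # (a + h)/(a + b + n)
print("95% credible interval:", posterior.interval(0.95))
```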
Computational Modeling of Cognition and Behavior, pp. 126-145. Publisher: Cambridge University Press. Print publication year: 2018.