18 results
37 - Assessing Uncertainty in Physical Constants
- from PART THREE - REAL-WORLD APPLICATIONS
-
- By Max Henrion, Decision Laboratory, Ask Jeeves!, and Baruch Fischhoff, Department of Social & Decision Sciences, Carnegie Mellon University
- Edited by Thomas Gilovich, Cornell University, New York, Dale Griffin, Stanford University, California, Daniel Kahneman, Princeton University, New Jersey
-
- Book:
- Heuristics and Biases
- Published online:
- 05 June 2012
- Print publication:
- 08 July 2002, pp 666-677
-
Summary
Accurate estimates of the fundamental constants of physics, such as the velocity of light or the rest mass of the electron, are central to the enterprise of science (Pipkin & Ritter, 1983). Like any measurements, they are subject to uncertainties from a variety of sources. Reliable assessments of this uncertainty are needed (1) to compare the precision of different measurements of the same quantity; (2) to assess the accuracy of other quantities derived from them; and, most crucially, (3) to evaluate the consistency of physical theory with the current best measurements. Thus, as Eisenhart pointed out, “A reported value whose accuracy is entirely unknown is worthless” (1968, p. 1201).
It is not unusual to encounter individual examples of errors in measurements of physical quantities that turn out to be disconcertingly large relative to the estimated uncertainty. One well-known case is Millikan's 1912 oil-drop experiment to determine e, whose result turned out 15 years later to be off by 0.6%, or three standard deviations, owing to reliance on a faulty value for the viscosity of air (Cohen, Crowe, & Dumond, 1957). A more recent example concerns measurements of |η+−|, the parameter that measures the degree of violation of CP (charge–conjugation–parity) invariance. The six measurements made prior to 1973 agreed reasonably well, but more accurate measurements since then differ consistently by about seven standard deviations from the pre-1973 mean, a discrepancy that remains unexplained in terms of experimental procedure (Franklin, 1984).
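The notion used in both examples above, a discrepancy expressed in standard deviations, is simply the difference between values divided by the reported standard uncertainty. A minimal sketch, with illustrative numbers rather than Millikan's actual data:

```python
# Sketch of comparison (1): expressing the disagreement between an old
# and a later measurement in units of the old reported standard
# uncertainty.  Numbers are illustrative, not Millikan's actual data.
def discrepancy_in_sigmas(old_value, old_sigma, new_value):
    """How many reported standard deviations separate the old value
    from the later, more accurate one?"""
    return abs(new_value - old_value) / old_sigma

old, sigma = 1.000, 0.002   # old value, with 0.2% reported uncertainty
new = 1.006                 # later value, 0.6% higher
print(discrepancy_in_sigmas(old, sigma, new))  # 3 standard deviations
```

A discrepancy of three or more such units is what makes a case like Millikan's "disconcertingly large": it signals either an unrecognized systematic error or an understated uncertainty.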
10 - Analytica: A Software Tool for Uncertainty Analysis and Model Communication
- Millett Granger Morgan, Carnegie Mellon University, Pennsylvania, Max Henrion, Carnegie Mellon University, Pennsylvania
-
- Book:
- Uncertainty
- Published online:
- 05 June 2012
- Print publication:
- 31 August 1990, pp 257-288
-
Summary
“The real value of computers is for communication, not computation.”
Natasha Kalatin
It goes almost without saying that doing quantitative analysis means creating computer models, especially if you want to treat uncertainty explicitly. It is possible to do back-of-the-envelope calculations for a handful of variables with pencil and paper, using simple error-propagation methods (Section 8.3.5) or probability trees (Section 8.4). Indeed, that is an excellent way to develop your intuitions when you start a project. Any serious uncertainty analysis, however, requires a computer. Fortunately, several commercially available software products make the probabilistic treatment of uncertainty quite straightforward. These tools obviate the need for the analyst to master the intricacies of implementing complicated Monte Carlo codes.
How should we choose and use software for quantitative modeling and risk analysis? At first blush, you might think that quantitative modeling — especially when it treats uncertainty — is primarily a matter of number crunching. Certainly, that is a critical part of it; but if the ultimate purpose of the computation is improved insight into complicated situations, and enhanced understanding of the risks and opportunities, then modeling comprises a great deal more. The treatment of uncertainty (the primary focus of this book) is only one of many objectives.
In Chapter 3, we discussed the goals and motivations for risk and policy analysis, culminating in our suggested “ten commandments for good policy analysis” (Section 3.8).
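The Monte Carlo computation that such tools automate can be sketched in a few lines. The toy model and input distributions below are hypothetical; this illustrates the general idea rather than any particular product:

```python
# Sketch: Monte Carlo propagation of input uncertainty through a toy
# model, the kind of computation these software tools automate.
# The model and input distributions are hypothetical.
import random

random.seed(1)

def model(a, b):
    """Toy model: output is the product of two uncertain inputs."""
    return a * b

N = 100_000
samples = [model(random.gauss(10.0, 1.0), random.gauss(5.0, 0.5))
           for _ in range(N)]

mean = sum(samples) / N
var = sum((s - mean) ** 2 for s in samples) / (N - 1)
print(f"output mean ~ {mean:.2f}, std ~ {var ** 0.5:.2f}")
```

For independent inputs, the simple error-propagation formulas (Section 8.3.5) give roughly the same mean and standard deviation, which makes a useful cross-check on the simulation.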
11 - Large and Complex Models
- Millett Granger Morgan, Carnegie Mellon University, Pennsylvania, Max Henrion, Carnegie Mellon University, Pennsylvania
-
- Book:
- Uncertainty
- Published online:
- 05 June 2012
- Print publication:
- 31 August 1990, pp 289-306
-
Summary
“A theory should be as simple as possible, but no simpler.”
Albert Einstein
Most of the best policy models are small and simple. At least in their essentials, they can be easily understood and described to others. They present relatively modest computational demands for modern digital computers. These attributes flow directly from the requirement that, to be useful, policy models must be understandable and modest enough to be exercised vigorously to explore the implications of alternative assumptions and policies.
Of course, many models are not small and simple. There are some models, especially some science and engineering models, that are large or complex because they need to be. But many more are large or complex because their authors gave too little thought to why and how they were being built and how they would be used.
Policy analysts occasionally find themselves confronted with large and complex models. Sometimes the problem is to use the results or insights gained from a large science or engineering model in a policy analysis. More often the problem involves large and complex models someone wants to use directly in policy analysis. This chapter briefly explores some of the special issues that arise in such situations.
What Are “Large” and “Complex” Models?
Large models are models requiring large amounts of human, computational, or other resources in their construction and operation.
1 - Introduction
- Millett Granger Morgan, Carnegie Mellon University, Pennsylvania, Max Henrion, Carnegie Mellon University, Pennsylvania
-
- Book:
- Uncertainty
- Published online:
- 05 June 2012
- Print publication:
- 31 August 1990, pp 1-5
-
Summary
To know one's ignorance is the best part of knowledge.
Lao Tzu, The Tao, no. 71
Life is full of uncertainties. Most of us have learned to live comfortably with day-to-day uncertainties and to make choices and decisions in their presence. We have evolved cognitive heuristics and developed strategies, technologies, and institutions such as weather reports, pocket-sized raincoats, and insurance to accommodate or compensate for the effects of uncertainty. Looked at with care, these heuristics and strategies do not always perform as well as we would like (Dawes, 1988). When our cognitive processes for dealing with uncertainty introduce error or bias into our judgments, we are often unable to detect the fact. When things go seriously wrong, we may not be around to learn the lesson – or we may still be unable to detect that the problem came from faulty processing of uncertain information. Thus, we muddle through – often doing quite well, occasionally getting into serious trouble.
Of course, uncertainty is not limited to our private lives. It also occurs in larger and more public situations. Frequently in public discussion, policy analysis, regulatory decision making and other contexts, we proceed as if we understand and can predict the world precisely. While a moment's reflection is sufficient to persuade anyone that this is not true, a number of political, behavioral, and analytical factors combine to promote the continuation of this practice.
Preface
- Millett Granger Morgan, Carnegie Mellon University, Pennsylvania, Max Henrion, Carnegie Mellon University, Pennsylvania
-
- Book:
- Uncertainty
- Published online:
- 05 June 2012
- Print publication:
- 31 August 1990, pp xi-xii
-
Summary
To people trained in the physical sciences, dealing with uncertainty is almost second nature. Serious physical scientists would not report an experimental result without an associated estimate of uncertainty, nor would they design a new experiment without giving careful attention to uncertainties. Thus, when we first left the physical sciences and started doing policy analysis, we simply did what came naturally. We worried about uncertainty. It didn't take long to discover that only a handful of other analysts (most of them trained in decision analysis or engineering) shared our concern. It took only a little longer to understand that, with its enormous inherent uncertainties, the field of policy analysis needs to be concerned with the characterization and analysis of uncertainty to an extent that far exceeds the need in the physical sciences. We set out to do such analysis, and along the way to build computer-based tools which would make the job easier. In the process, we discovered that what really matters is the training and philosophy of the analysts who use these tools. Through the doctoral program in the Department of Engineering and Public Policy at Carnegie Mellon University we have invested heavily in such education.
Today, while the adequate treatment of uncertainty in policy analysis is still the exception, not the rule, the exceptions are becoming more frequent.
Index
- Millett Granger Morgan, Carnegie Mellon University, Pennsylvania, Max Henrion, Carnegie Mellon University, Pennsylvania
-
- Book:
- Uncertainty
- Published online:
- 05 June 2012
- Print publication:
- 31 August 1990, pp 325-332
-
Contents
- Millett Granger Morgan, Carnegie Mellon University, Pennsylvania, Max Henrion, Carnegie Mellon University, Pennsylvania
-
- Book:
- Uncertainty
- Published online:
- 05 June 2012
- Print publication:
- 31 August 1990, pp v-x
-
4 - The Nature and Sources of Uncertainty
- Millett Granger Morgan, Carnegie Mellon University, Pennsylvania, Max Henrion, Carnegie Mellon University, Pennsylvania
-
- Book:
- Uncertainty
- Published online:
- 05 June 2012
- Print publication:
- 31 August 1990, pp 47-72
-
Summary
Probability does not exist.
Bruno de Finetti, preface, The Theory of Probability
Introduction
“Uncertainty” is a capacious term, used to encompass a multiplicity of concepts. Uncertainty may arise because of incomplete information – what will be the U. S. defense budget in the year 2050? – or because of disagreement between information sources – what was the 1987 Soviet defense budget? Uncertainty may arise from linguistic imprecision – what exactly is meant by “The river is wide”? It may refer to variability – what is the flow rate of the Ohio River? Uncertainty may be about a quantity – the slope of a linear dose-response function – or about the structure of a model – the shape of a dose-response function. Even where we have complete information in principle, we may be uncertain because of simplifications and approximations introduced to make analyzing the information cognitively or computationally more tractable. As well as being uncertain about what is the case in the external world, we may be uncertain about what we like, that is, about our preferences, and uncertain about what to do about it, that is, about our decisions. Very possibly, we may even be uncertain about our degree of uncertainty. The variety of types and sources of uncertainty, along with the lack of agreed terminology, can generate considerable confusion.
7 - Performing Probability Assessment
- Millett Granger Morgan, Carnegie Mellon University, Pennsylvania, Max Henrion, Carnegie Mellon University, Pennsylvania
-
- Book:
- Uncertainty
- Published online:
- 05 June 2012
- Print publication:
- 31 August 1990, pp 141-171
-
Summary
It sounded quite a sensible voice, but it just said, “Two to the power of one hundred thousand to one against and falling,” and that was all.
Ford skidded down a beam of light and spun around but could see nothing he could seriously believe in.
“What was that voice?” shouted Arthur.
“I don't know,” yelled Ford, “I don't know. It sounded like a measurement of probability.”
“Probability? What do you mean?”
“Probability. You know, like two to one, three to one, five to four against. It said two to the power of one hundred thousand to one against. That's pretty improbable, you know.”
A million-gallon vat of custard upended itself over them without warning.
“But what does it mean?” cried Arthur.
“What, the custard?”
“No, the measurement of improbability?”
“I don't know. I don't know at all.”
Douglas Adams, The Hitchhiker's Guide to the Galaxy (Harmony Books, New York)
The preceding chapter has clearly indicated that understanding of human judgment under uncertainty is still very incomplete. Although it is possible to identify some things one should and should not do in eliciting subjective expert judgments, many aspects of the design of an elicitation protocol must be dealt with as a matter of judgment and taste. To give readers some appreciation of the range of approaches that analysts have adopted, we begin with a fairly detailed description of elicitation procedures that have been developed and used by three different groups working in somewhat different contexts.
2 - Recent Milestones
- Millett Granger Morgan, Carnegie Mellon University, Pennsylvania, Max Henrion, Carnegie Mellon University, Pennsylvania
-
- Book:
- Uncertainty
- Published online:
- 05 June 2012
- Print publication:
- 31 August 1990, pp 6-15
-
Summary
Probabilities direct the conduct of the wise man.
Cicero, De Natura Deorum, Book 1, chap. 5, sec. 12
Although a considerable theoretical literature and a number of small-scale applications of techniques for dealing with uncertainty in policy analysis and policy-focused research have been around for a couple of decades, larger applications to major policy problems are a more recent phenomenon. Because many of the ideas and techniques involved are new, even to members of the technical community, their introduction into policy circles has been uneven and accompanied by a variety of mistakes and false starts. Nevertheless, there have now been a number of important applications, and several U. S. federal agencies have become seriously committed to the continued development and use of these techniques. For readers unfamiliar with these developments, we briefly motivate the discussions that follow with three case examples that involve techniques for incorporating an explicit treatment of uncertainty in (1) estimates of the safety of light-water nuclear reactors; (2) the regulatory analysis of common (“criteria”) air pollutants; and (3) estimates of the probable impacts on the ozone layer of continued release of chlorofluorocarbons.
Reactor Safety
One of the earliest large-scale studies to employ a formal treatment of uncertainty was the Reactor Safety Study, NUREG-75/014 (WASH-1400), generally known as the “Rasmussen report” (Rasmussen et al., 1975).
6 - Human Judgment about and with Uncertainty
- Millett Granger Morgan, Carnegie Mellon University, Pennsylvania, Max Henrion, Carnegie Mellon University, Pennsylvania
-
- Book:
- Uncertainty
- Published online:
- 05 June 2012
- Print publication:
- 31 August 1990, pp 102-140
-
Summary
We dance around in a ring and suppose,
But the secret sits in the middle and knows.
Robert Frost
When the value of an uncertain quantity is needed in policy analysis, and limits in data or understanding preclude the use of conventional statistical techniques to produce probabilistic estimates, about the only remaining option is to ask experts for their best professional judgment. The past twenty years have witnessed remarkable progress in understanding how both experts and laypersons make judgments that involve uncertainty. Much of this new knowledge is directly relevant to problems encountered in quantitative policy analysis, especially to the elicitation of subjective probability distributions from experts.
We begin this chapter with a brief look at the psychology of judgment under uncertainty. We then turn to a somewhat more detailed review of experimental findings on the psychology, and some of the mechanics, of probability assessment. Most findings reported involve “encyclopedia”-type questions posed to nonexpert subjects. We close the chapter by examining the question “are experts different?” We defer until Chapter 7 a discussion of practical protocols for probability assessment.
The Psychology of Judgment under Uncertainty
When, as experts or laypersons, we think about and make judgments in the presence of uncertainty, we make use of a set of heuristic procedures. These procedures serve us well in many circumstances (Nisbett and Ross, 1980).
9 - The Graphic Communication of Uncertainty
- Millett Granger Morgan, Carnegie Mellon University, Pennsylvania, Max Henrion, Carnegie Mellon University, Pennsylvania
-
- Book:
- Uncertainty
- Published online:
- 05 June 2012
- Print publication:
- 31 August 1990, pp 220-256
-
Summary
“The purpose of computing is insight, not numbers.”
Richard Hamming
Introduction
If, to paraphrase Richard Hamming's remark about computing, the goal of policy analysis is insight, not numbers, then clearly one of the most important challenges of policy analysis is to communicate the insights it provides to those who need them. Such insights can include an appreciation of the overall degree of uncertainty about the “bottom line” conclusions, an understanding of which sources of uncertainty and which modeling assumptions are critical to those conclusions and which are not, and an understanding of the extent to which plausible alternative assumptions can change the conclusions that are reached. The insights obtained will ultimately be qualitative in nature, even if the models they derive from are quantitative. This means it is incumbent on the analyst to find ways to present quantitative results in a manner that will most clearly communicate the information they contain and aid users in developing the appropriate qualitative insights. Most experienced analysts believe that graphical techniques play an indispensable role in this process, yet the use of graphics to communicate uncertain information has been the focus of remarkably little attention. This chapter is concerned with exploring some of the alternatives that are available for graphic presentation of uncertain quantitative information, and some of the necessary tradeoffs between simplicity and sophistication, particularly in choosing the dimensionality of information to present.
Frontmatter
- Millett Granger Morgan, Carnegie Mellon University, Pennsylvania, Max Henrion, Carnegie Mellon University, Pennsylvania
-
- Book:
- Uncertainty
- Published online:
- 05 June 2012
- Print publication:
- 31 August 1990, pp i-iv
-
Uncertainty
- A Guide to Dealing with Uncertainty in Quantitative Risk and Policy Analysis
- Millett Granger Morgan, Max Henrion
-
- Published online:
- 05 June 2012
- Print publication:
- 31 August 1990
-
The authors explain the ways in which uncertainty is an important factor in the problems of risk and policy analysis. This book outlines the source and nature of uncertainty, discusses techniques for obtaining and using expert judgment, and reviews a variety of simple and advanced methods for analyzing uncertainty.
5 - Probability Distributions and Statistical Estimation
- Millett Granger Morgan, Carnegie Mellon University, Pennsylvania, Max Henrion, Carnegie Mellon University, Pennsylvania
-
- Book:
- Uncertainty
- Published online:
- 05 June 2012
- Print publication:
- 31 August 1990, pp 73-101
-
Summary
The theory of probability is at bottom nothing but common sense reduced to calculus.
Pierre de Laplace, Théorie analytique des probabilités, introduction, 1812–20
Introduction
Suppose you wish to represent uncertainty about an empirical quantity by a probability distribution. If you have some empirical data directly relevant to the quantity, you may want to use statistical methods to help select a distribution and estimate its parameters. If you feel you know little or nothing about the quantity before seeing the observations, you may want to use standard classical statistical methods. If you feel you do have some prior opinions about the quantity, based on whatever knowledge and reasoning, you may want to combine these prior opinions with the observed evidence and use Bayesian updating methods to obtain posterior distributions. If you have no directly relevant observations of the quantity, then you may wish to express your opinion directly by a subjective probability distribution. We discuss methods for doing this in Chapters 6 and 7, but before getting to them it will be useful to review some of the basic properties of probability distributions.
The methods presented in this chapter cover a range of useful procedures in probability and statistics. The discussion is provided as a review and a compact reference and is not intended to replace a full course or text on the subject, many of which are available (e.g., Benjamin and Cornell, 1970; Ang and Tang, 1975; DeGroot, 1975).
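The Bayesian updating mentioned above has a closed form in the simplest conjugate case, a normal prior with normally distributed observations of known variance. A minimal sketch, with hypothetical prior and data:

```python
# Sketch: conjugate (normal-normal) Bayesian updating, the simplest case
# of combining a prior opinion with observed evidence.  The prior, the
# data, and the known observation variance below are all hypothetical.
def update_normal(prior_mean, prior_var, data, data_var):
    """Posterior mean and variance for a normal mean with known
    observation variance, given a normal prior."""
    n = len(data)
    xbar = sum(data) / n
    post_var = 1.0 / (1.0 / prior_var + n / data_var)
    post_mean = post_var * (prior_mean / prior_var + n * xbar / data_var)
    return post_mean, post_var

# Prior opinion: quantity ~ Normal(mean 10, variance 4); three
# observations with known variance 1.
post_mean, post_var = update_normal(10.0, 4.0, [12.1, 11.8, 12.3], 1.0)
print(post_mean, post_var)
```

Note how the posterior mean is a precision-weighted compromise between the prior mean and the data mean, and the posterior variance is smaller than either source of information alone would give.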
3 - An Overview of Quantitative Policy Analysis
- Millett Granger Morgan, Carnegie Mellon University, Pennsylvania, Max Henrion, Carnegie Mellon University, Pennsylvania
-
- Book:
- Uncertainty
- Published online:
- 05 June 2012
- Print publication:
- 31 August 1990, pp 16-46
-
Summary
Modest doubt is called the beacon of the wise.
William Shakespeare, Troilus and Cressida, II, ii, 56
There is something wonderfully absorbing about the details of policy analysis and policy-focused research. Indeed, the balance of this book will be devoted to a discussion of just one set of details, those related to the characterization and treatment of uncertainty. But, before proceeding, we should explore some broader issues. We need to clarify what we mean by policy analysis and policy research, and we must ask “What constitutes good policy analysis?” We first do this indirectly by comparing policy analysis to scientific research. Then we explore the role of analysis in the process of developing and choosing policy alternatives, examine a variety of alternative philosophical frameworks within which analysis may be undertaken, and explore some of the problems of choosing the boundaries for a policy analysis. Finally, we return to the problem of identifying the attributes of “good” policy analysis. This time our approach is more direct. We examine the motivations people have in commissioning and undertaking analysis and work from these motivations to develop what we term “ten commandments” for good policy analysis, one of which involves being explicit about uncertainties. The chapter concludes with a discussion of when and why policy analysis should deal with uncertainty.
Policy Research and Policy Analysis
Throughout the preceding chapters we have referred to policy research and analysis, sometimes using the even more awkward phrase “policy analysis and policy-focused research.”
12 - The Value of Knowing How Little You Know
- Millett Granger Morgan, Carnegie Mellon University, Pennsylvania, Max Henrion, Carnegie Mellon University, Pennsylvania
-
- Book:
- Uncertainty
- Published online:
- 05 June 2012
- Print publication:
- 31 August 1990, pp 307-324
-
Summary
“As for me, all I know is I know nothing.”
Socrates, Phaedrus, sec. 235
An awareness of the limitations of one's knowledge has long been recognized as an important aspect of wisdom. In attempting to explain why the Delphic oracle had pronounced him wisest among men, Socrates went so far as to maintain that his wisdom consisted solely in his recognition of the extent of human ignorance. While it may be easy to admit the virtue of this “Socratic wisdom,” it seems less easy to maintain a full consciousness of it when engaged in practical decision-making.
As discussed in Section 4.2, the development of the personalistic or Bayesian theory of probability (Savage, 1954; de Finetti, 1974) provides a way for us to conceptualize some kinds of ignorance by characterizing our degrees of uncertainty in terms of subjective probabilities. Bayesian decision theory builds on this to provide a conceptual framework for explicitly incorporating our uncertainties about our information in the process of making decisions. The art of decision analysis that has sprung from this theoretical work offers a range of techniques intended to make these developments applicable for practical decision-making, at least for those decisions worth systematic attention. To the extent that we can capture our opinions of the limitations of our knowledge by subjective probability distributions over the quantities we are uncertain about, decision analysis appears to provide a way of operationalizing this Socratic precept.
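One standard way decision analysis operationalizes "knowing how little you know" is the expected value of perfect information (EVPI): the most one should be willing to pay to resolve an uncertainty before acting. The sketch below uses a hypothetical two-action, two-state payoff table and subjective state probabilities; it illustrates the general idea, not an example from the book:

```python
# Sketch: expected value of perfect information (EVPI) under subjective
# probabilities.  The payoff table and probabilities are hypothetical.
states = {"good": 0.6, "bad": 0.4}            # subjective probabilities
payoff = {                                     # payoff[action][state]
    "invest": {"good": 100.0, "bad": -60.0},
    "hold":   {"good":  10.0, "bad":  10.0},
}

def expected_payoff(action):
    return sum(p * payoff[action][s] for s, p in states.items())

# Best action under current uncertainty.
best_now = max(expected_payoff(a) for a in payoff)

# If the true state could be learned first, pick the best action per state.
with_info = sum(p * max(payoff[a][s] for a in payoff)
                for s, p in states.items())

evpi = with_info - best_now   # upper bound on the value of further study
print(best_now, with_info, evpi)
```

Here "invest" is best under current uncertainty, but a decision maker who could learn the state first would switch to "hold" in the bad state; the difference in expected payoff is exactly what admitting one's ignorance is worth.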
8 - The Propagation and Analysis of Uncertainty
- Millett Granger Morgan, Carnegie Mellon University, Pennsylvania, Max Henrion, Carnegie Mellon University, Pennsylvania
-
- Book:
- Uncertainty
- Published online:
- 05 June 2012
- Print publication:
- 31 August 1990, pp 172-219
-
Summary
Introduction
Suppose we have constructed a model to predict the consequences of various possible events and decisions. And suppose further we have identified various uncertainties in the inputs. How can we propagate these uncertainties through the model to discover the uncertainty in the predicted consequences? If the uncertainties are substantial, we may not immediately be able to make definitive recommendations about what decision is “best.” But we should be able to obtain useful insights about the relative importance to our conclusions of the various assumptions, decisions, uncertainties, and disagreements in the inputs. These can help us decide whether it is likely to be worthwhile gathering more information, making more careful uncertainty assessments, or refining the model, and which of these could most reduce the uncertainty in the conclusions. In this chapter, we examine various analytic and computational techniques for examining the effects of uncertain inputs within a model. These include:
methods for computing the effect of changes in inputs on model predictions, i.e., sensitivity analysis,
methods for calculating the uncertainty in the model outputs induced by the uncertainties in its inputs, i.e., uncertainty propagation, and
methods for comparing the importance of the input uncertainties in terms of their relative contributions to uncertainty in the outputs, i.e., uncertainty analysis.
A considerable variety of such methods have been developed, with wide differences in conceptual approach, computational effort required, and the power of their results.
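The three kinds of computation listed above can be illustrated on a toy linear model; the model and input distributions here are hypothetical, and the linear case is chosen because its variance decomposition is exact:

```python
# Sketch: sensitivity analysis, uncertainty propagation, and uncertainty
# analysis for the toy linear model y = a + 2b.  The model and the input
# distributions are hypothetical.
import random

random.seed(0)

def model(a, b):
    return a + 2.0 * b

# 1. Sensitivity analysis: effect of a small change in each input.
base_a, base_b, d = 1.0, 1.0, 0.01
s_a = (model(base_a + d, base_b) - model(base_a, base_b)) / d  # ~1
s_b = (model(base_a, base_b + d) - model(base_a, base_b)) / d  # ~2

# 2. Uncertainty propagation: Monte Carlo sample of the output when
#    a ~ Normal(1, 1) and b ~ Normal(1, 1), independent.
N = 50_000
ys = [model(random.gauss(1.0, 1.0), random.gauss(1.0, 1.0))
      for _ in range(N)]
mean_y = sum(ys) / N
out_var = sum((y - mean_y) ** 2 for y in ys) / (N - 1)  # ~ 1 + 4 = 5

# 3. Uncertainty analysis: for this linear model the variance
#    contributions are exact; b accounts for 4/5 of the output variance.
share_b = 4.0 / (1.0 + 4.0)
print(s_a, s_b, out_var, share_b)
```

Even this toy case shows the practical difference: sensitivity says how steeply the output responds to each input, propagation says how uncertain the output actually is, and uncertainty analysis apportions that output variance among the inputs.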