The Dangers of Extreme Counterfactuals
Published online by Cambridge University Press: 04 January 2017
We address the problem that occurs when inferences about counterfactuals—predictions, “what-if” questions, and causal effects—are attempted far from the available data. The danger of these extreme counterfactuals is that substantive conclusions drawn from statistical models that fit the data well turn out to be based largely on speculation hidden in convenient modeling assumptions that few would be willing to defend. Yet existing statistical strategies provide few reliable means of identifying extreme counterfactuals. We offer a proof that inferences farther from the data allow more model dependence and then develop easy-to-apply methods to evaluate how model dependent our answers would be to specified counterfactuals. These methods require neither sensitivity testing over specified classes of models nor evaluating any specific modeling assumptions. If an analysis fails the simple tests we offer, then we know that substantive results are sensitive to at least some modeling choices that are not based on empirical evidence. Free software that accompanies this article implements all the methods developed.
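The core diagnostic idea described above — flagging counterfactual questions that require extrapolating beyond the observed data — can be illustrated with a convex-hull membership test. The sketch below is an illustration of that general idea only, not the authors' implementation or their accompanying software; the function name and the linear-programming formulation are choices made for this example. A query point lies in the convex hull of the data exactly when it can be written as a convex combination of the observed rows, which is a linear-programming feasibility problem.

```python
import numpy as np
from scipy.optimize import linprog

def in_convex_hull(point, data):
    """Check whether `point` lies in the convex hull of the rows of `data`.

    Feasibility LP: find weights w >= 0 with sum(w) = 1 and data.T @ w = point.
    If such weights exist, `point` is an interpolation of the observed data;
    if not, any inference about it is an extrapolation.
    """
    n = data.shape[0]
    # Equality constraints: each coordinate matches, and weights sum to one.
    A_eq = np.vstack([data.T, np.ones(n)])
    b_eq = np.append(point, 1.0)
    res = linprog(c=np.zeros(n), A_eq=A_eq, b_eq=b_eq,
                  bounds=[(0, None)] * n, method="highs")
    return res.status == 0  # status 0: a feasible solution was found

# Example: four observations forming the unit square in two dimensions.
data = np.array([[0, 0], [1, 0], [0, 1], [1, 1]], dtype=float)
print(in_convex_hull(np.array([0.5, 0.5]), data))  # interior point -> True
print(in_convex_hull(np.array([2.0, 2.0]), data))  # extrapolation -> False
```

A counterfactual that fails this test is not necessarily unanswerable, but — as the abstract argues — any answer to it will depend on modeling assumptions rather than on the data alone.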
- Research Article
- Political Analysis , Volume 14 , Issue 2 , Spring 2006 , pp. 131 - 159
- Copyright © The Author 2005. Published by Oxford University Press on behalf of the Society for Political Methodology