
Declaring and Diagnosing Research Designs



Researchers need to select high-quality research designs and communicate those designs clearly to readers. Both tasks are difficult. We provide a framework for formally “declaring” the analytically relevant features of a research design in a demonstrably complete manner, with applications to qualitative, quantitative, and mixed methods research. The approach to design declaration we describe requires defining a model of the world (M), an inquiry (I), a data strategy (D), and an answer strategy (A). Declaration of these features in code provides sufficient information for researchers and readers to use Monte Carlo techniques to diagnose properties such as power, bias, accuracy of qualitative causal inferences, and other “diagnosands.” Ex ante declarations can be used to improve designs and facilitate preregistration, analysis, and reconciliation of intended and actual analyses. Ex post declarations are useful for describing, sharing, reanalyzing, and critiquing existing designs. We provide open-source software, DeclareDesign, to implement the proposed approach.
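The declare-then-diagnose logic described in the abstract can be illustrated with a short simulation. The sketch below is our own plain-Python illustration of the MIDA components and a Monte Carlo diagnosand (bias), not the DeclareDesign R API; all names and parameter values are hypothetical.

```python
import random
import statistics

N = 100          # sample size (part of the data strategy)
TRUE_ATE = 0.5   # average treatment effect built into the model
SIMS = 1000      # number of Monte Carlo simulations

def model():
    """M: potential outcomes for N units, with heterogeneous effects."""
    y0 = [random.gauss(0, 1) for _ in range(N)]
    tau = [random.gauss(TRUE_ATE, 0.2) for _ in range(N)]
    y1 = [y0[i] + tau[i] for i in range(N)]
    return y0, y1

def inquiry(y0, y1):
    """I: the estimand -- the average treatment effect in this draw."""
    return statistics.mean(t - c for c, t in zip(y0, y1))

def data_strategy(y0, y1):
    """D: complete random assignment of half the units to treatment."""
    z = [1] * (N // 2) + [0] * (N - N // 2)
    random.shuffle(z)
    y = [y1[i] if z[i] else y0[i] for i in range(N)]  # revealed outcomes
    return z, y

def answer_strategy(z, y):
    """A: difference-in-means estimator."""
    treated = [y[i] for i in range(N) if z[i]]
    control = [y[i] for i in range(N) if not z[i]]
    return statistics.mean(treated) - statistics.mean(control)

def diagnose(sims=SIMS):
    """Diagnosand: mean of (estimate - estimand) over repeated runs."""
    errors = []
    for _ in range(sims):
        y0, y1 = model()
        estimand = inquiry(y0, y1)
        z, y = data_strategy(y0, y1)
        errors.append(answer_strategy(z, y) - estimand)
    return statistics.mean(errors)

if __name__ == "__main__":
    random.seed(1)
    print(f"estimated bias: {diagnose():.3f}")
```

Because all four components are declared as code, other diagnosands (power, RMSE, coverage) can be computed by collecting different summaries inside the same simulation loop, which is the approach the paper's software automates.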


This is an Open Access article, distributed under the terms of the Creative Commons Attribution licence, which permits unrestricted re-use, distribution, and reproduction in any medium, provided the original work is properly cited.

Corresponding author

*Graeme Blair, Assistant Professor of Political Science, University of California, Los Angeles
Jasper Cooper, Assistant Professor of Political Science, University of California, San Diego
Alexander Coppock, Assistant Professor of Political Science, Yale University
**Macartan Humphreys, WZB Berlin, and Professor of Political Science, Columbia University



Authors are listed in alphabetical order. This work was supported in part by a grant from the Laura and John Arnold Foundation and seed funding from EGAP—Evidence in Governance and Politics. Errors remain the responsibility of the authors. We thank the Associate Editor and three anonymous reviewers for generous feedback. In addition, we thank Peter Aronow, Julian Brückner, Adrian Duşa, Adam Glynn, Donald Green, Justin Grimmer, Kolby Hansen, Erin Hartman, Alan Jacobs, Tom Leavitt, Winston Lin, Matto Mildenberger, Matthias Orlowski, Molly Roberts, Tara Slough, Gosha Syunyaev, Anna Wilke, Teppei Yamamoto, Erin York, Lauren Young, and Yang-Yang Zhou; seminar audiences at Columbia, Yale, MIT, WZB, NYU, Mannheim, Oslo, Princeton, Southern California Methods Workshop, and the European Field Experiments Summer School; as well as participants at the EPSA 2016, APSA 2016, EGAP 18, BITSS 2017, and SPSP 2018 meetings for helpful comments. We thank Clara Bicalho, Neal Fultz, Sisi Huang, Markus Konrad, Lily Medina, Pete Mohanty, Aaron Rudkin, Shikhar Singh, Luke Sonnet, and John Ternovski for their many contributions to the broader project. The methods proposed in this paper are implemented in an accompanying open-source software package, DeclareDesign (Blair et al. 2018). Replication files are available at the American Political Science Review Dataverse:



References
Angrist, Joshua D., and Pischke, Jörn-Steffen. 2008. Mostly Harmless Econometrics: An Empiricist’s Companion. Princeton, NJ: Princeton University Press.
Aronow, Peter M., and Samii, Cyrus. 2016. “Does Regression Produce Representative Estimates of Causal Effects?” American Journal of Political Science 60 (1): 250–67.
Balke, Alexander, and Pearl, Judea. 1994. Counterfactual Probabilities: Computational Methods, Bounds and Applications. In Proceedings of the Tenth International Conference on Uncertainty in Artificial Intelligence. Burlington, MA: Morgan Kaufmann Publishers, 46–54.
Baumgartner, Michael, and Thiem, Alrik. 2017. “Often Trusted but Never (Properly) Tested: Evaluating Qualitative Comparative Analysis.” Sociological Methods & Research: 1–33. Published first online 3 May 2017.
Beach, Derek, and Pedersen, Rasmus Brun. 2013. Process-Tracing Methods: Foundations and Guidelines. Ann Arbor, MI: University of Michigan Press.
Bennett, Andrew. 2015. Disciplining Our Conjectures: Systematizing Process Tracing with Bayesian Analysis. In Process Tracing, eds. Bennett, Andrew and Checkel, Jeffrey T.. Cambridge: Cambridge University Press, 276–98.
Bennett, Andrew, and Checkel, Jeffrey T., eds. 2014. Process Tracing. Cambridge: Cambridge University Press.
Björkman, Martina, and Svensson, Jakob. 2009. “Power to the People: Evidence from a Randomized Field Experiment of a Community-Based Monitoring Project in Uganda.” Quarterly Journal of Economics 124 (2): 735–69.
Björkman Nyqvist, Martina, and Svensson, Jakob. 2016. “Comments on Donato and Mosqueira’s (2016) ‘Additional Analyses’ of Björkman and Svensson (2009).” Unpublished research note.
Blair, Graeme, Cooper, Jasper, Coppock, Alexander, Humphreys, Macartan, and Fultz, Neal. 2018. “DeclareDesign.” Software package for R, available at
Blei, David M., Ng, Andrew Y., and Jordan, Michael I.. 2003. “Latent Dirichlet Allocation.” Journal of Machine Learning Research 3: 993–1022.
Brady, Henry E., and Collier, David. 2010. Rethinking Social Inquiry: Diverse Tools, Shared Standards. Lanham, MD: Rowman & Littlefield Publishers.
Braumoeller, Bear F. 2003. “Causal Complexity and the Study of Politics.” Political Analysis 11 (3): 209–33.
Casey, Katherine, Glennerster, Rachel, and Miguel, Edward. 2012. “Reshaping Institutions: Evidence on Aid Impacts Using a Pre-Analysis Plan.” Quarterly Journal of Economics 127 (4): 1755–812.
Clemens, Michael A. 2017. “The Meaning of Failed Replications: A Review and Proposal.” Journal of Economic Surveys 31 (1): 326–42.
Cohen, Jacob. 1977. Statistical Power Analysis for the Behavioral Sciences. New York, NY: Academic Press.
Collier, David. 2011. “Understanding Process Tracing.” PS: Political Science & Politics 44 (4): 823–30.
Collier, David. 2014. “Comment: QCA Should Set Aside the Algorithms.” Sociological Methodology 44 (1): 122–6.
Collier, David, Brady, Henry E., and Seawright, Jason. 2004. Sources of Leverage in Causal Inference: Toward an Alternative View of Methodology. In Rethinking Social Inquiry: Diverse Tools, Shared Standards, eds. Collier, David and Brady, Henry E.. Lanham, MD: Rowman and Littlefield, 229–66.
Coppock, Alexander. 2019. “Avoiding Post-Treatment Bias in Audit Experiments.” Journal of Experimental Political Science 6 (1): 1–4.
Cox, David R. 1975. “A Note on Data-Splitting for the Evaluation of Significance Levels.” Biometrika 62 (2): 441–4.
Dawid, A. Philip. 2000. “Causal Inference without Counterfactuals.” Journal of the American Statistical Association 95 (450): 407–24.
De la Cuesta, Brandon, and Imai, Kosuke. 2016. “Misunderstandings about the Regression Discontinuity Design in the Study of Close Elections.” Annual Review of Political Science 19: 375–96.
Deaton, Angus S. 2010. “Instruments, Randomization, and Learning about Development.” Journal of Economic Literature 48 (2): 424–55.
Donato, Katherine, and Garcia Mosqueira, Adrian. 2016. “Power to the People? A Replication Study of a Community-Based Monitoring Programme in Uganda.” 3ie Replication Papers 11.
Dunning, Thad. 2012. Natural Experiments in the Social Sciences: A Design-Based Approach. Cambridge: Cambridge University Press.
Duşa, Adrian. 2018. QCA with R. A Comprehensive Resource. New York, NY: Springer.
Duşa, Adrian, and Thiem, Alrik. 2015. “Enhancing the Minimization of Boolean and Multivalue Output Functions with eQMC.” Journal of Mathematical Sociology 39 (2): 92–108.
Fairfield, Tasha. 2013. “Going where the Money Is: Strategies for Taxing Economic Elites in Unequal Democracies.” World Development 47: 42–57.
Fairfield, Tasha, and Charman, Andrew E.. 2017. “Explicit Bayesian Analysis for Process Tracing: Guidelines, Opportunities, and Caveats.” Political Analysis 25 (3): 363–80.
Findley, Michael G., Jensen, Nathan M., Malesky, Edmund J., and Pepinsky, Thomas B.. 2016. “Can Results-Free Review Reduce Publication Bias? The Results and Implications of a Pilot Study.” Comparative Political Studies 49 (13): 1667–703.
Geddes, Barbara. 2003. Paradigms and Sand Castles: Theory Building and Research Design in Comparative Politics. Ann Arbor, MI: University of Michigan Press.
Gelman, Andrew, and Hill, Jennifer. 2006. Data Analysis Using Regression and Multilevel/Hierarchical Models. Cambridge: Cambridge University Press.
Gelman, Andrew, and Carlin, John. 2014. “Beyond Power Calculations: Assessing Type S (Sign) and Type M (Magnitude) Errors.” Perspectives on Psychological Science 9 (6): 641–51.
Gerber, Alan S., and Green, Donald P.. 2012. Field Experiments: Design, Analysis, and Interpretation. New York, NY: W.W. Norton.
Goertz, Gary, and Mahoney, James. 2012. A Tale of Two Cultures: Qualitative and Quantitative Research in the Social Sciences. Princeton, NJ: Princeton University Press.
Green, Donald P., and Lin, Winston. 2016. “Standard Operating Procedures: A Safety Net for Pre-analysis Plans.” PS: Political Science and Politics 49 (3): 495–9.
Green, Peter, and MacLeod, Catriona J.. 2016. “SIMR: An R Package for Power Analysis of Generalized Linear Mixed Models by Simulation.” Methods in Ecology and Evolution 7 (4): 493–8.
Groemping, Ulrike. 2016. “Design of Experiments (DoE) & Analysis of Experimental Data.” Last accessed May 11, 2017.
Guo, Yi, Logan, Henrietta L., Glueck, Deborah H., and Muller, Keith E.. 2013. “Selecting a Sample Size for Studies with Repeated Measures.” BMC Medical Research Methodology 13 (1): 100.
Halpern, Joseph Y. 2000. “Axiomatizing Causal Reasoning.” Journal of Artificial Intelligence Research 12: 317–37.
Haseman, Joseph K. 1978. “Exact Sample Sizes for Use with the Fisher-Irwin Test for 2 × 2 Tables.” Biometrics 34 (1): 106–9.
Heckman, James J., Urzua, Sergio, and Vytlacil, Edward. 2006. “Understanding Instrumental Variables in Models with Essential Heterogeneity.” The Review of Economics and Statistics 88 (3): 389–432.
Herron, Michael C., and Quinn, Kevin M.. 2016. “A Careful Look at Modern Case Selection Methods.” Sociological Methods & Research 45 (3): 458–92.
Huber, John. 2013. “Is Theory Getting Lost in the ‘Identification Revolution’?” The Monkey Cage blog post.
Hug, Simon. 2013. “Qualitative Comparative Analysis: How Inductive Use and Measurement Error lead to Problematic Inference.” Political Analysis 21 (2): 252–65.
Humphreys, Macartan, and Jacobs, Alan M.. 2015. “Mixing Methods: A Bayesian Approach.” American Political Science Review 109 (4): 653–73.
Imai, Kosuke, King, Gary, and Stuart, Elizabeth A.. 2008. “Misunderstandings between Experimentalists and Observationalists about Causal Inference.” Journal of the Royal Statistical Society: Series A 171 (2): 481–502.
Imbens, Guido W. 2010. “Better LATE Than Nothing: Some Comments on Deaton (2009) and Heckman and Urzua (2009).” Journal of Economic Literature 48 (2): 399–423.
Imbens, Guido W., and Rubin, Donald B.. 2015. Causal Inference in Statistics, Social, and Biomedical Sciences. Cambridge: Cambridge University Press.
King, Gary, Keohane, Robert O., and Verba, Sidney. 1994. Designing Social Inquiry: Scientific Inference in Qualitative Research. Princeton, NJ: Princeton University Press.
Kreidler, Sarah M., Muller, Keith E., Grunwald, Gary K., Ringham, Brandy M., Coker-Dukowitz, Zacchary T., Sakhadeo, Uttara R., Barón, Anna E., and Glueck, Deborah H.. 2013. “GLIMMPSE: Online Power Computation for Linear Models with and without a Baseline Covariate.” Journal of Statistical Software 54 (10): 1–26.
Lenth, Russell V. 2001. “Some Practical Guidelines for Effective Sample Size Determination.” The American Statistician 55 (3): 187–93.
Lieberman, Evan S. 2005. “Nested Analysis as a Mixed-Method Strategy for Comparative Research.” American Political Science Review 99 (3): 435–52.
Lohr, Sharon. 2010. Sampling: Design and Analysis. Boston: Brooks Cole.
Lucas, Samuel R., and Szatrowski, Alisa. 2014. “Qualitative Comparative Analysis in Critical Perspective.” Sociological Methodology 44 (1): 1–79.
Mackie, John Leslie. 1974. The Cement of the Universe: A Study of Causation. Oxford: Oxford University Press.
Mahoney, James. 2008. “Toward a Unified Theory of Causality.” Comparative Political Studies 41 (4–5): 412–36.
Mahoney, James. 2012. “The Logic of Process Tracing Tests in the Social Sciences.” Sociological Methods & Research 41 (4): 570–97.
Morris, Tim P., White, Ian R., and Crowther, Michael J.. 2019. “Using Simulation Studies to Evaluate Statistical Methods.” Statistics in Medicine.
Muller, Keith E., and Peterson, Bercedis L.. 1984. “Practical Methods for Computing Power in Testing the Multivariate General Linear Hypothesis.” Computational Statistics & Data Analysis 2 (2): 143–58.
Muller, Keith E., Lavange, Lisa M., Ramey, Sharon Landesman, and Ramey, Craig T.. 1992. “Power Calculations for General Linear Multivariate Models Including Repeated Measures Applications.” Journal of the American Statistical Association 87 (420): 1209–26.
Nosek, Brian A., Alter, George, Banks, George C., Borsboom, Denny, Bowman, Sara D., Breckler, Steven J., Buck, Stuart, et al. 2015. “Promoting an Open Research Culture: Author Guidelines for Journals Could Help to Promote Transparency, Openness, and Reproducibility.” Science 348 (6242): 1422.
Pearl, Judea. 2009. Causality. Cambridge: Cambridge University Press.
Raffler, Pia, Posner, Daniel N., and Parkerson, Doug. 2019. “The Weakness of Bottom-Up Accountability: Experimental Evidence from the Ugandan Health Sector.” Working Paper.
Ragin, Charles. 1987. The Comparative Method: Moving beyond Qualitative and Quantitative Strategies. Berkeley, CA: University of California Press.
Rennie, Drummond. 2004. “Trial Registration.” Journal of the American Medical Association 292 (11): 1359–62.
Rohlfing, Ingo. 2018. “Power and False Negatives in Qualitative Comparative Analysis: Foundations, Simulation and Estimation for Empirical Studies.” Political Analysis 26 (1): 72–89.
Rohlfing, Ingo, and Schneider, Carsten Q.. 2018. “A Unifying Framework for Causal Analysis in Set-Theoretic Multimethod Research.” Sociological Methods & Research 47 (1): 37–63.
Rosenbaum, Paul R. 2002. Observational Studies. New York, NY: Springer.
Rubin, Donald B. 1984. “Bayesianly Justifiable and Relevant Frequency Calculations for the Applied Statistician.” Annals of Statistics 12 (4): 1151–72.
Schneider, Carsten Q., and Wagemann, Claudius. 2012. Set-theoretic Methods for the Social Sciences: A Guide to Qualitative Comparative Analysis. Cambridge: Cambridge University Press.
Seawright, Jason, and Gerring, John. 2008. “Case Selection Techniques in Case Study Research: A Menu of Qualitative and Quantitative Options.” Political Research Quarterly 61 (2): 294–308.
Sekhon, Jasjeet S., and Titiunik, Rocío. 2016. “Understanding Regression Discontinuity Designs as Observational Studies.” Observational Studies 2: 173–81.
Shadish, William, Cook, Thomas D., and Campbell, Donald Thomas. 2002. Experimental and Quasi-Experimental Designs for Generalized Causal Inference. Boston, MA: Houghton Mifflin.
Tanner, Sean. 2014. “QCA Is of Questionable Value for Policy Research.” Policy and Society 33 (3): 287–98.
Thiem, Alrik, Baumgartner, Michael, and Bol, Damien. 2016. “Still Lost in Translation! A Correction of Three Misunderstandings between Configurational Comparativists and Regressional Analysts.” Comparative Political Studies 49 (6): 742–74.
Van Evera, Stephen. 1997. Guide to Methods for Students of Political Science. Ithaca, NY: Cornell University Press.
White, Halbert. 1982. “Maximum Likelihood Estimation of Misspecified Models.” Econometrica: Journal of the Econometric Society 50 (1): 1–25.
Yamamoto, Teppei. 2012. “Understanding the Past: Statistical Analysis of Causal Attribution.” American Journal of Political Science 56 (1): 237–56.
Zarin, Deborah A., and Tse, Tony. 2008. “Moving towards Transparency of Clinical Trials.” Science 319 (5868): 1340–2.
Zhang, Junni L., and Rubin, Donald B.. 2003. “Estimation of Causal Effects via Principal Stratification When Some Outcomes Are Truncated by ‘Death’.” Journal of Educational and Behavioral Statistics 28 (4): 353–68.

Supplementary materials

Blair et al. supplementary material 1 (PDF, 536 KB)
Blair et al. Dataset