Standard Operating Procedures: A Safety Net for Pre-Analysis Plans

Published online by Cambridge University Press: 15 July 2016

Winston Lin, Columbia University
Donald P. Green, Columbia University

Abstract

Across the social sciences, growing concerns about research transparency have led to calls for pre-analysis plans (PAPs) that specify in advance how researchers intend to analyze the data they are about to gather. PAPs promote transparency and credibility by helping readers distinguish between exploratory and confirmatory analyses. However, PAPs are time-consuming to write and may fail to anticipate contingencies that arise in the course of data collection. This article proposes the use of “standard operating procedures” (SOPs)—default practices to guide decisions when issues arise that were not anticipated in the PAP. We offer an example of an SOP that can be adapted by other researchers seeking a safety net to support their PAPs.

Type: The Profession

Copyright © American Political Science Association 2016


References

Anderson, Michael L. 2008. “Multiple Inference and Gender Differences in the Effects of Early Intervention: A Reevaluation of the Abecedarian, Perry Preschool, and Early Training Projects.” Journal of the American Statistical Association 103 (484): 1481–1495.
Angrist, Joshua D., Imbens, Guido W., and Rubin, Donald B. 1996. “Identification of Causal Effects Using Instrumental Variables.” Journal of the American Statistical Association 91 (434): 444–455.
Bidwell, Kelly, Casey, Katherine, and Glennerster, Rachel. 2015. “The Impact of Voter Knowledge Initiatives in Sierra Leone.” AEA RCT Registry. https://www.socialscienceregistry.org/trials/26.
Brodeur, Abel, Lé, Mathias, Sangnier, Marc, and Zylberberg, Yanos. 2016. “Star Wars: The Empirics Strike Back.” American Economic Journal: Applied Economics 8 (1): 1–32.
Casey, Katherine, Glennerster, Rachel, and Miguel, Edward. 2012. “Reshaping Institutions: Evidence on Aid Impacts Using a Preanalysis Plan.” Quarterly Journal of Economics 127 (4): 1755–1812.
Chambers, Christopher D., Feredoes, Eva, Muthukumaraswamy, Suresh D., and Etchells, Peter J. 2014. “Instead of ‘Playing the Game’ It Is Time to Change the Rules: Registered Reports at AIMS Neuroscience and Beyond.” AIMS Neuroscience 1 (1): 4–17.
Chan, An-Wen, Tetzlaff, Jennifer M., Gøtzsche, Peter C., Altman, Douglas G., Mann, Howard, Berlin, Jesse A., Dickersin, Kay, Hróbjartsson, Asbjørn, Schulz, Kenneth F., Parulekar, Wendy R., Krleža-Jeric, Karmela, Laupacis, Andreas, and Moher, David. 2013. “SPIRIT 2013 Explanation and Elaboration: Guidance for Protocols of Clinical Trials.” BMJ 346: e7586.
Efron, Bradley. 2010. Large-Scale Inference: Empirical Bayes Methods for Estimation, Testing, and Prediction. New York: Cambridge University Press.
Franco, Annie, Malhotra, Neil, and Simonovits, Gabor. 2014. “Publication Bias in the Social Sciences: Unlocking the File Drawer.” Science 345 (6203): 1502–1505.
Franco, Annie, Malhotra, Neil, and Simonovits, Gabor. 2016. “Underreporting in Psychology Experiments: Evidence from a Study Registry.” Social Psychological and Personality Science 7 (1): 8–12.
Freedman, David A. 2008. “Oasis or Mirage?” Chance 21 (1): 59–61.
Freedman, David A. 2010. “Survival Analysis: An Epidemiological Hazard?” In Statistical Models and Causal Inference: A Dialogue with the Social Sciences, ed. Collier, David, Sekhon, Jasjeet S., and Stark, Philip B., 169–192. New York: Cambridge University Press.
Gerber, Alan and Malhotra, Neil. 2008. “Do Statistical Reporting Standards Affect What Is Published? Publication Bias in Two Leading Political Science Journals.” Quarterly Journal of Political Science 3 (3): 313–326.
Humphreys, Macartan, Sanchez de la Sierra, Raul, and van der Windt, Peter. 2013. “Fishing, Commitment, and Communication: A Proposal for Comprehensive Nonbinding Research Registration.” Political Analysis 21 (1): 1–20.
Lin, Winston, Green, Donald P., and Coppock, Alexander. 2015. “Standard Operating Procedures for Don Green’s Lab at Columbia.” https://github.com/acoppock/Green-Lab-SOP.
McKenzie, David. 2012. “A Pre-Analysis Plan Checklist.” World Bank. http://blogs.worldbank.org/impactevaluations/a-pre-analysis-plan-checklist.
Miguel, E., Camerer, C., Casey, K., Cohen, J., Esterling, K. M., Gerber, A., Glennerster, R., Green, D. P., Humphreys, M., Imbens, G., Laitin, D., Madon, T., Nelson, L., Nosek, B. A., Petersen, M., Sedlmayr, R., Simmons, J. P., Simonsohn, U., and van der Laan, M. 2014. “Promoting Transparency in Social Science Research.” Science 343 (6166): 30–31.
Monogan, James E. III. 2013. “A Case for Registering Studies of Political Outcomes: An Application in the 2010 House Elections.” Political Analysis 21 (1): 21–37.
Monogan, James E. III. 2015. “Research Preregistration in Political Science: The Case, Counterarguments, and a Response to Critiques.” PS: Political Science and Politics 48 (3): 425–429.
Nosek, B. A., Alter, G., Banks, G. C., Borsboom, D., Bowman, S. D., Breckler, S. J., Buck, S., Chambers, C. D., Chin, G., Christensen, G., Contestabile, M., Dafoe, A., Eich, E., Freese, J., Glennerster, R., Goroff, D., Green, D. P., Hesse, B., Humphreys, M., Ishiyama, J., Karlan, D., Kraut, A., Lupia, A., Mabry, P., Madon, T., Malhotra, N., Mayo-Wilson, E., McNutt, M., Miguel, E., Paluck, E. L., Simonsohn, U., Soderberg, C., Spellman, B. A., Turitto, J., VandenBos, G., Vazire, S., Wagenmakers, E. J., Wilson, R., and Yarkoni, T. 2015. “Promoting an Open Research Culture.” Science 348 (6242): 1422–1425.
Nyhan, Brendan. 2015. “Increasing the Credibility of Political Science Research: A Proposal for Journal Reforms.” PS: Political Science and Politics 48 (S1): 78–83.
O’Donoghue, Ted and Rabin, Matthew. 2001. “Choice and Procrastination.” Quarterly Journal of Economics 116 (1): 121–160.
Olken, Benjamin A. 2015. “Promises and Perils of Pre-Analysis Plans.” Journal of Economic Perspectives 29 (3): 61–80.
Open Science Collaboration. 2015. “Estimating the Reproducibility of Psychological Science.” Science 349: aac4716.
Rosenthal, Robert. 1979. “The ‘File Drawer Problem’ and Tolerance for Null Results.” Psychological Bulletin 86 (3): 638–641.
Rubin, Donald B. 2007. “The Design versus the Analysis of Observational Studies for Causal Effects: Parallels with the Design of Randomized Trials.” Statistics in Medicine 26 (1): 20–36.
Simmons, Joseph P., Nelson, Leif D., and Simonsohn, Uri. 2011. “False-Positive Psychology: Undisclosed Flexibility in Data Collection and Analysis Allows Presenting Anything as Significant.” Psychological Science 22 (11): 1359–1366.
Tukey, J. W. 1993. “Tightening the Clinical Trial.” Controlled Clinical Trials 14 (4): 266–285.
Westfall, Peter H., Tobias, Randall D., and Wolfinger, Russell D. 2011. Multiple Comparisons and Multiple Tests Using SAS. 2nd ed. Cary, NC: SAS Institute.