Underreporting in Political Science Survey Experiments: Comparing Questionnaires to Published Results

Annie Franco, Neil Malhotra, and Gabor Simonovits
Abstract

The accuracy of published findings is compromised when researchers fail to report and adjust for multiple testing. Preregistration of studies and the requirement of preanalysis plans for publication are two proposed solutions to combat this problem. Some have raised concerns that such changes in research practice may hinder inductive learning. However, without knowing the extent of underreporting, it is difficult to assess the costs and benefits of institutional reforms. This paper examines published survey experiments conducted as part of the Time-sharing Experiments in the Social Sciences program, which makes its questionnaires publicly available, allowing us to compare planned design features against what is reported in published research. We find that (1) 30% of papers report fewer experimental conditions in the published paper than in the questionnaire; (2) roughly 60% of papers report fewer outcome variables than are listed in the questionnaire; and (3) about 80% of papers fail to report all experimental conditions and outcomes. These findings suggest that published statistical tests understate the probability of Type I errors.
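The inflation of Type I error rates under unreported multiple testing follows from a standard familywise calculation: if a study runs m independent tests at significance level alpha but discloses only some of them, the probability of at least one false positive across the family is 1 - (1 - alpha)^m rather than alpha. The minimal Python sketch below illustrates this arithmetic; the values of m and alpha are illustrative assumptions, not figures drawn from the paper.

# Familywise error rate: probability of at least one false
# positive among m independent tests, each run at level alpha.
def familywise_error_rate(m: int, alpha: float = 0.05) -> float:
    return 1 - (1 - alpha) ** m

# Illustrative values only (not results from the paper):
for m in (1, 5, 10, 20):
    print(f"{m:2d} tests -> FWER = {familywise_error_rate(m):.3f}")

Ten undisclosed tests at alpha = 0.05 already push the familywise rate above 40%, which is the sense in which tests reported without the full set of conditions and outcomes understate the probability of Type I errors.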

Corresponding author: Neil Malhotra, e-mail: neilm@stanford.edu

Author's note: Supplementary materials for this article are available on the Political Analysis Web site. Replication data are available on the Dataverse site for this article, http://dx.doi.org/10.7910/DVN/28766.

Supplementary materials

Franco et al. supplementary material: Appendix (PDF, 193 KB)