
Is There a Cost to Convenience? An Experimental Comparison of Data Quality in Laboratory and Online Studies

  • Scott Clifford (a1) and Jennifer Jerit (a2)
Abstract

Increasingly, experimental research is being conducted on the Internet in addition to the laboratory. Online experiments are more convenient for subjects and researchers, but we know little about how the choice of study location affects data quality. To investigate whether respondent behavior differs across study location, we randomly assigned subjects to participate in a study either in a laboratory or in an online setting. Contrary to our expectations, we find few differences between participants in terms of level of attention and socially desirable responding. However, we find significant differences in two areas: the degree of self-reported distraction while completing the questionnaire and the tendency to consult outside sources for answers to political knowledge questions. We conclude that when the greater convenience (and higher response rates) of online experiments outweighs these disadvantages, Internet administration of randomized experiments represents a viable alternative to laboratory administration.



Journal of Experimental Political Science
  • ISSN: 2052-2630
  • EISSN: 2052-2649
  • URL: /core/journals/journal-of-experimental-political-science

Supplementary Materials

Clifford and Jerit supplementary material: Appendix (PDF, 910 KB)
