
Should We Worry About Sponsorship-Induced Bias in Online Political Science Surveys?

  • Thomas J. Leeper (a1) and Emily A. Thorson (a2)

Abstract

Political scientists rely heavily on survey research to gain insights into public attitudes and behaviors. Over the past decade, survey data collection has moved away from personal face-to-face and telephone interviewing towards a model of computer-assisted self-interviewing. A hallmark of many online surveys is the prominent display of the survey’s sponsor, most often an academic institution, in the initial consent form and/or on the survey website itself. It is an open question whether these displays of academic survey sponsorship could increase total survey error. We measure the extent to which sponsorship (by a university or marketing firm) affects data quality, including satisficing behavior, demand characteristics, and socially desirable responding. In addition, we examine whether sponsor effects vary depending on the participant’s experience with online surveys. Overall, we find no evidence that response quality is affected by survey sponsor or by past survey experience.
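The sponsor-effect comparison described above amounts to a randomized experiment: participants are assigned a sponsor condition and a data-quality metric (such as an attention-check pass rate) is compared across conditions. As a minimal illustrative sketch only, the snippet below runs a two-proportion z-test on invented pass counts; the counts, function name, and metric are hypothetical and not taken from the article.

```python
# Hypothetical sketch: comparing a data-quality metric (e.g., an
# attention-check pass rate) across randomly assigned sponsor conditions.
# All numbers below are invented for illustration.
from math import sqrt, erf

def two_proportion_z(pass_a, n_a, pass_b, n_b):
    """Two-sided two-proportion z-test for a difference in pass rates."""
    p_a, p_b = pass_a / n_a, pass_b / n_b
    pooled = (pass_a + pass_b) / (n_a + n_b)          # pooled pass rate
    se = sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
    z = (p_a - p_b) / se
    # Two-sided p-value from the standard normal CDF.
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))
    return p_a - p_b, z, p_value

# University sponsor vs. marketing-firm sponsor (made-up counts).
diff, z, p = two_proportion_z(pass_a=450, n_a=500, pass_b=440, n_b=500)
print(f"difference = {diff:.3f}, z = {z:.2f}, p = {p:.3f}")
```

A null result, as the authors report, would correspond to a small difference and a p-value well above conventional thresholds in a test like this one.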

Footnotes

Citation of data: Authorship is equal and listed alphabetically. This paper was previously presented at the 2015 Annual Meeting of the Midwest Political Science Association, Chicago, IL. Thanks to Brad Jones for helpful feedback. The data, code, and any additional materials required to replicate all analyses in this article are available at the Journal of Experimental Political Science Dataverse within the Harvard Dataverse Network, at doi:10.7910/DVN/KKFS8Y.

References

Belli, Robert F., Traugott, Michael W., Young, Margaret and McGonagle, Katherine. 1999. Reducing Vote Overreporting in Surveys: Social Desirability, Memory Failure, and Source Monitoring. Public Opinion Quarterly 63(1): 90–108.
Binswanger, Johannes, Schunk, Daniel and Toepoel, Vera. 2013. Panel Conditioning in Difficult Attitudinal Questions. Public Opinion Quarterly 77(3): 783–97.
Edwards, Michelle, Dillman, Don and Smyth, Jolene. 2014. An Experimental Test of the Effects of Survey Sponsorship on Internet and Mail Survey Response. Public Opinion Quarterly 78(3): 734–50.
Fox, Richard, Crask, Melvin and Kim, Jonghoon. 1988. Mail Survey Response Rate: A Meta-Analysis of Selected Techniques for Inducing Response. Public Opinion Quarterly 52(4): 467–91.
Glynn, Adam N. 2013. What Can We Learn with Statistical Truth Serum?: Design and Analysis of the List Experiment. Public Opinion Quarterly 77(S1): 159–72. URL: http://poq.oxfordjournals.org/cgi/doi/10.1093/poq/nfs070
Groves, Robert, Cialdini, Robert and Couper, Mick. 1992. Understanding the Decision to Participate in a Survey. Public Opinion Quarterly 56(4): 475–95.
Hauser, David J. and Schwarz, Norbert. 2016. Attentive Turkers: MTurk Participants Perform Better on Online Attention Checks than Do Subject Pool Participants. Behavior Research Methods 48(1): 400–407.
Jensen, Carsten and Thomsen, Jens Peter Frølund. 2013. Self-Reported Cheating in Web Surveys on Political Knowledge. Quality & Quantity 48(6): 3343–54.URL: http://link.springer.com/10.1007/s11135-013-9960-z
Jones, Wesley H. and Linda, Gerald. 1978. Multiple Criteria Effects in a Mail Survey Experiment. Journal of Marketing Research 15(2): 280–4.
Kreuter, Frauke, Presser, Stanley and Tourangeau, Roger. 2008. Social Desirability in CATI, IVR, and Web Surveys. Public Opinion Quarterly 72(5): 847–65.
Krosnick, Jon A. 1991. Response Strategies for Coping with the Cognitive Demands of Attitude Measures in Surveys. Applied Cognitive Psychology 5: 213–36.
Krosnick, Jon A. 1999. Survey Research. Annual Review of Psychology 50: 537–67.
Leeper, Thomas and Thorson, Emily. 2019. Replication Data for: Should We Worry About Sponsorship-Induced Bias in Online Political Science Surveys? Harvard Dataverse. doi:10.7910/DVN/KKFS8Y.
McDonald, Michael P. 2003. On the Over-Report Bias of the National Election Survey. Political Analysis 11: 180–6.
Nederhof, Anton J. 1985. Methods of Coping with Social Desirability Bias: A Review. European Journal of Social Psychology 15(3): 263–80.
Nichols, Austin Lee and Maner, Jon K.. 2008. The Good-Subject Effect: Investigating Participant Demand Characteristics. The Journal of General Psychology 135(2): 151–66.
Porter, Stephen R. and Whitcomb, Michael E. 2003. The Impact of Contact Type on Web Survey Response Rates. Public Opinion Quarterly 67(4): 579–88.
Presser, Stanley, Blair, Johnny and Triplett, Timothy. 1992. Survey Sponsorship, Response Rates, and Response Effects. Social Science Quarterly 73: 699–702.
Tourangeau, Roger, Groves, Robert, Kennedy, Courtney and Yan, Ting. 2009. The Presentation of a Web Survey, Nonresponse and Measurement Error among Members of Web Panel. Journal of Official Statistics 25: 299–321.
Tourangeau, Roger, Presser, Stanley and Sun, Hanyu. 2014. The Impact of Partisan Sponsorship on Political Surveys. Public Opinion Quarterly 78(2): 510–22. URL: http://dx.doi.org/10.1093/poq/nfu020
Weber, Stephen J. and Cook, Thomas D. 1972. Subject Effects in Laboratory Research: An Examination of Subject Roles, Demand Characteristics, and Valid Inference. Psychological Bulletin 77(4): 273–95.

Supplementary materials

  • Leeper and Thorson Dataset (Dataset)
  • Leeper and Thorson supplementary material (Word, 340 KB)