Survey Experiments with Google Consumer Surveys: Promise and Pitfalls for Academic Research in Social Science

  • Lie Philip Santoso, Robert Stein, and Randy Stevenson
Abstract

In this article, we evaluate the usefulness of Google Consumer Surveys (GCS) as a low-cost tool for rigorous social scientific work. We find that its relative strengths and weaknesses make it most useful to researchers who identify causality through randomization into treatment groups rather than through selection on observables. This finding stems, in part, from the fact that the real cost advantage of GCS over alternatives is limited to short surveys with a small number of questions. Based on our replication of four canonical social scientific experiments and one study of treatment heterogeneity, we find that the platform can be used effectively to achieve balance across treatment groups, explore treatment heterogeneity, and include manipulation checks, and that the inferred demographics it provides may be sufficiently sound for weighting and for explorations of heterogeneity. Crucially, we replicated the usual directional finding in each experiment. Overall, GCS is likely to be a useful platform for survey experimentalists.

Corresponding author: e-mail ls42@rice.edu


Political Analysis
  • ISSN: 1047-1987
  • EISSN: 1476-4989
Supplementary Materials
Santoso et al. supplementary material: PDF (610 KB)
