
The Design of Field Experiments With Survey Outcomes: A Framework for Selecting More Efficient, Robust, and Ethical Designs

Published online by Cambridge University Press:  18 September 2017

David E. Broockman*
Affiliation:
Assistant Professor, Stanford Graduate School of Business, Stanford, CA 94305, USA. Email: dbroockman@stanford.edu, https://people.stanford.edu/dbroock/
Joshua L. Kalla
Affiliation:
Graduate Student, Department of Political Science, University of California, Berkeley, CA 94720, USA. Email: kalla@berkeley.edu, http://polisci.berkeley.edu/people/person/joshua-kalla
Jasjeet S. Sekhon
Affiliation:
Robson Professor of Political Science and Statistics, University of California, Berkeley, CA 94720, USA. Email: sekhon@berkeley.edu, http://sekhon.berkeley.edu

Abstract

There is increasing interest in experiments where outcomes are measured by surveys and treatments are delivered by a separate mechanism in the real world, such as by mailers, door-to-door canvasses, phone calls, or online ads. However, common designs for such experiments are often prohibitively expensive, vulnerable to bias, and raise ethical concerns. We show how four methodological practices currently uncommon in such experiments have previously undocumented complementarities that can dramatically relax these constraints when at least two are used in combination: (1) online surveys recruited from a defined sampling frame, (2) with at least one baseline wave prior to treatment, (3) with multiple items combined into an index to measure outcomes, and (4) when possible, a placebo control. We provide a general and extensible framework that allows researchers to determine the most efficient mix of these practices in diverse applications. Two studies then examine how these practices perform empirically. First, we examine the representativeness of online panel respondents recruited from a defined sampling frame and find that their representativeness compares favorably to that of phone panel respondents. Second, an original experiment successfully implements all four practices in the context of a door-to-door canvassing experiment. We conclude by discussing potential extensions.
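The efficiency gain from practice (2), a baseline survey wave, can be sketched with a small simulation: when attitudes are stable, a pre-treatment measure of the outcome absorbs between-subject variance, shrinking the sampling variability of the treatment-effect estimate. This is an illustrative sketch only, not code from the paper; the noise levels, sample size, and effect size are all assumed values chosen for demonstration.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 2000
effect = 0.1  # hypothetical true treatment effect (illustrative value)

# A stable latent attitude; the baseline survey wave measures it with noise.
latent = rng.normal(0.0, 1.0, n)
baseline = latent + rng.normal(0.0, 0.5, n)

def diff_in_means(y, t):
    """Standard difference-in-means estimate of the treatment effect."""
    return y[t == 1].mean() - y[t == 0].mean()

# Compare the raw estimator with a baseline-adjusted one across
# repeated random assignments.
est_raw, est_adj = [], []
for _ in range(500):
    t = rng.integers(0, 2, n)                          # random assignment
    y = latent + effect * t + rng.normal(0.0, 0.5, n)  # follow-up survey
    est_raw.append(diff_in_means(y, t))
    # Residualize the outcome on the baseline wave; because assignment is
    # independent of the baseline, this leaves the estimate unbiased while
    # removing between-subject variance.
    slope, intercept = np.polyfit(baseline, y, 1)
    resid = y - (slope * baseline + intercept)
    est_adj.append(diff_in_means(resid, t))

print(f"SD of raw estimator:      {np.std(est_raw):.4f}")
print(f"SD of adjusted estimator: {np.std(est_adj):.4f}")
```

In this setup the adjusted estimator's standard deviation is substantially smaller than the raw estimator's, which is the sense in which a baseline wave lets the same precision be achieved with a smaller (and cheaper) sample.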

Type
Articles
Copyright
Copyright © The Author(s) 2017. Published by Cambridge University Press on behalf of the Society for Political Methodology. 


Footnotes

Authors’ note: This paper previously circulated under the title “Testing Theories of Attitude Change With Online Panel Field Experiments.” Software for planning an experiment using all four practices we describe is available at http://experiments.berkeley.edu. Replication data are available as Broockman, Kalla, and Sekhon (2017), at http://dx.doi.org/10.7910/DVN/EEP5MT. This work was supported by the NARAL Pro-Choice America Foundation, the Signatures Innovations Fellows program at UC Berkeley, UC Berkeley’s Institute for Governmental Studies, and the Office of Naval Research [N00014-15-1-2367]. The studies reported herein were approved by the Committee for the Protection of Human Subjects. We thank participants at the 2015 POLMETH meeting and at the University of California, Berkeley’s Research Workshop in American Politics for helpful feedback. Additional feedback was provided by Peter Aronow, Rebecca Barter, Kevin Collins, Alex Coppock, Jamie Druckman, Thad Dunning, Donald Green, Christian Fong, Seth Hill, Dan Hopkins, Gabe Lenz, Winston Lin, Chris Mann, David Nickerson, Kellie Ottoboni, Kevin Quinn, Fredrik Sävje, Yotam Shem-Tov, Bradley Spahn, and Laura Stoker. All remaining errors are our own.

Contributing Editor: R. Michael Alvarez

References

Adams, William C., and Smith, Dennis J. 1980. Effects of telephone canvassing on turnout and preferences: A field experiment. Public Opinion Quarterly 44(3):389–395.
Adida, Claire, Gottlieb, Jessica, Kramon, Eric, and McClendon, Gwyneth. 2016. How coethnicity moderates the effect of information on voting behavior: Experimental evidence from Benin. Working Paper.
Albertson, Bethany, and Lawrence, Adria. 2009. After the credits roll: The long-term effects of educational television on public knowledge and attitudes. American Politics Research 37(2):275–300.
Angrist, Joshua D. 1990. Errata: Lifetime earnings and the Vietnam era draft lottery: Evidence from social security administrative records. The American Economic Review 80(5):1284–1286.
Ansolabehere, Stephen, Rodden, Jonathan, and Snyder, James M. 2008. The strength of issues: Using multiple measures to gauge preference stability, ideological constraint, and issue voting. American Political Science Review 102:215–232.
Arceneaux, Kevin. 2007. I’m asking for your support: The effects of personally delivered campaign messages on voting decisions and opinion formation. Quarterly Journal of Political Science 2(1):43–65.
Arceneaux, Kevin, and Kolodny, Robin. 2009a. Educating the least informed: Group endorsements in a grassroots campaign. American Journal of Political Science 53(4):755–770.
Arceneaux, Kevin, and Kolodny, Robin. 2009b. The effect of grassroots campaigning on issue preferences and issue salience. Journal of Elections, Public Opinion and Parties 19(3):235–249.
Arceneaux, Kevin, and Nickerson, David W. 2010. Comparing negative and positive campaign messages: Evidence from two field experiments. American Politics Research 38(1):54–83.
Bailey, Michael A., Hopkins, Daniel J., and Rogers, Todd. 2016. Unresponsive, unpersuaded: The unintended consequences of voter persuasion efforts. Political Behavior 38(3):713–746.
Barber, Michael J., Canes-Wrone, Brandice, and Thrower, Sharece. 2017. Ideologically sophisticated donors: Which candidates do individual contributors finance? American Journal of Political Science 61(2):271–288.
Barber, Michael J., Mann, Christopher B., Monson, J. Quin, and Patterson, Kelly D. 2014. Online polls and registration-based sampling: A new method for pre-election polling. Political Analysis 22(3):321–335.
Barton, Jared, Castillo, Marco, and Petrie, Ragan. 2014. What persuades voters? A field experiment on political campaigning. The Economic Journal 124(574):F293–F326.
Berent, Matthew K., Krosnick, Jon A., and Lupia, Arthur. 2016. Measuring voter registration and turnout in surveys: Do official government records yield more accurate assessments? Public Opinion Quarterly 80(3):597–621.
Bidwell, Kelly, Casey, Katherine, and Glennerster, Rachel. 2015. Debates: The impacts of voter knowledge initiatives in Sierra Leone. Working Paper, Stanford Graduate School of Business. URL: https://www.gsb.stanford.edu/gsb-cmis/gsb-cmis-download-auth/362906.
Bloniarz, Adam, Liu, Hanzhong, Zhang, Cun-Hui, Sekhon, Jasjeet S., and Yu, Bin. 2016. Lasso adjustments of treatment effect estimates in randomized experiments. Proceedings of the National Academy of Sciences 113(27):7383–7390.
Bloom, Howard S., Orr, Larry L., Bell, Stephen H., Cave, George, Doolittle, Fred, Lin, Winston, and Bos, Johannes M. 1997. The benefits and costs of JTPA title II-A programs: Key findings from the National Job Training Partnership Act study. The Journal of Human Resources 32(3):549–576.
Broockman, David, and Green, Donald. 2014. Do online advertisements increase political candidates’ name recognition or favorability? Evidence from randomized field experiments. Political Behavior 36(2):263–289.
Broockman, David E., and Butler, Daniel M. 2017. The causal effects of elite position-taking on voter attitudes: Field experiments with elite communication. American Journal of Political Science 61(1):208–221.
Broockman, David E., and Kalla, Joshua L. 2016. Durably reducing transphobia: A field experiment on door-to-door canvassing. Science 352(6282):220–224.
Broockman, David, Kalla, Joshua, and Sekhon, Jasjeet. 2017. Replication data for: The design of field experiments with survey outcomes: A framework for selecting more efficient, robust, and ethical designs. doi:10.7910/DVN/EEP5MT, Harvard Dataverse, V1, UNF:6:gM7KTUQ0wCS6voY98ZTw5A==.
Brüggen, E., van den Brakel, J., and Krosnick, Jon. 2016. Establishing the accuracy of online panels for survey research. Working Paper, available at https://www.cbs.nl/en-gb/background/2016/15/establishing-the-accuracy-of-online-panels-for-survey-research.
Cardy, Emily Arthur. 2005. An experimental field study of the GOTV and persuasion effects of partisan direct mail and phone calls. The Annals of the American Academy of Political and Social Science 601(1):28–40.
Cheung, Paul. 2005. Designing household survey samples: Practical guidelines. Number 98 in “Studies in Methods Series F”. United Nations.
Collins, Kevin, and Rosmarin, Joshua. 2016. Comparing representativeness in online and live interview phone surveys. Presentation at the 71st Annual Conference of the American Association for Public Opinion Research, May 2016, Austin, Texas.
Conroy-Krutz, Jeffrey, and Moehler, Devra C. 2015. Moderation from bias: A field experiment on partisan media in a new democracy. Journal of Politics 77(2):575–587.
Coppock, Alexander. 2016. Positive, small, homogeneous, and durable: Political persuasion in response to information. Dissertation, Columbia University.
Cubbison, William. 2015. The marginal effects of direct mail on vote choice. Paper presented at the annual meeting of the Midwest Political Science Association. URL: http://media.wix.com/ugd/3a8c0a_47330c730f56431f8f982a3d842f434a.pdf.
Dewan, Torun, Humphreys, Macartan, and Rubenson, Daniel. 2014. The elements of political persuasion: Content, charisma and cue. The Economic Journal 124(574):F257–F292.
Doherty, David, and Adler, E. Scott. 2014. The persuasive effects of partisan campaign mailers. Political Research Quarterly 67(3):562–573.
Druckman, James N., Green, Donald P., Kuklinski, James H., and Lupia, Arthur. 2006. The growth and development of experimental research in political science. American Political Science Review 100(4):627–635.
Druckman, James N., and Leeper, Thomas J. 2012. Learning more from political communication experiments: Pretreatment and its effects. American Journal of Political Science 56(4):875–896.
Enos, Ryan D. 2014. Causal effect of intergroup contact on exclusionary attitudes. Proceedings of the National Academy of Sciences 111(10):3699–3704.
Fearon, James, Humphreys, Macartan, and Weinstein, Jeremy M. 2009. Development assistance, institution building, and social cohesion after civil war: Evidence from a field experiment in Liberia. Working Paper.
Funk, Cary, and Goo, Sara Kehaulani. 2015. A look at what the public knows and does not know about science. Technical report, Pew Research Center.
Gerber, Alan S. 2004. Does campaign spending work? Field experiments provide evidence and suggest new theory. American Behavioral Scientist 47(5):541–574.
Gerber, Alan S., Karlan, Dean, and Bergan, Daniel. 2009. Does the media matter? A field experiment measuring the effect of newspapers on voting behavior and political opinions. American Economic Journal: Applied Economics 1(2):35–52.
Gerber, Alan S., and Green, Donald P. 2012. Field experiments: Design, analysis, and interpretation. New York: W. W. Norton.
Gerber, Alan S., Huber, Gregory A., and Washington, Ebonya. 2010. Party affiliation, partisanship, and political beliefs: A field experiment. American Political Science Review 104(4):720–744.
Gerber, Alan S., Gimpel, James, Green, Donald, and Shaw, Daron. 2011. How large and long-lasting are the persuasive effects of televised campaign ads? Results from a randomized experiment. American Political Science Review 105(1):135–150.
Gooch, Andrew, and Vavreck, Lynn. 2016. How face-to-face interviews and cognitive skill affect item non-response: A randomized experiment assigning mode of interview. Political Science Research and Methods, doi:10.1017/psrm.2016.20.
Green, Donald P., Gerber, Alan S., and Nickerson, David W. 2003. Getting out the vote in local elections: Results from six door-to-door canvassing experiments. Journal of Politics 65(4):1083–1096.
Hainmueller, Jens. 2012. Entropy balancing for causal effects: A multivariate reweighting method to produce balanced samples in observational studies. Political Analysis 20(1):25–46.
Hainmueller, Jens, Hopkins, Daniel J., and Yamamoto, Teppei. 2014. Causal inference in conjoint analysis: Understanding multidimensional choices via stated preference experiments. Political Analysis 22:1–30.
Hall, Thad E., and Sinclair, Betsy. 2011. The American internet voter. Journal of Political Marketing 10:58–79.
Hartman, Erin, Grieve, Richard, Ramsahai, Roland, and Sekhon, Jasjeet S. 2015. From SATE to PATT: Combining experimental with observational studies to estimate population treatment effects. Journal of the Royal Statistical Society, Series A 178(3):757–778.
Heckman, James, Smith, Jeffrey, and Taber, Christopher. 1994. Accounting for dropouts in evaluations of social experiments. URL: http://www.nber.org/papers/t0166.pdf.
Hersh, Eitan D., and Goldenberg, Matthew N. 2016. Democratic and Republican physicians provide different care on politicized health issues. Proceedings of the National Academy of Sciences 113(42):11811–11816.
Hill, Seth J., Lo, James, Vavreck, Lynn, and Zaller, John R. 2007. The opt-in internet panel: Survey mode, sampling methodology and the implications for political research. Working Paper, available at http://www.allacademic.com/meta/p199541_index.html.
Himelein, Kristen. 2015. The socio-economic impacts of Ebola in Liberia: Results from a high frequency cell phone survey, round 5. Technical report, World Bank Group. URL: http://www.worldbank.org/content/dam/Worldbank/document/Poverty%20documents/Socio-Economic%20Impacts%20of%20Ebola%20in%20Liberia,%20April%2015%20(final).pdf.
Humphreys, Macartan, and Weinstein, Jeremy M. 2012. Policing politicians: Citizen empowerment and political accountability in Uganda preliminary analysis. Working Paper.
Isbell, Thomas A. 2016. Data codebook for a round 6 Afrobarometer survey in Liberia. Technical report, Afrobarometer. URL: http://afrobarometer.org/sites/default/files/data/round-6/lib_r6_codebook.pdf.
Iyengar, Shanto, and Vavreck, Lynn. 2012. Online panels and the future of political communication research. In The Sage Handbook of Political Communication. Thousand Oaks, CA: Sage, pp. 225–240.
Jackman, Simon, and Spahn, Bradley. 2015. Silenced and ignored: How the turn to voter registration lists excludes people and opinions from political science and political representation. Working Paper, Stanford University, available at https://www.dropbox.com/s/qvqtz99i4bhdore/silenced.pdf?dl=0.
Kish, Leslie. 1965. Survey sampling. Hoboken, NJ: Wiley.
Kohut, Andrew, Keeter, Scott, Doherty, Carroll, Dimock, Michael, and Christian, Leah. 2012. Assessing the representativeness of public opinion surveys. URL: http://www.people-press.org/files/legacy-pdf/Assessing%20the%20Representativeness%20of%20Public%20Opinion%20Surveys.pdf.
Lam, Patrick, and Peyton, Kyle. 2013. Voter persuasion in compulsory electorates: Evidence from a field experiment in Australia. URL: http://polmeth.wustl.edu/files/polmeth/ausexp.pdf.
McKenzie, David. 2012. Beyond baseline and follow-up: The case for more T in experiments. Journal of Development Economics 99(2):210–221.
Michelson, Melissa R. 2016. The risk of over-reliance on the Institutional Review Board: An approved project is not always an ethical project. PS: Political Science & Politics 49(2):299–303.
Miller, Roy E., and Robyn, Dorothy L. 1975. A field experimental study of direct mail in a congressional primary campaign: What effects last until election day. Experimental Study of Politics 4(3):1–36.
Nickerson, David W. 2005a. Partisan mobilization using volunteer phone banks and door hangers. Annals of the American Academy of Political and Social Science 601(1):10–27.
Nickerson, David W. 2005b. Scalable protocols offer efficient design for field experiments. Political Analysis 13(3):233–252.
Nickerson, David W. 2007. Don’t talk to strangers: Experimental evidence of the need for targeting. Presented at the 2007 annual meeting of the Midwest Political Science Association. Available at https://www.scribd.com/document/98714549/Nickerson-independents.
Potter, Philip B. K., and Gray, Julia. 2008. Does costly signaling matter? Preliminary evidence from a field experiment. Working Paper, available at http://www.belfercenter.org/sites/default/files/files/publication/Potter%202008%20FINAL%20DOC.pdf.
Rogers, Todd, and Nickerson, David W. 2013. Can inaccurate beliefs about incumbents be changed? And can reframing change votes? Working Paper RWP13-018, Harvard Kennedy School. URL: http://papers.ssrn.com/sol3/papers.cfm?abstract_id=2271654.
Sadin, Meredith L. 2014. Campaigning with class: Ambivalent stereotypes and candidate wealth in U.S. elections. PhD thesis, Princeton University.
Sävje, Fredrik, Higgins, Michael, and Sekhon, Jasjeet S. 2016. Improving massive experiments with threshold blocking. Proceedings of the National Academy of Sciences 113(27):7369–7376.
Shineman, Victoria Anne. 2016. If you mobilize them, they will become informed: Experimental evidence that information acquisition is endogenous to costs and incentives to participate. British Journal of Political Science, doi:10.1017/S0007123416000168.
Sniderman, Paul M., and Grob, Douglas B. 1996. Innovations in experimental design in attitude surveys. Annual Review of Sociology 22:377–399.
Solomon, Richard L. 1949. An extension of the control group design. Psychological Bulletin 46(2):137–150.
Strauss, Aaron B. 2009. Political ground truth: How personal issue experience counters partisan biases. PhD thesis, Princeton University.
Zaller, John R. 1992. The nature and origins of mass opinion. New York: Cambridge University Press.
Supplementary material

Broockman et al. supplementary material 1 (File, 483.6 KB)