There is increasing interest in experiments where outcomes are measured by surveys and treatments are delivered by a separate mechanism in the real world, such as by mailers, door-to-door canvasses, phone calls, or online ads. However, common designs for such experiments are often prohibitively expensive, vulnerable to bias, and raise ethical concerns. We show how four methodological practices currently uncommon in such experiments have previously undocumented complementarities that can dramatically relax these constraints when at least two are used in combination: (1) online surveys recruited from a defined sampling frame, (2) with at least one baseline wave prior to treatment, (3) with multiple items combined into an index to measure outcomes, and (4) when possible, a placebo control. We provide a general and extensible framework that allows researchers to determine the most efficient mix of these practices in diverse applications. Two studies then examine how these practices perform empirically. First, we examine the representativeness of online panel respondents recruited from a defined sampling frame and find that their representativeness compares favorably to that of phone panel respondents. Second, an original experiment successfully implements all four practices in the context of a door-to-door canvassing experiment. We conclude by discussing potential extensions.
Authors’ note: This paper previously circulated under the title “Testing Theories of Attitude Change With Online Panel Field Experiments.” Software for planning an experiment using all four practices we describe is available at http://experiments.berkeley.edu. Replication data are available as Broockman, Kalla, and Sekhon (2017), at http://dx.doi.org/10.7910/DVN/EEP5MT. This work was supported by the NARAL Pro-Choice America Foundation, the Signatures Innovations Fellows program at UC Berkeley, UC Berkeley’s Institute for Governmental Studies, and the Office of Naval Research [N00014-15-1-2367]. The studies reported herein were approved by the Committee for the Protection of Human Subjects. We thank participants at the 2015 POLMETH meeting and at the University of California, Berkeley’s Research Workshop in American Politics for helpful feedback. Additional feedback was provided by Peter Aronow, Rebecca Barter, Kevin Collins, Alex Coppock, Jamie Druckman, Thad Dunning, Donald Green, Christian Fong, Seth Hill, Dan Hopkins, Gabe Lenz, Winston Lin, Chris Mann, David Nickerson, Kellie Ottoboni, Kevin Quinn, Fredrik Sävje, Yotam Shev-Tom, Bradley Spahn, and Laura Stoker. All remaining errors are our own.
Contributing Editor: R. Michael Alvarez