
The case for laboratory experiments in behavioural public policy

  • PETER D. LUNN (a1) and ÁINE NÍ CHOISDEALBHA (a2)

Abstract

Behavioural science is increasingly applied to policy in many countries. While the empirical approach to policy development is welcome, we argue with reference to existing literature that laboratory experiments are presently underused in this domain, relative to field studies. Assumptions that field experiments, including randomised controlled trials, produce more generalisable results than laboratory experiments are often misplaced. This is because the experimental control offered by the laboratory allows underlying psychological mechanisms to be isolated and tested. We use examples from recent research on energy efficiency and financial decision-making to argue that mechanism-focused laboratory research is often not only complementary to field research, but also necessary for interpreting field results, and that such research can have direct policy implications. The issues discussed illustrate that in some policy contexts a well-designed laboratory study can be a good – perhaps the best – way to answer the kinds of research questions that policy-makers ask.



Corresponding author

*Correspondence to: ESRI, Whitaker Square, Sir John Rogerson's Quay, Dublin 2, Ireland. Email: pete.lunn@esri.ie

