Communicating evidence in icons and summary formats for policymakers: what works?

Published online by Cambridge University Press:  16 April 2021

Cameron Brick*
Affiliation:
Winton Centre for Risk and Evidence Communication, Centre for Mathematical Sciences, University of Cambridge, Cambridge, UK and Department of Psychology, University of Amsterdam, Amsterdam, The Netherlands
Alexandra L.J. Freeman
Affiliation:
Winton Centre for Risk and Evidence Communication, Centre for Mathematical Sciences, University of Cambridge, Cambridge, UK
*Correspondence to: Winton Centre for Risk and Evidence Communication, Centre for Mathematical Sciences, University of Cambridge, Cambridge CB3 0WA, UK. Email: brickc@gmail.com
Abstract

Policy decisions have vast consequences, but there is little empirical research on how best to communicate underlying evidence to decision-makers. Groups in diverse fields (e.g., education, medicine, crime) use brief, graphical displays to list policy options, expected outcomes and evidence quality in order to make such evidence easy to assess. However, the understanding of these representations is rarely studied. We surveyed experts and non-experts on what information they wanted and tested their objective comprehension of commonly used graphics. A total of 252 UK residents from Prolific and 452 UK What Works Centre users interpreted the meaning of graphics shown without labels. Comprehension was low (often below 50%). The best-performing graphics combined unambiguous metaphorical shapes with color cues and indications of quantity. The participants also reported what types of evidence they wanted and in what detail (e.g., subgroups, different outcomes). Users particularly wanted to see intervention effectiveness and quality, and policymakers also wanted to know the financial costs and negative consequences. Comprehension and preferences were remarkably consistent between the two samples. Groups communicating evidence about policy options can use these results to design summaries, toolkits and reports for expert and non-expert audiences.

Information

Type
Article
Creative Commons
This is an Open Access article, distributed under the terms of the Creative Commons Attribution licence (http://creativecommons.org/licenses/by/4.0/), which permits unrestricted re-use, distribution, and reproduction in any medium, provided the original work is properly cited.
Copyright
Copyright © The Author(s), 2021. Published by Cambridge University Press
Figure 1. An example of a What Works toolkit summarizing the cost, evidence quality (‘Evidence Strength’) and effectiveness (‘Impact’) of various educational interventions. For clarity, font sizes were increased and some text was removed. Copyright: Education Endowment Foundation (2020), used with permission.

Table 1. Key measures and response options.

Figure 2. Summary display of effectiveness for an intervention from Conservation Evidence. Copyright: Conservation Evidence (2020), used with permission.

Table 2. Survey participation and attrition by Centre (expert sample only).

Table 3. Demographics for both samples.

Table 4. Icon comprehension: effectiveness and quality of evidence (main icons).

Figure 3. Both samples had similar overall comprehension, shown in overlaid histograms of objective comprehension percentage (25 items) for expert and non-expert users.

Table 5. Evidence priority rank by sample (highest priority = 1).

Table 6. User goals (expert sample only).

Supplementary material

Brick and Freeman supplementary material (File, 326.7 KB)