
How others drive our sense of understanding of policies

Published online by Cambridge University Press:  07 September 2020

NATHANIEL RABB
Affiliation:
The Policy Lab, Brown University, Providence, RI, USA
JOHN J. HAN
Affiliation:
Tepper School of Business, Carnegie Mellon University, Pittsburgh, PA, USA
STEVEN A. SLOMAN
Affiliation:
Department of Cognitive, Linguistic, and Psychological Sciences, Brown University, Providence, RI, USA

Abstract

Five experiments are reported to compare models of attitude formation about hot-button policy issues like climate change. In broad strokes, the deficit model states that incorrect opinions are a result of a lack of information, while the cultural cognition model states that opinions are formed to maximize congruence with the group that one affiliates with. The community of knowledge hypothesis takes an integrative position. It states that opinions are based on perceived knowledge, but that perceptions are partly determined by the knowledge that sits in the heads of others in the community. We use the fact that people's sense of understanding is affected by knowledge of others’ understanding to arbitrate among these views in the domain of public policy. In all experiments (N = 1767), we find that the contagious sense of understanding is nonpartisan and robust to experimental manipulations intended to eliminate it. While ideology clearly affects people's attitudes, sense of understanding does as well, but level of actual knowledge does not. And the extent to which people overestimate their own knowledge partly determines the extremity of their position. The pattern of results is most consistent with the community of knowledge hypothesis. Implications for climate policy are considered.

Type: Article
Copyright: © The Author(s) 2020. Published by Cambridge University Press

Introduction

What drives Americans’ attitudes toward climate change? The rational ideal emerging from the Enlightenment is that attitudes are based on knowledge and knowledge is based on evidence. So why do Americans have diverging views about the causes of climate change (Egan & Mullin, 2017; Leiserowitz et al., 2018) – never mind what to do about it – when climate scientists do not? One explanation is that climate change is hard to understand; people who doubt its occurrence or its roots in human activity have insufficient knowledge because they have encountered insufficient evidence. We refer to the idea that lack of knowledge predicts attitudes toward, and particularly opposition to, policies meant to counteract climate change as the ‘deficit model’.

Indeed, climate change is hard to understand. It involves many unobservable mechanisms contributing to a pattern detectable only in large-scale longitudinal data, with temporally distant and probabilistic consequences. Measures of climate change knowledge show generally poor understanding of its mechanisms (Bedford, 2016; Ranney & Clark, 2016) and of the geographical regions most immediately affected (Hamilton, 2015). Unsurprisingly, lay knowledge trails that of experts (Sundblad et al., 2009). This makes the deficit model plausible. It is neither rational nor adaptive to destroy one's own habitat, and the conclusion that such destruction is likely follows directly from scientific information. So individuals who are not moved toward compensatory actions – or even belief in the cause of the destruction – must lack this information. The factors that drive denial of anthropogenic climate change simply do not make sense on any view that extends more than a few years into the future.

The deficit model has its detractors, but its core implication – provide more information and people's attitudes will change – is evident in many science communication efforts (Suldovsky, 2017). It also enjoys some empirical support: measures of climate change knowledge do predict concern (Shi et al., 2016), and providing explanations of climate change mechanisms increases confidence that it is anthropogenic (Ranney & Clark, 2016). Measurable knowledge also affects attitudes toward other controversial policy domains where scientific information is critical (e.g., genetically modified foods – McPhetres et al., 2019; energy – Stoutenborough & Vedlitz, 2016), as well as attitudes toward science more generally (Bak, 2001).

Yet citizens act regardless of their level of understanding: they vote on policies that involve complex systems with opaque mechanisms and serious yet hard-to-predict consequences (see Rhodes et al., 2014), they vote for politicians who promise to enact or block such policies, and they respond to polls that exert pressure on policymakers’ priorities and decisions. Given how difficult it is to understand climate change and climate-related policies, how are people supposed to form the attitudes that govern action if not by acquiring more accurate information?

An alternative to the deficit model – the cultural cognition model (Kahan & Braman, 2006) – effectively says that they don't. Instead, people assess information in a manner that predisposes them toward desired conclusions, and these desired conclusions are the beliefs and attitudes of their group. This biased assessment can result from a host of mechanisms: various species of motivated reasoning, such as overweighting attitude-consistent information, selectively ignoring attitude-inconsistent information or failing to detect flaws in others’ reasoning when doing so is ideologically expedient; affective responses; preferences for in-group informants; and memory effects such as the availability heuristic (Kahan et al., 2011). These processes allow people to adopt their group's beliefs, and there is strong pressure to do so in order to avoid becoming an outcast. Evidence for this model is that many policy-related judgments are made along party political lines (Cohen, 2003; Smith et al., 2012; Bolsen et al., 2014; Colombo & Kriesi, 2017; Ehret et al., 2018; Satherley et al., 2018), a strategy that some political scientists consider reasonable in an information-poor environment (Lupia & McCubbins, 1998; Lau & Redlawsk, 2001; Gilens & Murakawa, 2002). Less reasonable is that even nonpolitical judgments are made along party lines when political cues are available (e.g., Iyengar & Westwood, 2015; Nicholson et al., 2016; Marks et al., 2019), findings that are difficult to explain under the deficit model. Most relevant here is the extensive evidence that climate change beliefs and attitudes divide along party lines in the USA (Guber, 2013; McCright et al., 2013; Marquart-Pyatt et al., 2014; Hornsey et al., 2016) and along ideological lines in many other countries (Poortinga et al., 2019), although the strength of this relationship varies by country (Tranter & Booth, 2015; Hornsey et al., 2018; Smith & Mayer, 2019). Moreover, some studies report little or no relation between climate change knowledge and ideology, despite the usual ideological divisions in attitudes (Stoutenborough & Vedlitz, 2014; Bedford, 2016). This challenges the deficit model's prediction on the assumption that those who understand the most should belong to ideological groups whose attitudes are most consistent with the scientific consensus. Other studies find that actual knowledge and ideology make separate contributions to climate change attitudes (Tobler et al., 2012; Shi et al., 2015).

While the cultural cognition model effectively bypasses knowledge by positing that encountered information will be trusted, discredited or weighted according to its consistency with group attitudes – thus establishing a direct link between one's group and one's attitudes – a third body of evidence suggests that knowledge dynamics do play a role in attitude formation. Measures of general reasoning ability – educational attainment, science literacy, numeracy – are associated with greater polarization rather than convergence on controversial and scientifically complex issues (Kahan et al., 2012, 2017; Hamilton et al., 2015; Drummond & Fischhoff, 2017). A common interpretation of this relationship is that higher education and reasoning ability lead to greater knowledge of community leaders’ positions (Zaller, 1992) and greater capacity to reason toward desired conclusions, namely those of one's group (Kahan et al., 2017). But measures of domain-specific knowledge show different patterns. Rather than a positive or null relation between such knowledge and attitudes, actual and self-assessed knowledge together are related to attitude extremity. Specifically, those who hold the most extreme attitudes are also the most miscalibrated – that is, they overestimate their own knowledge to the greatest extent. This relation has been reported for genetically modified foods (Fernbach et al., 2019) and the European refugee crisis (van Prooijen et al., 2018). Fernbach et al. (2019) found a directionally similar but nonsignificant pattern for climate change, and Hamilton (2018) found that the groups traditionally skeptical of climate change show the largest overestimation of their own climate change knowledge. In a related finding, the extent to which people believe they know more than doctors about the causes of autism negatively predicts both actual knowledge of autism and support for mandatory vaccination (Motta et al., 2018). This overestimation may be a feature of metacognitive processes; extremity of political views is also associated with miscalibration on a purely perceptual decision-making task (Rollwage et al., 2018; see also Ortoleva & Snowberg, 2015; Stone, 2019).

The community of knowledge

We offer a third hypothesis to accommodate the range of data: the community of knowledge. This hypothesis emphasizes that knowledge is a collective enterprise; people depend on others to represent most of their understanding of complex phenomena as well as the evidence that supports that understanding (Sloman & Fernbach, 2018). On this view, most individual reasoning employs simple causal models that include markers indicating that more information – including the kind of mechanistic details that most people lack for complex phenomena like global climate change – can be found outside of the individual. This system is effective because it affords group actions requiring complex knowledge without any single member of the group possessing all of that knowledge. But it also ties metacognitive assessments of one's own knowledge to knowledge in fact held by others. Empirically, people's sense of understanding of complex phenomena increases when they become aware that relevant experts understand the phenomena, even in the absence of any explanatory information (Sloman & Rabb, 2016).

A large variety of evidence supports the community of knowledge hypothesis (reviewed in Rabb et al., 2019). The hypothesis explains, for instance, why reports by the Intergovernmental Panel on Climate Change have so many authors: numerous individuals are needed to represent the range of expertise required to understand climate change, with each individual's knowledge contributing a small amount to a much larger web (see Hardwig, 1985). More directly, the hypothesis accounts for the illusion of explanatory depth – the finding that respondents asked to estimate their own understanding of an object both before and after explaining how it works downgrade their estimates. This effect has been shown for ordinary artifacts and natural phenomena (Rozenblit & Keil, 2002), psychiatric disorders (Zeveney & Marsh, 2016), historical events (Gaviria et al., 2017) and political policies (Fernbach et al., 2013; Vitriol & Marsh, 2018; Voelkel et al., 2018), including climate-protective behavioral nudges (Bromme et al., 2016). The community of knowledge hypothesis holds that this overestimation occurs because people feel that they understand an object when someone else does (i.e., that they conflate others’ knowledge with their own; for a discussion, see Rabb et al., 2019). In principle, this sense of understanding is often sufficient for action; as long as an individual has the general (macro) causal structure right, the mechanistic (micro) details are not important because those details typically can be retrieved if needed. Some findings from climate change studies suggest this dynamic. Mumpower et al. (2016) found that ratings of the extent to which scientists understand climate change consequences significantly (albeit weakly) predicted perceived climate change risk, while respondents’ actual climate change knowledge did not. And the fact that awareness of scientific consensus sometimes moves people's climate change beliefs (Lewandowsky et al., 2013; van der Linden et al., 2015) can be interpreted as evidence of people's reliance on collective knowledge.

Such consensus information can be overridden by partisan cues (Bolsen & Druckman, 2018), as the cultural cognition model would expect. According to the community of knowledge hypothesis, revealing a community's position on an issue should have an influence – indeed, a big influence – on attitudes precisely because it is this community that has much of the knowledge. This explains the importance of partisanship. But the fact that perceived (as opposed to actual) understanding is sensitive to the knowledge of one's community (Sloman & Rabb, 2016) implies that the most miscalibrated individuals are the most dependent on their community's knowledge. This explains why the most miscalibrated are more likely to assume the perspectives belonging to their community in their purest (most extreme) form (Motta et al., 2018; van Prooijen et al., 2018; Fernbach et al., 2019). A converse result is that people who are made to appreciate their own miscalibration sometimes moderate the extremity of their attitudes in a repeat test (Fernbach et al., 2013; but see Voelkel et al., 2018). In sum, there are reasons to suspect that perceived understanding, which covaries with others’ understanding, plays a role in attitude formation independent of both group identity and actual understanding.

The issues here have practical significance. If the community of knowledge hypothesis is correct, then the vast majority of the knowledge that should influence attitudes about climate change is necessarily held by a community that includes large numbers of experts and is out of reach of individuals, certainly of non-experts. The goal of public education should therefore not be to endow individuals with a rich understanding of climate change science or the cognitive skills required to evaluate claims. Instead, the goal should be to facilitate the minimal individual understanding sufficient to accept the coarse causal accounts provided by those who do understand thoroughly, thereby fostering trust in their expertise.

Current experiments

The current experiments first address whether people's sense of understanding of political policies is affected by learning that others understand them. We also investigate whether the effect follows ideological lines. There are two reasons to think that it would: trust in scientific expertise has become a partisan issue, with liberals expressing more faith in science than conservatives (Hamilton et al., 2015; Funk & Kennedy, 2016); and understanding judgments are metacognitive assessments, other varieties of which can be influenced by partisan cues (Ramirez & Erickson, 2014; Graham, 2018). Of course, veridical causal knowledge is inherently useful for interacting with one's environment, regardless of its source. If the community of knowledge view is correct, then people can incur real costs when they refuse to depend on others’ accurate knowledge simply because those others come from different ideological groups.

We also use the contagious understanding paradigm to compare the three accounts of the relation between knowledge and attitudes reviewed above (see Figure 1 & Table 1). The deficit model states that people's attitudes are governed by how much they personally know and do not know. As such, it makes no predictions about a contagious sense of understanding; rather, it implies that people's sense of understanding is governed by their actual understanding, and that this actual understanding will predict attitudes. The cultural cognition model predicts that people will respond positively to claims supported by their community and negatively to claims it rejects. Although it too makes no predictions about the relationship between perceived understanding and individual attitudes, its emphasis on adherence to group attitudes suggests a reverse relation – that the groups one affiliates with (and their associated attitudes) will influence sense of understanding; perceived understanding judgments would therefore reflect a kind of expressive responding (Hamlin & Jennings, 2011). The community of knowledge view predicts that perceived understanding will be contagious whenever the external source of that understanding has some expertise – that is, whenever it is an agent who can be trusted to have relevant knowledge – and that both perceived understanding and group identification will influence attitudes.

Figure 1. Three views of the determinants of attitudes in scientifically informed policy domains. (I) The deficit model predicts that actual understanding will determine both one's sense of understanding and one's attitudes. (II) The cultural cognition model predicts that group attitudes will determine individual attitudes (and arguably one's sense of understanding, although it makes no explicit predictions about metacognition). The model posits a variety of information-processing mechanisms (motivated reasoning, affective weighting, etc.). These are subsumed by the causal relation shown because they require an a priori conclusion (in this case, the attitudes of the group) to guide reasoning. (III) The community of knowledge hypothesis predicts that individual attitudes will be sensitive to both group attitudes (since the group possesses most of the knowledge) and one's sense of understanding (which is partly determined by group knowledge). These two factors are often aligned, but they diverge for some groups in controversial domains like climate change. On this model, individual knowledge is subsumed by group knowledge because of their interdependence.

Table 1. Predictions for three models tested by Experiments 1–4 along with broad characterization of findings.

Experiments 1a, 1b and 2

Experiment 1a served as an initial test of whether people experience a contagious sense of understanding in the domain of policy and, if they do, whether it is susceptible to partisan influence. If it is, we might expect liberal respondents (who typically report more trust in science) and those who oppose President Trump (who has publicly questioned traditional experts; see Footnote 1) to show the effect, but conservative respondents and those who support Trump not to. We also tested whether patterns would differ depending on whether a policy concerns controversial or noncontroversial domains, and whether the people said to understand the domain are traditional experts or nontraditional but plausible ones.

Experiment 1b replicated the findings of Experiment 1a in a within-participants design, where participants could in principle notice the knowledge manipulation and thus engage in intentional expressive responding. That is, liberals might show a contagious sense of understanding effect because they wish to demonstrate their trust in scientists, while conservatives would show none.

Experiment 2 further examined partisan influence by varying the stated purpose of the survey. Partisan patterns in the contagious sense of understanding should be more likely to emerge when respondents believe they are being queried for a study about political opinion rather than for nonpolitical research about understanding how things work (cf. Unsworth & Fielding, 2014).

Method

We used the TurkPrime panel to recruit equal numbers of conservative and liberal adult US respondents from Amazon's Mechanical Turk, a convenience sample that has been shown to be representative of the larger population on various psychological dimensions when sorted by political ideology (Clifford et al., 2015). The contagious sense of understanding has been repeatedly replicated but is typically a small-to-medium-sized effect; sample sizes (ns ≈ 400) were selected to achieve power to detect a medium effect (d = 0.4, α = 0.05, 1 – β = 0.8) within ideological groups.
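The stated sample-size target can be roughly reproduced with a standard power calculation. Below is a minimal stdlib-Python sketch using the normal approximation to the two-sample t-test (the authors likely used dedicated power software, so results are approximate; the function name is ours):

```python
from math import ceil
from statistics import NormalDist

def n_per_group(d, alpha=0.05, power=0.80):
    """Approximate per-group n for a two-sample t-test (normal approximation):
    n >= 2 * ((z_{1-alpha/2} + z_{power}) / d) ** 2."""
    z_alpha = NormalDist().inv_cdf(1 - alpha / 2)  # ≈ 1.96 for alpha = 0.05
    z_power = NormalDist().inv_cdf(power)          # ≈ 0.84 for power = 0.80
    return ceil(2 * ((z_alpha + z_power) / d) ** 2)

# Medium effect (d = 0.4): ~99 respondents per community-knowledge cell,
# so 2 cells x 2 ideological groups x ~99 ≈ 400 respondents per experiment.
n = n_per_group(0.4)
```

This reproduces the paper's ns ≈ 400 once the two between-participants community-knowledge cells within each of the two ideological groups are counted.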

Instructions adapted from Rozenblit and Keil (2002) explained how to use the rating scale to reflect different levels of understanding: three examples of what one might know about crossbows illustrated the depth of causal knowledge that ratings of one, four or seven should indicate. Participants then read passages about various political policies (see Table 2). No information about how a given policy might work was provided, but relevant experts were said to understand (CK, for Community Knowledge condition) or not to understand (no-CK) how the policy will influence subsequent events. Issue contentiousness was manipulated by using policies that concerned hot-button issues (e.g., immigration) or non-hot-button issues (e.g., highway improvements). The type of experts that did or did not understand the issue was also manipulated; some were scientists or academics (traditional), while others were nonacademic but plausibly knowledgeable groups (nontraditional; e.g., homeland security officers). After reading each item, participants rated on a seven-point scale how well they understood how the policy would influence subsequent events. Experiment 2 added a survey purpose manipulation: half of the participants were told that the survey was political (concerning “people's views about political issues”), while the other half were told that it was nonpolitical (“how people understand complex objects and events”). This manipulation appeared both in the instructions and at the top of each policy understanding page.
All experiments concluded with four additional measures: beliefs in the extent to which scientists possess expertise that should influence policy (1 = not at all to 5 = very much), with Experiments 1b and 2 adding a comparable measure for nontraditional experts; political ideology (1 = very conservative to 7 = very liberal); opposition to President Trump (1 = strongly support to 7 = strongly against); and basic demographic information, including party affiliation. Follow-up tests indicated that the TurkPrime selection procedure was effective; respondents’ self-reported party affiliations and political ideologies were consistent with experimental groupings.

Table 2. Example items. Italicized text indicates phrases that were experimentally manipulated for the expert type and community knowledge factors. See Supplementary Materials for complete items.

Results and discussion

Participants who took the survey more than once or reported having taken a similar experiment previously were excluded from the analysis (after exclusions: n exp.1a = 382, M age = 38.6, SD = 10.9, 55.8% female; n exp.1b = 388, M age = 38.2, SD = 11.6, 52.2% female; n exp.2 = 235 (see Footnote 2), M age = 38.8, SD = 13.3, 51.9% female). Unsurprisingly, opposition to President Trump split along ideological lines (Experiment 1a: M conservative = 2.83, M liberal = 6.23, t(380) = –19.58, p < 0.001, d = 2.0; Experiment 1b: M conservative = 2.96, M liberal = 6.02, t(386) = –16.87, p < 0.001, d = 1.72; Experiment 2: M conservative = 2.52, M liberal = 6.2, t(233) = –3.68, p < 0.001, d = 2.39).
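As a quick consistency check on such reports, Cohen's d for an independent-samples comparison can be recovered from the t statistic and its degrees of freedom under the assumption of roughly equal group sizes (a textbook approximation, not a procedure the authors describe); it reproduces the d values reported for Experiments 1a and 1b:

```python
from math import sqrt

def cohens_d_from_t(t, df):
    """Approximate Cohen's d from an independent-samples t statistic,
    assuming roughly equal group sizes: d ≈ 2|t| / sqrt(df)."""
    return 2 * abs(t) / sqrt(df)

# Experiment 1a Trump-opposition split: t(380) = -19.58 → d ≈ 2.0
d_1a = cohens_d_from_t(-19.58, 380)
# Experiment 1b: t(386) = -16.87 → d ≈ 1.72
d_1b = cohens_d_from_t(-16.87, 386)
```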

Mean understanding ratings from all experiments are shown in Figure 2. Here, we summarize key results; any unreported main effects or interactions were nonsignificant (see Supplementary Materials for complete analysis of variance (ANOVA) tables). For Experiment 1a, an ANOVA with issue contentiousness (hot-button/non-hot-button) as a within-participants factor and community knowledge (no-CK/CK), expert type (traditional/nontraditional) and ideology (conservative/liberal) as between-participants factors showed a main effect of community knowledge, F(1, 374) = 4.20, p = 0.041, ηp² = 0.011, M no-CK = 2.67, M CK = 2.94, but no community knowledge by ideology interaction, F < 0.001, p = 0.99. A main effect of issue contentiousness, F(1, 374) = 11.57, p = 0.001, ηp² = 0.03, was due to slightly higher reported understanding for hot-button (M = 2.88) than for non-hot-button (M = 2.74) issues.
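The reported effect sizes follow directly from the F ratios: for a one-degree-of-freedom effect, partial eta squared is an exact algebraic function of F and the error degrees of freedom. A short sketch (function name ours) verifying the Experiment 1a values:

```python
def partial_eta_squared(F, df_effect, df_error):
    """Partial eta squared recovered from an F ratio:
    eta_p^2 = (F * df_effect) / (F * df_effect + df_error)."""
    return (F * df_effect) / (F * df_effect + df_error)

# Experiment 1a, community knowledge: F(1, 374) = 4.20 → eta_p^2 ≈ 0.011
eta_ck = partial_eta_squared(4.20, 1, 374)
# Experiment 1a, issue contentiousness: F(1, 374) = 11.57 → eta_p^2 ≈ 0.03
eta_ic = partial_eta_squared(11.57, 1, 374)
```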

Figure 2. Mean understanding judgments of public policies when experts ostensibly do (dark) or do not (light) understand the policies. Top row: mean understanding by ideological group in all experiments. Middle row (Experiment 3): mean understanding of policies concerning hot-button issues when the stated purpose of the experiment was “human-caused climate change” (left) or “complex objects or events” (right) collapsed over ideological group. Bottom row (Experiment 4): mean understanding of climate change-related policies under the same survey purpose conditions collapsed over ideological group. All comparisons p < 0.05 except climate change/climate change condition, p = 0.239. Error bars show SEM.

The overall pattern was replicated with community knowledge manipulated within participants in Experiment 1b: a main effect of community knowledge, M no-CK = 2.65, M CK = 3.59, F(1, 188) = 73.41, p < 0.001, ηp² = 0.28, but no ideology by community knowledge interaction, F = 0.02, p = 0.881. Again, a main effect of issue contentiousness, F(1, 188) = 7.77, p = 0.006, ηp² = 0.04, was due to slightly higher understanding of hot-button (M = 3.2) than non-hot-button (M = 3.04) issues. Issue contentiousness interacted with ideology and community knowledge, F(1, 188) = 5.57, p = 0.019, ηp² = 0.029, because conservatives (M = 2.75) reported slightly higher understanding than liberals (M = 2.5) for hot-button issues not understood by experts.

Experiment 2 also showed the same pattern (main effect of community knowledge, M no-CK = 3.07, M CK = 3.81, F(1, 137) = 45.48, p < 0.001, ηp² = 0.25; no ideology by community knowledge interaction, F = 0.06, p = 0.8). The survey purpose manipulation showed no effect (F = 0.62, p = 0.434) and did not interact with any other variables. Issue contentiousness interacted with ideology, F(1, 137) = 5.82, p = 0.017, ηp² = 0.041, such that conservatives reported greater understanding of hot-button (M = 3.72) relative to non-hot-button (M = 3.38) issues, while liberals did not (M hot-button = 3.3, M non-hot-button = 3.36).

These results indicate that the contagious sense of understanding is robust for policy: people reported greater understanding of policies when experts were said to understand them, for both controversial and mundane issues, for traditional and nontraditional experts, in political and nonpolitical surveys, and in within- and between-participants designs. Most importantly, they did so regardless of their own location on the ideological spectrum. In contrast, explicit beliefs about whether scientists possess knowledge that should inform policy did vary by ideology, although the difference was moderate (see Footnote 3) (Experiment 1a: M conservative = 3.35, M liberal = 4.04, t(380) = –7.97, p < 0.001, d = 0.82; Experiment 1b: M conservative = 3.52, M liberal = 4.15, t(386) = –7.45, p < 0.001, d = 0.75; Experiment 2: M conservative = 3.39, M liberal = 4.02, t(233) = –5.3, p < 0.001, d = 0.69), and liberals were less sanguine about nontraditional experts informing policy (Experiment 1b, nontraditional: M conservative = 3.14, M liberal = 3.32, t(386) = –2.1, p = 0.035, d = 0.21; Experiment 2, nontraditional: M conservative = 3.2, M liberal = 3.19, t(233) = 0.034, p = 0.969, d = 0.01). Moreover, trust in scientists to inform policy was negatively associated with both conservatism (r exp.1a = –0.47, p < 0.001; r exp.1b = –0.4, p < 0.001; r exp.2 = –0.47, p < 0.001) and support for President Trump (r exp.1a = –0.47, p < 0.001; r exp.1b = –0.37, p < 0.001; r exp.2 = –0.44, p < 0.001). In sum, respondents who were less in favor of expert knowledge informing policy nevertheless reported a greater sense of understanding of how policies work when they believed these experts understood them (see Footnote 4).

Experiment 3

Skepticism of human-caused climate change is uniquely predicted by political ideology when compared with skepticism of genetically modified foods and vaccine safety (Rutjens et al., 2018), and the partisan gap in support for spending on broad concerns (e.g., defense, schools, poverty) has widened more for the environment than for any other issue over the last 30 years in the USA (Egan & Mullin, 2017). If any issue is capable of violating the nonpartisan pattern of contagious sense of understanding, we would expect it to be climate change. Although some policies (post-hurricane relief, seawall construction) in the previous experiments concerned climate issues, Experiment 3 introduces a policy item that refers to climate change by name. As in all of the other items, the details of the policy are intentionally absent. The experiment also measures attitudes toward the hot-button policies and potentially strengthens the survey purpose manipulation by stating in the political condition that the study concerns anthropogenic climate change, rather than political views more generally.

Method

Experiment 3 used a modification of the design of Experiment 2. First, one hot-button item was replaced with a climate change policy (see Table 2). Second, the survey purpose manipulation stated the experiment was about “human-caused climate change” in the political condition. As in Experiment 2, this manipulation appeared both in the instructions and above every policy understanding judgment. The climate change policy appeared first in the climate change purpose condition and last in the understanding purpose condition. Third, four follow-up measures probing participants’ support for the individual hot-button policies (e.g., “How would you rate your support for a bill to build infrastructure that would reduce the impact of climate change on coastal areas?”) were added at the end of the experiment. We dropped the expert type manipulation since this factor had not made a difference in the preceding experiments.

Results and discussion

Exclusion criteria from previous experiments were used (n = 383; M age = 39.1, SD = 12.6; 48.8% female). Trump opposition followed ideological lines (M conservative = 2.27, M liberal = 6.69, t(381) = –4.42, p < 0.001, d = 3.53).

An ANOVA on understanding judgments revealed a main effect of community knowledge, M no-CK = 2.63, M CK = 3.49, F(1, 379) = 133.82, p < 0.001, η p2 = 0.26, but no ideology by community knowledge interaction (F = 0.75, p = 0.386). Issue contentiousness showed a reliable effect, F(1, 379) = 47.44, p < 0.001, η p2 = 0.11, and interacted with survey purpose, F(1, 379) = 8.15, p = 0.005, η p2 = 0.021, such that the difference between understanding ratings of non-hot-button and hot-button issues was larger in the understanding purpose condition (M hot-button = 3.18, M non-hot-button = 2.78; t(190) = 7.13, p < 0.001, d = 0.52) than in the climate change purpose condition (M hot-button = 3.23, M non-hot-button = 3.07; t(191) = 2.79, p = 0.006, d = 0.2).

Nevertheless, inspection of means shows that the survey purpose manipulation did affect one item: climate change policy. In the climate change purpose condition, the contagious sense of understanding was eliminated for conservatives, M no-CK = 3.02, M CK = 3.27, t(90) = 0.69, p = 0.491, d = 0.14, and, surprisingly, for liberals as well, M no-CK = 2.98, M CK = 3.14, t(98) = 0.96, p = 0.339, d = 0.19 (see Figure 2).

Beliefs about scientists’ expertise informing policy followed the now-familiar pattern, varying by ideology (M conservative = 3.50, M liberal = 4.4, t(381) = –10.77, p < 0.001, d = 1.09) and negatively correlating with conservatism (r = –0.52, p < 0.001) and support for Trump (r = –0.54, p < 0.001). Measures of support for the specific hot-button policies also showed a predictable partisan pattern (see Figure 3). Relative to conservatives, liberals were more supportive of legislation on climate change (M conservative = 3.3, M liberal = 4.37, t(381) = –11.23, p < 0.001, d = 1.15), assault rifle attachments (M conservative = 3.04, M liberal = 4.51, t(381) = –12.5, p < 0.001, d = 1.28) and immigration (M conservative = 2.81, M liberal = 4.12, t(381) = –11.94, p < 0.001, d = 1.22); this ideological difference shrank on the issue of protecting natural gas reserves (M conservative = 3.64, M liberal = 3.87, t(381) = –2.51, p = 0.013, d = 0.26).

Figure 3. Support for political policies in Experiments 3 and 4 by ideological group (light = liberal, dark = conservative). All comparisons p < 0.05. Error bars show SEM.

Experiment 4

Results thus far indicate that community knowledge has a reliable and nonpartisan effect on understanding judgments in the domain of public policy, including climate-related policies; the sole exception was the climate change item when the survey was explicitly framed as a study of human-caused climate change. Despite clear ideological divisions in support for the policies and in enthusiasm for scientists’ knowledge informing them, there was no comparable division in the contagious sense of understanding conferred by awareness of this knowledge. This is a departure from the many partisan differences, including metacognitive assessments, reported in the literature.

Yet several questions remain. First, the deficit and cultural cognition models seem incomplete because neither view predicts the contagious sense of understanding seen here or the knowledge dynamics discussed above. But defenders of the deficit model would be right to point out that we did not measure actual knowledge, so relationships between domain knowledge, sense of understanding and attitudes are unknown. Moreover, the observed partisan divisions in policy support are predicted by the cultural cognition model, so the fact that one's sense of understanding varies with awareness of others’ knowledge, although interesting in its own right, says little about the attitudes that likely determine action. Second, it is possible that respondents assumed the experts in our scenarios were politically neutral or members of their own ideological groups. If only neutral or ideologically consistent knowledge sources increase people's sense of understanding, then the cultural cognition model could explain the apparently nonpartisan pattern of understanding judgments. An important question is whether ideologically inconsistent knowledge sources confer a contagious sense of understanding, and if so, how that understanding influences attitudes. To our knowledge, this question has not been addressed in the literature, although the direct effects of in-group status on argument persuasiveness (e.g., Mackie et al., 1990) are consistent with a cultural cognition interpretation of our data. Finally, liberals and conservatives respond quite differently to climate policies intended to adapt to as opposed to mitigate climate change (Campbell & Kay, 2014). Different patterns of understanding might emerge depending on this important dimension of climate policy. Experiment 4 addresses these questions by adding measures of climate change knowledge and perceived scientist ideology and by testing both adaptation and mitigation policies.

Method

Experiment 4 varied the previous design in several ways. Community knowledge was manipulated between rather than within participants. All policy understanding items concerned climate change; we developed three adaptation and three mitigation policy scenarios based on policies examined by Bateman and O'Connor (2016). The experiment included the survey purpose manipulation from Experiment 3, here strengthened by mentioning human-caused climate change in the climate change condition policy items only. After the main task, participants supplied demographics and then rated their overall understanding of climate change (“How well do you understand how global warming and climate change work?”, using the same scale as the understanding judgments), their support for the six policies, the appropriateness of scientists informing policy and the scientists’ ideologies (“We have discussed the policy understanding of a number of scientists and experts. In general, what sort of political views and beliefs do you think these scientists hold on average?”, using the same scale as political ideology). Participants were then instructed not to use websites or other sources and answered 10 true-or-false questions about the science of climate change adapted from studies that assessed actual knowledge (Mumpower et al., 2016; Shi et al., 2016; Ranney & Clark, 2016); each question was followed by a confidence judgment. The experiment concluded with measures of political leaning and support for President Trump.

Results and discussion

Exclusion criteria from previous experiments were used (n = 379; M age = 36.9, SD = 11.4; 50.4% female). Trump opposition (M conservative = 2.47, M liberal = 6.66, t(377) = –4.2, p < 0.001, d = 3.05) and trust in scientists to inform policy (M conservative = 3.5, M liberal = 4.44, t(377) = 9.57, p < 0.001, d = 0.98; correlation with Trump support r = –0.509, p < 0.001) followed ideological lines, as did self-reported climate change knowledge (M conservative = 4.1, M liberal = 4.5, t(377) = –3.08, p = 0.002, d = 0.31), although assessed (actual) knowledge did not (M conservative = 66% correct, M liberal = 68% correct, t(374) = –1.3, p = 0.186, d = 0.14). Unsurprisingly, conservatives (M = 3.46) supported climate policies less than liberals did (M = 4.21, t(377) = 10.01, p < 0.001, d = 1.03); this pattern was stable across individual policies and policy types (all p-values < 0.001; see Figure 3).

As in all previous experiments, an ANOVA on mean policy understanding judgments showed a main effect of community knowledge, F(1, 371) = 36.2, p < 0.001, η p2 = 0.089, but no community knowledge by ideology interaction, F < 1, p = 0.837. A main effect of survey purpose, F(1, 371) = 5.08, p = 0.025, η p2 = 0.014, was due to higher ratings in the climate change purpose condition (M = 3.0) than in the understanding purpose condition (M = 2.7). Policy type also showed a main effect, F(1, 371) = 9.69, p = 0.002, η p2 = 0.025, with mitigation measures (M = 2.92) eliciting higher understanding ratings than did adaptation measures (M = 2.78), but policy type was not involved in two- or three-way interactions with ideology and community knowledge (F-values < 1). In sum, all climate change policies showed a nonpartisan contagious sense of understanding (see Figure 2). We speculate that Experiment 3's null result for climate change in the political condition was due to that experiment's within-participants design allowing respondents to notice the community knowledge manipulation.

We also examined whether participants’ sense of understanding of climate change itself was sensitive to the community knowledge manipulation. After all, the community of knowledge hypothesis holds that people's sense of understanding is increased by ambient awareness of community understanding, so hearing repeatedly that various natural scientists understand climate policies could boost people's sense of understanding of a critical component of climate policy: the science of climate change itself. Although reported climate change understanding was higher for liberals than for conservatives, as noted above, it was also higher in the community knowledge (M = 4.44) than in the no community knowledge condition (M = 4.16; F(1, 371) = 4.6, p = 0.033, η p2 = 0.012), and the two variables did not interact (F(1, 371) = 0.01, p = 0.956, η p2 = 0).

We next tested the predictions made by the three accounts of attitude formation using a linear regression with mean policy understanding judgments, ideological group and assessed knowledge scores as independent variables and mean policy support as the dependent variable (R 2 = 0.249), F(3, 372) = 41.13, p < 0.001. In violation of the deficit model, assessed knowledge did not predict attitudes, β = –0.046, p = 0.313. (Also against this model, assessed knowledge was unrelated to perceived knowledge, r = 0.047, p = 0.362.) Ideology did predict attitudes, β = 0.463, p < 0.001, which is expected under the cultural cognition model, but so did rated understanding, β = 0.19, p < 0.001, which is not. One issue with these analyses is that we are comparing the variance in attitudes attributable to actual understanding of climate change science with variance attributable to participants’ sense of understanding of climate change policies. This confounds actual versus perceived understanding with knowledge domain, even though the two domains overlap. We therefore reran the regression with self-reported climate change understanding replacing mean policy understanding. The results were nearly identical, showing that perceived but not actual knowledge of climate change predicts attitudes toward climate policy independently of ideology. This suggests that the cultural cognition model is incomplete; people's sense of understanding, which has been shown to vary with community knowledge, partly accounted for the explained variance in climate change policy support, over and above respondents’ ideological groups. The same pattern is seen in separate regressions for adaptation and mitigation policies; when the dummy-coded ideological group variable (liberal, conservative) is replaced by the continuous measure of political leaning (1–7 conservative to liberal scale); and when age, gender and education are added to the model. 
Regression tables for all of these models are provided in the Supplementary Materials.
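The model comparison above can be sketched computationally. The following is a minimal illustration with simulated data (not the authors' dataset): it regresses policy support on sense of understanding, ideological group and assessed knowledge, all z-scored, so that the least-squares coefficients are standardized betas comparable to those reported. The data-generating values are invented solely to mirror the reported pattern (ideology and sense of understanding matter; assessed knowledge does not).

```python
import numpy as np

rng = np.random.default_rng(0)
n = 379  # sample size as in Experiment 4

# Hypothetical stand-ins for the three predictors:
# ideological group (0 = conservative, 1 = liberal), assessed climate
# knowledge (proportion correct) and mean policy understanding (1-7).
ideology = rng.integers(0, 2, n).astype(float)
knowledge = rng.uniform(0.3, 1.0, n)
understanding = rng.uniform(1, 7, n)

# Simulate policy support so that ideology and sense of understanding
# carry signal but assessed knowledge does not.
support = 2.0 + 0.9 * ideology + 0.25 * understanding + rng.normal(0, 0.5, n)

def standardized_betas(y, predictors):
    """OLS on z-scored variables; returns standardized coefficients."""
    z = lambda v: (v - v.mean()) / v.std()
    X = np.column_stack([z(p) for p in predictors])
    beta, *_ = np.linalg.lstsq(X, z(y), rcond=None)
    return beta

betas = standardized_betas(support, [understanding, ideology, knowledge])
# betas[0], betas[1] are reliably positive; betas[2] hovers near zero.
```

With data generated this way, the knowledge coefficient stays near zero while the other two do not, which is the qualitative signature the regression in the text reports.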

We also tested whether participants’ beliefs about the ideological leanings of the experts that caused the contagious sense of understanding affected the results. Overall, liberals did consider these experts more liberal (M = 5.05) than conservatives did (M = 4.15; t(377) = –6.07, p < 0.001, d = 0.63). We therefore reran the main ANOVA including just those participants who responded either above the scale midpoint (i.e., liberals and conservatives who thought the experts were liberal, n = 227) or at the midpoint (liberals and conservatives who thought the experts were politically neutral, n = 75). Again, we observe the same basic pattern: a main effect of community knowledge (p-values ≤ 0.001), but no interaction with ideology (p-values ≥ 0.231). To compare these results with the findings for attitudes, we created a new variable indicating whether a given participant's ratings of the experts’ ideologies placed them outside of their own ideological group (i.e., conservatives who considered experts to be liberal and vice versa). Adding this variable to the regression models reported above did not change the pattern; sense of understanding (β = 0.19, p < 0.001) and ideology (β = 0.439, p < 0.001) still independently predicted policy support, while assessed knowledge (β = –0.042, p = 0.36) and ideological in-group status of knowledge sources (β = –0.052, p = 0.302) did not.
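The out-group source variable described above is a simple recoding. The sketch below shows one way to construct it from hypothetical participant records (invented values, not the authors' data): a participant counts as having an ideologically inconsistent knowledge source when a conservative rates the experts above the scale midpoint (liberal) or a liberal rates them below it (conservative).

```python
import numpy as np

# Hypothetical records: ideological group (0 = conservative, 1 = liberal)
# and each participant's rating of the experts' average ideology on the
# same 1-7 scale as political leaning (4 = politically neutral).
group = np.array([0, 0, 1, 1, 0])
expert_rating = np.array([6.0, 4.0, 2.0, 5.0, 3.0])

midpoint = 4.0
# Out-group indicator: conservatives who judged the experts liberal,
# or liberals who judged the experts conservative.
out_group = ((group == 0) & (expert_rating > midpoint)) | \
            ((group == 1) & (expert_rating < midpoint))
# -> [True, False, True, False, False]
```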

Finally, we examined miscalibration and attitude extremity. Following Fernbach et al. (2019), we operationalized miscalibration by subtracting actual climate science understanding (mean-centered climate change knowledge assessment scores) from sense of climate science understanding (mean-centered ratings of general climate change understanding). We operationalized attitude extremity by subtracting the scale midpoint from policy support ratings, removing signs and averaging the resulting values to generate individual extremity scores (Fernbach et al., 2013). A linear regression (R 2 = 0.093), F(3, 372) = 12.72, p < 0.001, showed ideology (β = 0.247, p < 0.001) and miscalibration (β = 0.151, p = 0.021) separately predicted attitude extremity, while their interaction did not (β = 0.02, p = 0.761). The same result obtains when miscalibration is operationalized as: (1) the extent to which participants were overconfident (mean confidence ratings in the knowledge assessment task minus mean performance on that measure); or (2) the gap between participants’ sense of understanding climate policy and their own knowledge of climate science (mean understanding from the policy judgment task minus performance on the knowledge assessment). In short, the gap between perceived and assessed knowledge predicted attitude extremity independently of ideology no matter how that gap was quantified.
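The two scores defined in this paragraph reduce to a few lines of arithmetic. The sketch below illustrates them on a handful of hypothetical participants (invented ratings, not the authors' data): miscalibration as mean-centered perceived understanding minus mean-centered assessed understanding, and extremity as the average distance of policy-support ratings from the midpoint of a 1-7 scale.

```python
import numpy as np

# Hypothetical ratings for four participants.
perceived = np.array([6.0, 5.0, 3.0, 4.0])   # self-rated understanding, 1-7
assessed = np.array([0.5, 0.9, 0.7, 0.6])    # proportion correct on the quiz
support = np.array([[7, 6, 7],               # support ratings for 3 policies
                    [4, 4, 5],
                    [3, 4, 4],
                    [2, 1, 2]], dtype=float)

# Miscalibration: mean-centered perceived understanding minus
# mean-centered assessed understanding (positive = overestimation).
miscalibration = (perceived - perceived.mean()) - (assessed - assessed.mean())

# Attitude extremity: unsigned distance from the scale midpoint (4 on a
# 1-7 scale), averaged across policies within each participant.
midpoint = 4.0
extremity = np.abs(support - midpoint).mean(axis=1)
```

Participant 1 here is both the most miscalibrated (high self-rating, chance-level quiz score) and the most extreme, the association the regression in the text quantifies.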

General discussion

Five experiments demonstrate a contagious sense of understanding of public policies. People reported higher understanding of policies related to climate change and other controversial issues when they believed that experts understand how they work, a pattern previously reported for natural phenomena, causal systems that are similarly complex and difficult for individual agents to understand (Sloman & Rabb, 2016). The effect was robust and impervious to experimental efforts to elicit the partisan patterns predicted by divisions in liberals’ and conservatives’ reported trust in science generally (Gauchat, 2012) and climate science in particular (Hamilton et al., 2015; Funk & Kennedy, 2016). The sole exception was climate change policy understanding in a context uniquely suited to expressive responding (Experiment 3). We take the broad pattern as evidence that the contagious sense of understanding is difficult to disrupt. This places understanding judgments in a rare class, since effects of partisan cues and differences associated with ideological groupings are seen in many other kinds of policy assessment (Cohen, 2003; Smith et al., 2012; Bolsen et al., 2014; Colombo & Kriesi, 2017; Ehret et al., 2018; Satherley et al., 2018).

In the case of climate change, whether society accepts or rejects the conclusions of collective wisdom is an urgent concern. We found that the contagious sense of understanding had downstream effects, predicting overall support for climate change policies independently of ideology, even though actual understanding of climate change did not. Moreover, extremity of policy support (both for and against) was predicted by the extent to which our US samples overestimated their own knowledge. The better calibrated respondents were, the more moderate were their views. An open question is whether the same would hold in other populations. Another open question is whether our operationalization of ideological group was too coarse. Although the attitudes toward climate change that cluster among hierarchical-individualists and egalitarian-communitarians show conspicuous overlap with those that cluster among conservatives and liberals, it is possible that replacing our unidimensional scales with the bidimensional measures common to cultural cognition studies would yield interactions.

What does this mean for climate policy? Actual knowledge of climate change did not vary with people's support for climate policies or location on the ideological spectrum in our sample (see also Stoutenborough & Vedlitz, 2014; Bedford, 2016), so the notion that people who disregard the scientific consensus on climate change or oppose climate action do so because they lack some necessary information was not supported. For these reasons, the deficit model cannot be the whole story; although this point is widely discussed, the model still underlies many climate science communication efforts (Suldovsky, 2017). A well-known alternative – the cultural cognition model – is supported by many findings, including some of our own. But the role of contagious sense of understanding in attitudes indicates that this model, which makes no obvious predictions about metacognitive judgments like understanding assessments except perhaps that they follow ideological lines, is also incomplete. Of course, these two models are not mutually exclusive (van der Linden et al., 2017). The community of knowledge hypothesis offers a means to integrate them. It gives room for a role for sense of understanding while proposing that most of the knowledge underwriting that sense is contained in the heads of members of one's community.

The community of knowledge hypothesis proposes that climate science information serves a purpose other than merely informing: it scaffolds people's impressions of collective understanding, impressions that do increase support over and above that accounted for by ideology. Impressions of collective understanding can come from other sources such as ‘science-y’ cues (Isberner et al., 2013; Giffin et al., 2017), and the effects of scientific consensus information on attitudes can also be interpreted this way (Lewandowsky et al., 2013; van der Linden et al., 2015). Although these effects are demonstrated in laboratory experiments where the cues are artificially varied, we emphasize that such cues are typically valid; that is, under normal circumstances, the ambient sense of understanding they create corresponds to actual scientific understanding. Of course, the fact that they can be manipulated in experiments means that they can be employed by bad faith actors, as with quasi-scientific efforts to obfuscate the dangers of smoking (see O'Connor & Weatherall, 2019). In sum, we do not suggest giving up on scientific communication. Rather, we offer a reconceptualization of its role, as a (true) indicator of collective understanding. Educational efforts are not futile on this view, but should be rethought. The goal should not be thorough individual understanding, or even critical reasoning skills, but the minimal individual understanding sufficient to accept coarse causal accounts provided by those who do have thorough understanding. Information, on this view, becomes a vehicle of trust.

Another virtue of providing information is that it can serve to show people how much they don't know. While this exercise may sound counterproductive, it does reduce attitude extremity in some cases (Fernbach et al., 2013). Conversely, people who are least aware of how much they don't know sometimes adopt the most extreme positions (Motta et al., 2018; van Prooijen et al., 2018; Fernbach et al., 2019). This pattern was evident in our data, too: extremity of policy ratings grew larger as the gap between perceived and assessed knowledge widened. While extreme support for climate policy may seem like a happy outcome, we nonetheless see merit in better calibrated individuals’ tendency toward less polarized positions. Rather than promoting extreme action on climate change at all costs, this would support thoughtful action based on science that would take into account other pressing social needs and openness to the views of experts, even when they clash with those of one's in-group.

Admittedly, causing people to appreciate the limits of their own knowledge is a double-edged sword. While it moderates extreme attitudes in some cases, it may seem condescending if done without humility or manipulative if the recipient doubts the messenger's intent. This raises important questions about source credibility, a critical issue in the psychology of climate change that has received some attention (cf., Rabinovich et al., 2012; Lachapelle et al., 2014; Vraga et al., 2018), but surely merits a systematic and sustained investigation. One hopeful note is that we found no evidence of ideological source effects on understanding judgments. Conservatives reported greater understanding of policies said to be understood by scientists whom they considered liberal and in turn showed increased support for climate policies. Defenders of cultural cognition would be right to point out that effect sizes of ideology on attitudes dwarfed those of the contagious sense of understanding. But it is not feasible to attempt to change ideological commitments, and perhaps not ethical either. Improving calibration and increasing awareness that all complex knowledge critically depends on others constitutes a compromise approach between fruitlessly dispensing information on the assumption that it speaks for itself and capitulating to the entrenched nature of group attitudes.

Supplementary material

To view supplementary material for this article, please visit https://doi.org/10.1017/bpp.2020.40.

Financial support

This work was funded by a grant from the program on Humility and Conviction in Public Life at the University of Connecticut and the John Templeton Foundation.

Footnotes

1 These experiments were all done during Donald Trump's term in office.

2 The number of exclusions in Experiment 2 (n = 170) was unusually high because respondents were not prevented from repeatedly taking the survey due to a coding error.

3 This moderation is consistent with previous findings (Gauchat, 2012).

4 Although ideological group showed a strong relationship with Trump support, the correlation was not perfect (r = 0.71, p < 0.001). To ensure that the nonpartisan pattern of understanding judgments was not driven by conservatives who oppose Trump, we reran the main analyses for this and all subsequent experiments on just those conservatives who voiced support for Trump; the pattern of results is the same.

References

Bak, H.J. (2001), ‘Education and public attitudes toward science: Implications for the “deficit model” of education and support for science and technology’, Social Science Quarterly, 82(4): 779–795.
Bateman, T.S. and O'Connor, K. (2016), ‘Felt responsibility and climate engagement: Distinguishing adaptation from mitigation’, Global Environmental Change, 41: 206–215.
Bedford, D. (2016), ‘Does climate literacy matter? A case study of US students’ level of concern about anthropogenic global warming’, Journal of Geography, 115(5): 187–197.
Bolsen, T. and Druckman, J.N. (2018), ‘Do partisanship and politicization undermine the impact of a scientific consensus message about climate change?’, Group Processes & Intergroup Relations, 21(3): 389–402.
Bolsen, T., Druckman, J.N. and Cook, F.L. (2014), ‘The influence of partisan motivated reasoning on public opinion’, Political Behavior, 36(2): 235–262.
Bromme, R., Thomm, E. and Ratermann, K. (2016), ‘Who knows? Explaining impacts on the assessment of our own knowledge and of the knowledge of experts’, Zeitschrift für Pädagogische Psychologie, 30(2–3): 97–108.
Campbell, T.H. and Kay, A.C. (2014), ‘Solution aversion: On the relation between ideology and motivated disbelief’, Journal of Personality and Social Psychology, 107(5): 809–824.
Clifford, S., Jewell, R.M. and Waggoner, P.D. (2015), ‘Are samples drawn from Mechanical Turk valid for research on political ideology?’, Research & Politics, 2(4): 2053168015622072.
Cohen, G.L. (2003), ‘Party over policy: The dominating impact of group influence on political beliefs’, Journal of Personality and Social Psychology, 85(5): 808–822.
Colombo, C. and Kriesi, H. (2017), ‘Party, policy – or both? Partisan-biased processing of policy arguments in direct democracy’, Journal of Elections, Public Opinion and Parties, 27(3): 235–253.
Drummond, C. and Fischhoff, B. (2017), ‘Individuals with greater science literacy and education have more polarized beliefs on controversial science topics’, Proceedings of the National Academy of Sciences, 114(36): 9587–9592.
Egan, P.J. and Mullin, M. (2017), ‘Climate change: US public opinion’, Annual Review of Political Science, 20(1): 209–227.
Ehret, P.J., Van Boven, L. and Sherman, D.K. (2018), ‘Partisan barriers to bipartisanship: Understanding climate policy polarization’, Social Psychological and Personality Science, 9(3): 308–318.
Fernbach, P.M., Light, N., Scott, S.E., Inbar, Y. and Rozin, P. (2019), ‘Extreme opponents of genetically modified foods know the least but think they know the most’, Nature Human Behaviour, 3(3): 251–256.
Fernbach, P.M., Rogers, T., Fox, C.R. and Sloman, S.A. (2013), ‘Political extremism is supported by an illusion of understanding’, Psychological Science, 24(6): 939–946.
Funk, C. and Kennedy, B. (2016), The politics of climate, retrieved from http://www.pewinternet.org/2016/10/04/the-politics-of-climate
Gauchat, G. (2012), ‘Politicization of science in the public sphere: A study of public trust in the United States, 1974 to 2010’, American Sociological Review, 77(2): 167–187.
Gaviria, C., Corredor, J.A. and Zuluaga-Rendón, Z. (2017), ‘“If it matters, I can explain it”: Social desirability of knowledge increases the illusion of explanatory depth’, Paper presented at the Annual Conference of the Cognitive Science Society.
Giffin, C., Wilkenfeld, D. and Lombrozo, T. (2017), ‘The explanatory effect of a label: Explanations with named categories are more satisfying’, Cognition, 168: 357–369.
Gilens, M. and Murakawa, N. (2002), ‘Elite cues and political decision-making’, Research in Micropolitics, 6: 15–49.
Graham, M.H. (2018), ‘Self-awareness of political knowledge’, Political Behavior: 1–22.
Guber, D.L. (2013), ‘A cooling climate for change? Party polarization and the politics of global warming’, American Behavioral Scientist, 57(1): 93–115.
Hamilton, L.C. (2015), ‘Polar facts in the age of polarization’, Polar Geography, 38(2): 89–106.
Hamilton, L.C. (2018), ‘Self-assessed understanding of climate change’, Climatic Change, 151(2): 349–362.
Hamilton, L.C., Hartter, J. and Saito, K. (2015), ‘Trust in scientists on climate change and vaccines’, Sage Open, 5(3): 2158244015602752.
Hamlin, A. and Jennings, C. (2011), ‘Expressive political behaviour: Foundations, scope and implications’, British Journal of Political Science, 41(3): 645–670.
Hardwig, J. (1985), ‘Epistemic dependence’, The Journal of Philosophy, 82(7): 335–349.
Hornsey, M.J., Harris, E.A., Bain, P.G. and Fielding, K.S. (2016), ‘Meta-analyses of the determinants and outcomes of belief in climate change’, Nature Climate Change, 6(6): 622–626.
Hornsey, M.J., Harris, E.A. and Fielding, K.S. (2018), ‘Relationships among conspiratorial beliefs, conservatism and climate scepticism across nations’, Nature Climate Change, 8(7): 614–620.
Isberner, M.B., Richter, T., Maier, J., Knuth-Herzig, K., Horz, H. and Schnotz, W. (2013), ‘Comprehending conflicting science-related texts: Graphs as plausibility cues’, Instructional Science, 41(5): 849–872.
Iyengar, S. and Westwood, S.J. (2015), ‘Fear and loathing across party lines: New evidence on group polarization’, American Journal of Political Science, 59(3): 690–707.
Kahan, D.M. and Braman, D. (2006), ‘Cultural cognition and public policy’, Yale Law & Policy Review, 24: 149–172.
Kahan, D.M., Jenkins-Smith, H. and Braman, D. (2011), ‘Cultural cognition of scientific consensus’, Journal of Risk Research, 14(2): 147–174.
Kahan, D.M., Peters, E., Dawson, E.C. and Slovic, P. (2017), ‘Motivated numeracy and enlightened self-government’, Behavioural Public Policy, 1(1): 54–86.
Kahan, D.M., Peters, E., Wittlin, M., Slovic, P., Ouellette, L.L., Braman, D. and Mandel, G. (2012), ‘The polarizing impact of science literacy and numeracy on perceived climate change risks’, Nature Climate Change, 2(10): 732–735.
Lachapelle, E., Montpetit, É. and Gauvin, J.P. (2014), ‘Public perceptions of expert credibility on policy issues: The role of expert framing and political worldviews’, Policy Studies Journal, 42(4): 674–697.
Lau, R.R. and Redlawsk, D.P. (2001), ‘Advantages and disadvantages of cognitive heuristics in political decision making’, American Journal of Political Science, 45(4): 951–971.
Leiserowitz, A., Maibach, E., Roser-Renouf, C., Rosenthal, S., Cutler, M. and Kotcher, J. (2018), Climate change in the American mind: March 2018, Yale University and George Mason University, New Haven, CT: Yale Program on Climate Change Communication.
Lewandowsky, S., Gignac, G.E. and Vaughan, S. (2013), ‘The pivotal role of perceived scientific consensus in acceptance of science’, Nature Climate Change, 3(4): 399–404.
Lupia, A. and McCubbins, M.D. (1998), The democratic dilemma, Cambridge University Press.
Mackie, D.M., Worth, L.T. and Asuncion, A.G. (1990), ‘Processing of persuasive in-group messages’, Journal of Personality and Social Psychology, 58(5): 812–822.
Marks, J., Copland, E., Loh, E., Sunstein, C.R. and Sharot, T. (2019), ‘Epistemic spillovers: Learning others’ political views reduces the ability to assess and use their expertise in nonpolitical domains’, Cognition, 188(1): 74–84.
Marquart-Pyatt, S.T., McCright, A.M., Dietz, T. and Dunlap, R.E. (2014), ‘Politics eclipses climate extremes for climate change perceptions’, Global Environmental Change, 29(1): 246–257.
McCright, A.M., Dunlap, R.E. and Xiao, C. (2013), ‘Perceived scientific agreement and support for government action on climate change in the USA’, Climatic Change, 119(2): 511–518.
McPhetres, J., Rutjens, B.T., Weinstein, N. and Brisson, J.A. (2019), ‘Modifying attitudes about modified foods: increased knowledge leads to more positive attitudes’, Journal of Environmental Psychology, 64(1): 21–29.
Motta, M., Callaghan, T. and Sylvester, S. (2018), ‘Knowing less but presuming more: Dunning-Kruger effects and the endorsement of anti-vaccine policy attitudes’, Social Science & Medicine, 211(1): 274–281.
Mumpower, J.L., Liu, X. and Vedlitz, A. (2016), ‘Predictors of the perceived risk of climate change and preferred resource levels for climate change management programs’, Journal of Risk Research, 19(6): 798809.CrossRefGoogle Scholar
Nicholson, S.P., Coe, C.M., Emory, J. and Song, A.V. (2016), ‘The politics of beauty: The effects of partisan bias on physical attractiveness’, Political Behavior, 38(4): 883898.CrossRefGoogle Scholar
O'Connor, C., and Weatherall, J. O. (2019), The misinformation age: How false beliefs spread, Yale University Press.CrossRefGoogle Scholar
Ortoleva, P. and Snowberg, E. (2015), ‘Overconfidence in political behavior’, American Economic Review, 105(2): 504–35.CrossRefGoogle Scholar
Poortinga, W., Whitmarsh, L., Steg, L., Böhm, G., and Fisher, S. (2019), ‘Climate change perceptions and their individual level determinants: A cross-European analysis’, Global Environmental Change, 55(1): 2535.CrossRefGoogle Scholar
Rabb, N., Fernbach, P., and Sloman, S. (2019), ‘Individual representation in a community of knowledge’, Trends in Cognitive Sciences, 23(10): 891902.CrossRefGoogle Scholar
Rabinovich, A., Morton, T.A. and Birney, M.E. (2012), ‘Communicating climate science: The role of perceived communicator's motives’, Journal of Environmental Psychology, 32(1): 1118.CrossRefGoogle Scholar
Ramirez, M.D. and Erickson, N. (2014), ‘Partisan bias and information discounting in economic judgments’, Political Psychology, 35(3): 401415.CrossRefGoogle Scholar
Ranney, M.A. and Clark, D. (2016), ‘Climate change conceptual change: Scientific information can transform attitudes’, Topics in Cognitive Science, 8(1): 4975.CrossRefGoogle ScholarPubMed
Rhodes, E., Axsen, J. and Jaccard, M. (2014), ‘Does effective climate policy require well-informed citizen support?’, Global Environmental Change, 29(1): 92104.CrossRefGoogle Scholar
Rollwage, M., Dolan, R.J. and Fleming, S.M. (2018), ‘Metacognitive failure as a feature of those holding radical beliefs’, Current Biology, 28(24): 40144021.CrossRefGoogle ScholarPubMed
Rozenblit, L. and Keil, F. (2002), ‘The misunderstood limits of folk science: An illusion of explanatory depth’, Cognitive Science, 26(5): 521562.CrossRefGoogle Scholar
Rutjens, B.T., Sutton, R.M. and van der Lee, R. (2018), ‘Not all skepticism is equal: Exploring the ideological antecedents of science acceptance and rejection’, Personality and Social Psychology Bulletin, 44(3): 384405.CrossRefGoogle ScholarPubMed
Satherley, N., Yogeeswaran, K., Osborne, D. and Sibley, C.G. (2018), ‘If they say “yes,” we say “no”: Partisan cues increase polarization over national symbols’, Psychological Science, 29(12): 19962009.CrossRefGoogle Scholar
Shi, J., Visschers, V.H. and Siegrist, M. (2015), ‘Public perception of climate change: The importance of knowledge and cultural worldviews’, Risk Analysis, 35(12): 21832201.CrossRefGoogle ScholarPubMed
Shi, J., Visschers, V.H., Siegrist, M. and Arvai, J. (2016), ‘Knowledge as a driver of public perceptions about climate change reassessed’, Nature Climate Change, 6(8): 759762.CrossRefGoogle Scholar
Sloman, S. and Fernbach, P. (2018), The knowledge illusion: Why we never think alone, Riverhead.Google Scholar
Sloman, S.A. and Rabb, N. (2016), ‘Your understanding is my understanding: Evidence for a community of knowledge’, Psychological Science, 27(11): 14511460.CrossRefGoogle ScholarPubMed
Smith, E.K. and Mayer, A. (2019), ‘Anomalous Anglophones? Contours of free market ideology, political polarization, and climate change attitudes in English-speaking countries, Western European and post-Communist states’, Climatic Change, 152(1): 1734.CrossRefGoogle Scholar
Smith, C. T., Ratliff, K. A., and Nosek, B. A. (2012), ‘Rapid assimilation: Automatically integrating new information with existing beliefs’, Social Cognition, 30(2): 199219.CrossRefGoogle Scholar
Stone, D.F. (2019), ‘“Unmotivated bias” and partisan hostility: Empirical evidence’, Journal of Behavioral and Experimental Economics, 79: 1226.CrossRefGoogle Scholar
Stoutenborough, J.W. and Vedlitz, A. (2014), ‘The effect of perceived and assessed knowledge of climate change on public policy concerns: An empirical comparison’, Environmental Science & Policy, 37(1): 2333.CrossRefGoogle Scholar
Stoutenborough, J.W. and Vedlitz, A. (2016), ‘The role of scientific knowledge in the public's perceptions of energy technology risks’, Energy Policy, 96(1): 206216.CrossRefGoogle Scholar
Suldovsky, B. (2017), The information deficit model and climate change communication. In Oxford research encyclopedia of climate science.CrossRefGoogle Scholar
Sundblad, E.L., Biel, A. and Gärling, T. (2009), ‘Knowledge and confidence in knowledge about climate change among experts, journalists, politicians, and laypersons’, Environment and Behavior, 41(2): 281302.CrossRefGoogle Scholar
Tobler, C., Visschers, V.H. and Siegrist, M. (2012), ‘Consumers’ knowledge about climate change’, Climatic Change, 114(2): 189209.CrossRefGoogle Scholar
Tranter, B. and Booth, K. (2015), ‘Scepticism in a changing climate: A cross-national study’, Global Environmental Change, 33(1): 154164.CrossRefGoogle Scholar
Unsworth, K.L. and Fielding, K.S. (2014), ‘It's political: How the salience of one's political identity changes climate change beliefs and policy support’, Global Environmental Change, 27(1): 131137.CrossRefGoogle Scholar
van der Linden, S.L., Leiserowitz, A.A., Feinberg, G.D. and Maibach, E.W. (2015), ‘The scientific consensus on climate change as a gateway belief: Experimental evidence’, PloS one, 10(2): p.e0118489.CrossRefGoogle ScholarPubMed
van der Linden, S., Maibach, E., Cook, J., Leiserowitz, A., Ranney, M., Lewandowsky, S., Árvai, J. and Weber, E.U. (2017), ‘Culture versus cognition is a false dilemma’, Nature Climate Change, 7(7): 457.CrossRefGoogle Scholar
van Prooijen, J.W., Krouwel, A.P. and Emmer, J. (2018), ‘Ideological responses to the EU refugee crisis: The left, the right, and the extremes’, Social Psychological and Personality Science, 9(2): 143150.CrossRefGoogle ScholarPubMed
Vitriol, J.A. and Marsh, J.K. (2018), ‘The illusion of explanatory depth and endorsement of conspiracy beliefs’, European Journal of Social Psychology, 48(7): 955969.CrossRefGoogle Scholar
Voelkel, J.G., Brandt, M.J. and Colombo, M. (2018), ‘I know that I know nothing: Can puncturing the illusion of explanatory depth overcome the relationship between attitudinal dissimilarity and prejudice?’, Comprehensive Results in Social Psychology, 3(1): 5678.CrossRefGoogle Scholar
Vraga, E., Myers, T., Kotcher, J., Beall, L., & Maibach, E. (2018), ‘Scientific risk communication about controversial issues influences public perceptions of scientists’ political orientations and credibility’, Royal Society Open Science, 5(2): 170505.CrossRefGoogle ScholarPubMed
Zaller, J.R. (1992), The Nature and Origins of Mass Opinion, Cambridge University Press.CrossRefGoogle Scholar
Zeveney, A. and Marsh, J. (2016), The illusion of explanatory depth in a misunderstood field: The IOED in mental disorders. Paper presented at the Annual Conference of the Cognitive Science Society.Google Scholar
Figure 1. Three views of the determinants of attitudes in scientifically informed policy domains. (I) The deficit model predicts that actual understanding will determine both one's sense of understanding and one's attitudes. (II) The cultural cognition model predicts that group attitudes will determine individual attitudes (and arguably one's sense of understanding, although it makes no explicit predictions about metacognition). The model posits a variety of information-processing mechanisms (motivated reasoning, affective weighting, etc.). These are subsumed by the causal relation shown because they require an a priori conclusion (in this case, the attitudes of the group) to guide reasoning. (III) The community of knowledge hypothesis predicts that individual attitudes will be sensitive to both group attitudes (since the group possesses most of the knowledge) and one's sense of understanding (which is partly determined by group knowledge). These two factors are often aligned, but they diverge for some groups in controversial domains like climate change. On this model, individual knowledge is subsumed by group knowledge because of their interdependence.

Table 1. Predictions for three models tested by Experiments 1–4 along with broad characterization of findings.

Table 2. Example items. Italicized text indicates phrases that were experimentally manipulated for the expert type and community knowledge factors. See Supplementary Materials for complete items.

Figure 2. Mean understanding judgments of public policies when experts ostensibly do (dark) or do not (light) understand the policies. Top row: mean understanding by ideological group in all experiments. Middle row (Experiment 3): mean understanding of policies concerning hot-button issues when the stated purpose of the experiment was “human-caused climate change” (left) or “complex objects or events” (right) collapsed over ideological group. Bottom row (Experiment 4): mean understanding of climate change-related policies under the same survey purpose conditions collapsed over ideological group. All comparisons p < 0.05 except climate change/climate change condition, p = 0.239. Error bars show SEM.

Figure 3. Support for political policies in Experiments 3 and 4 by ideological group (light = liberal, dark = conservative). All comparisons p < 0.05. Error bars show SEM.

Supplementary material: Rabb et al. supplementary material (File, 67 KB).