Science denialism permeates society. Though adamant anti-vaxxers and resolute flat Earthers may be small in number, many more people in the United States deny climate change and/or evolution (at least 50% and 33%, respectively1). And while scientists face public denial of well-supported theories, popular culture celebrates pseudoscience: Olympic athletes engage in cupping,Reference Carter2 “gluten-free” is trending (even among those without disorders like celiac diseaseReference Doheny3), and unsubstantiated alternative medicine methods flourish with support from cultural icons like Oprah.Reference Gorski4 Governments face furious opposition to fluoridated water (added to prevent tooth decay5), and popular restaurant chains, like Chipotle, proudly tout their opposition to genetically modified organisms (GMOs) (see https://www.chipotle.com/gmo; scientists stress that the focus should be on the risks and benefits of each specific product and that products should not be globally accepted or rejected based on the processes used to make them).Reference Tagliabue6
Moreover, the emergence of social media has provided a broad forum for the famous, not famous, and infamous alike to share and crowdsource opinions and even target misinformation to those who are most vulnerable.Reference Valdez7 This allows so-called fake news to go viral.Reference Kata8 Yet who is most susceptible to denying science and/or believing misinformation? In the current study, we consider the extent to which conspiracy mentality leads people to (a) reject well-supported scientific theories and (b) accept viral and deceptive claims (commonly referred to as fake news) about science, two ways in which publics disagree with scientists.
Scientists versus publics
Why are there such gaps between what scientists have shown and what lay publics believe? One of the original models attempting to answer this question, the public deficit model,Reference Bauer, Allum and Miller9 posits that science denialism is fueled by a lack of science knowledge. In other words, if people simply understood the science, then they would accept the science. This model, however, oversimplifies a complex problem: despite the modest gains in acceptance that occur with scientific literacy, the relationship is often conditional on individuals’ prior beliefs, attitudes, values, and worldviews (e.g., their “priors”; note that we are using the term “priors” colloquially—we do not intend to refer to Bayesian priors).10 While greater scientific knowledge can increase the likelihood of accepting scientific results for some, it can increase the likelihood of rejecting those results for others—the opposite of what the deficit model envisages. For example, compared to Republicans with less science knowledge, Republicans with greater science knowledge are even more likely to reject climate change.Reference Kahan, Peters, Wittlin, Slovic, Ouellette, Braman and Mandel11 In such cases, people likely are using their knowledge and reasoning abilities to be even better at conforming evidence to fit their existing schemas,Reference Kahan, Peters, Dawson and Slovic12 a process that is part of a cluster of phenomena commonly referred to as motivated reasoning.Reference Kunda13
Conspiracy theorizing: A method of motivated reasoning
Concocting and/or endorsing conspiracy theories—that is, conspiracy theorizing—can function as a method of motivated reasoning.Reference Douglas, Sutton and Cichocka14 Conspiracy theories are explanations for situations that involve powerful agents secretly working together to pursue hidden goals that are usually unlawful or malevolent.Reference Clarke15 Although believing conspiracies is often discussed in the literature as a pathological behavior,Reference Abalakina-Paap, Stephan, Craig and Gregory16, Reference Green and Douglas17 such conspiracy theorizing can be normal.Reference Aupers18, Reference Butler, Koopman and Zimbardo19, Reference Jensen20, Reference Hofstadter21 That is, anyone might believe a conspiracy theory under the right set of circumstances. For example, Radnitz and UnderwoodReference Radnitz and Underwood22 exposed participants in an experiment to a fictional vignette that was interpreted as a conspiracy theory conditional on participants’ political views. Although the vignette contained no explicit partisan content, participants could extrapolate political cues from whether the “villain” of the vignette was described as a government institution (conservative partisan cue) or a corporate one (liberal partisan cue). As expected, when the vignette implicated a corporation, liberals were more likely to perceive a conspiracy, and when the government was implicated, conservatives were more likely to perceive a conspiracy.Reference Radnitz and Underwood22
A signature feature of conspiracy theorizing is impugning experts, elites, or other authorities or powerful institutions with corrupt motives. This is also true of science-relevant conspiracies. Some skeptics of GMOs, for instance, dismiss proponents of agricultural biotechnology as “Monsanto shills,”Reference Haspel23 and some vaccine skeptics depict vaccine advocates as “poisoning children to benefit Big Pharma.”Reference Blaskiewicz24 Similarly, some describe climate change as a conspiracy among scientists to sustain grant fundingReference Lewandowsky, Oberauer and Gignac25 or as a left-wing conspiracy to harm the U.S. economy.Reference Sussman26, Reference Uscinski and Olivella27 Questioning authorities’ motivations is rooted in heuristic processing.Reference Petty and Cacioppo28 Without having adequate knowledge to evaluate the scientific claims, nonexpert publics instead tend to evaluate the competence and motivations of expert communicators.Reference Landrum, Eaves and Shafto29 By impugning these experts with self-serving or even malevolent motivations,Reference Kahan, Jenkins-Smith and Braman30, Reference Miller, Saunders and Farhart31 individuals may justify their rejection of otherwise credible scientific evidence and resolve any cognitive dissonance.
Conspiracy mentality: A political worldview
Although there is evidence that anyone can conspiracy theorize under the right circumstances (e.g., conditional conspiracy thinkingReference Uscinski and Olivella27), there still may be a unique worldview captured by broad endorsement of conspiracy theories, or conspiracy mentality.Reference Bruder, Haffke, Neave, Nouripanah and Imhoff32, Reference Darwin, Neave and Holmes33 Conspiracy mentality (which is also sometimes called conspiracy ideationReference Lewandowsky, Oberauer and Gignac25) has been described as a political worldview consisting of general feelings of distrust or paranoia toward government services and institutions, feelings of political powerlessness and cynicism, and a general defiance of authority.Reference Hofstadter21 Traditionally, this construct has been measured by asking individuals to evaluate a collection of unrelated conspiracy theories (e.g., alternative explanations surrounding the death of Princess Diana, the origins of diseases such as HIV, and the “truth” behind the September 11 attacks on the World Trade Center) and/or generic ones (e.g., “I think that many very important things happen in the world which the public is never informed about”) and measuring either total or average agreement that the claims are likely to be true.Reference Uscinski and Olivella27, Reference Darwin, Neave and Holmes33, Reference Bruder and Manstead34 Researchers have found that the best predictor of belief in one conspiracy often is belief in other conspiracies,Reference Wood, Douglas and Sutton35, Reference Swami, Coles, Stieger, Pietsching, Furnham, Rehim and Voracek36 lending support to the view that conspiracy mentality acts as a generalized political attitude or worldview.Reference Imhoff and Bruder37, Reference Koerth-Baker38 Conspiracy mentality has been associated with a series of traits and worldviews such as an active imagination,Reference Swami, Chamorro‐Premuzic and Furnham39 paranoia and schizotypal tendencies,Reference Bruder, Haffke, Neave, Nouripanah and Imhoff32, Reference Darwin, Neave and Holmes33 high levels of anomie, and low levels of self-esteem.Reference Abalakina-Paap, Stephan, Craig and Gregory16, Reference Goertzel40
As discussed earlier, conspiracy theories can be involved in disagreements between scientists and publics at two levels that are not mutually exclusive: (1) as a method of motivated reasoning—doubting the communicator’s credibility and suggesting a conspiracy justifies rejecting otherwise credible scientific evidence (i.e., conspiracy theorizing), and (2) as a monological belief system or political worldview in which subscribing individuals find all authorities and institutions, including scientific ones, inherently deceitful (i.e., conspiracy mentality). Our first aim focuses primarily on the latter, while our second aim focuses on both.
First, we aim to determine whether and, if so, to what extent conspiracy mentality predicts the rejection of well-supported scientific theories (i.e., anthropogenic climate change and human evolution) above and beyond the more well-studied priors of science literacy, political ideology, and religiosity. Second, we aim to examine to what extent conspiracy mentality and the aforementioned priors predict acceptance of inaccurate, deceptive, and, in some cases, conspiratorial science claims (i.e., viral deception about science). That is, who is most susceptible to accepting this type of viral deception about science?
To examine these questions, we analyze data from two samples collected as part of a broader study on alternative beliefs. The first sample consists of 513 individuals recruited by Research Now/SSI, an online digital data collection company, with the request that the sample match census data. Because endorsement of conspiracies is often low in nationally representative populations, we felt that it was also important to actively recruit individuals who would be more likely to have higher conspiracy mentality scores. To that end, we recruited 21 individuals in person at the first annual Flat Earth International Conference to take our survey. Although these individuals were recruited in person, they completed the survey in the same online format as the sample from Research Now.
Aim 1: Examining who rejects well-supported scientific theories
Regarding our first aim, a handful of other researchers have examined potential links between conspiracy theorizing and science denial, specifically in the domain of climate change. Lewandowsky and colleagues surveyed climate-blog visitorsReference Lewandowsky, Oberauer and Gignac25 and an online panelReference Lewandowsky, Gignac and Oberauer41 and found that conspiracy mentality predicted the rejection of climate change and other sciences. Yet their findings have been challenged by others who state that the conclusions are not supported by the dataReference Dixon and Jones42 (also see Lewandowsky and colleagues’ response to this challengeReference Lewandowsky, Gignac and Oberauer43). Similarly, Uscinski and OlivellaReference Uscinski and Olivella27 found that the relationship between conspiracy and climate change attitudes is much stronger than previously suggested and that it is contingent on people’s political party affiliation and, thus, non-monotonic. We seek to examine this question and offer the following hypotheses:
H1a: Conspiracy mentality will predict rejection of well-supported scientific theories.
H1b: Conspiracy mentality will predict rejection of well-supported scientific theories conditional on political party affiliation.
Aim 2: Examining who accepts viral deception about science
Regarding our second aim, we examine participants’ evaluations of viral deception (or fake news) relevant to science. Fake news has been defined as bogus or fabricated information that resembles news media content but does not adhere to journalistic norms and is promoted on social media.Reference Lazer, Baum, Benkler, Berinsky, Greenhill, Menczer and Metzger44 Kathleen Hall Jamieson, Elizabeth Ware Packard Professor of Communication at the University of Pennsylvania and director of the Annenberg Public Policy Center, argues that we should not use the term “fake news”; instead, we should use the term “viral deception,” with the acronym VD, to purposefully associate it with venereal disease. During an interview with Brian Stelter on CNN’s Reliable Sources, Jamieson explained this:
We don’t want to get venereal disease. If you find someone who’s got it, you want to quarantine them and cure them. You don’t want to transmit it. By virtue of saying “fake news,” we ask the question, well, what is real news—and you invite people to label everything they disapprove of “fake news.” As a result, it’s not a useful concept. What are we really concerned about? Deception. And deception of a certain sort that goes viral.45
Therefore, in this study, we use the term “viral deception” in place of “fake news” and refer to viral deception about science specifically. We define viral deception about science as bogus or fabricated science or science-relevant claims from sources known for spreading misinformation and propaganda on social media, such as NaturalNews.com, RT.com, and FoodBabe.com. The stories on these sites frequently explain away noncongenial scientific evidence by assigning corrupt motives to scientific authorities, research organizations, and regulatory agencies.
Not everyone will be susceptible to viral deception about science when exposed. The differential susceptibility to media effects model,Reference Valkenburg and Peter46 for example, proposes that individual difference variables can moderate or modify the direction or strength of media use effects. Because many of the claims made are conspiracy oriented—that is, they offer a narrative that helps individuals dismiss credible evidence by impugning the experts with malicious motives—we hypothesize the following:
H2: Conspiracy mentality will predict evaluating viral deception about science as likely to be true.
However, other individual differences might also contribute to susceptibility to viral deception about science. For instance, recent work looking at the susceptibility of individuals to fake political news found that lower cognitive reflectionReference Pennycook and Rand47 and religious fundamentalismReference Bronstein, Pennycook, Bear, Rand and Cannon48 predicted susceptibility, along with other reasoning-type measures (e.g., dogmatism). Therefore, we examine whether and, if so, to what extent science literacy (which includes the reasoning-type measures of cognitive reflection and numeracy), political affiliation, and religiosity predict evaluating deceptive science claims as likely to be true. We hypothesize the following:
H3a-b: Science literacy (a) and religiosity (b) will predict evaluating viral deception about science as likely to be true.
Moreover, given previous literature showing conditional effects of science knowledge and conspiracy theorizing by political party, we also include these interactions in our analysis.
Samples and Data Collection
Participants for the first sample were part of an online national consumer panel recruited by Research Now/SSI (referred to as the national sample) and were surveyed during the fall of 2017. To compensate panel participants, Research Now/SSI uses an incentive scale based on the length of the survey and the panelists’ profiles. Panel participants who are considered “time-poor/money-rich” are paid significantly higher incentives per completed survey than the average panelist so that participating is attractive enough to be perceived as worth the time investment. The incentive options allow panelists to redeem from a range of options such as gift cards, point programs, and partner products and services.
We requested 500 participants, sampled to be approximately nationally representative based on census numbers. Anticipating about a 50% completion rate, Research Now/SSI sampled over 1,000 participants on their online consumer panel, and we paid the company for the participants who qualified as “complete.” To qualify as complete, participants had to (a) be at least 18 years old, (b) reside in the United States, (c) correctly answer the attention check item (i.e., “if you are reading this, choose ‘likely false’”), (d) finish and submit the survey (participants could still submit the survey without having answered questions they preferred to skip), and (e) have taken at least 5 minutes to complete the approximately 20-minute survey.
Our sample of participants (N = 513) was 56% female and ranged from 18 to 80 years old (M = 48.98, Median = 50, SD = 14.97). About 5% of participants reported being black or African American, 5% Asian or Asian American, and 11.5% Hispanic/Latinx. The median level of education attained was an associate’s degree (coded to equal 14 years of school), with an average of 15.49 years of schooling (SD = 3.13).
The second sample consisted of 21 individuals who attended the Flat Earth International Conference in Raleigh, North Carolina, in November 2017 (referred to as the FE sample). About 60 conference attendees provided us with their email addresses, so that we could send them the link to the survey, but only 21 individuals began and submitted the survey. Participants who submitted the survey received a $5 Amazon gift card. Of these individuals, nine were male, seven were female, and five declined to provide their gender. Most of the participants were white, but one reported being black or African American, one reported being Hispanic/Latinx, and three declined to report their race/ethnicity. Regarding the highest level of education attained, two reported high school, four reported some college, one reported having a two-year degree (e.g., associate’s degree), seven reported having bachelor’s degrees, two reported having graduate degrees, and five declined to report their education. The average age of this group was 38.62 years (Median = 36.5, SD = 12.91), though five declined to provide information on age.
All participants were emailed a link to the survey, which was hosted on Qualtrics.com. Participants completed several measures, and those used in this study are described next. For more information on the survey, including a full list of questions asked, please see our project page on the Open Science Framework at https://osf.io/9x5gm/.
Well-supported scientific theories.
We asked participants whether climate change is real and human caused (response options: a. true, b. false because it is not human caused, c. false because it is not happening, or d. prefer not to answer) and whether humans evolved from earlier species of animal (response options: a. true, b. false, or c. prefer not to answer). Items were coded so that responses that align with scientific consensus (i.e., true) were scored as 0, or accepting the fact, and those against the consensus (i.e., false or prefer not to answer) were scored as 1, or as rejecting the fact. These items were embedded in the science literacy section of the survey. A greater proportion of the FE sample rejected anthropogenic climate change and human evolution than did the national sample. All 21 of the participants in the FE sample rejected anthropogenic climate change, whereas only 36% of the national sample did, χ2(1) = 34.26, p < .001. Moreover, all 21 of the FE sample rejected human evolution compared to only 37% of the national sample, χ2(1) = 33.73, p < .001.
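For readers who wish to check comparisons of this kind, the reported climate-change chi-square can be reproduced from counts implied by the text (all 21 FE participants rejected; 187 of 513 nationally, i.e., 513 minus the 326 accepters reported in the results). A minimal sketch; matching the reported value appears to require no continuity correction:

```python
# Reproducing the reported climate-change comparison from counts
# implied by the text: 21/21 FE rejections versus 187/513 national
# rejections (513 - 326 accepters). No continuity correction.
from scipy.stats import chi2_contingency

table = [
    [21, 0],     # FE sample: rejected, accepted
    [187, 326],  # national sample: rejected, accepted
]
chi2, p, dof, expected = chi2_contingency(table, correction=False)
print(f"chi2({dof}) = {chi2:.2f}, p = {p:.2g}")  # chi2(1) = 34.26, as reported
```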
Viral deception about science.
In addition to rejection of well-supported scientific theories, we also measured whether participants evaluated viral deception about science (i.e., inaccurate and misleading claims from social media about GMOs, a cure for cancer, the Zika virus, and vaccination) as likely to be true or false. For each of these items, participants were asked whether they thought the statement was definitely true (4), likely true (3), likely false (2), or definitely false (1). These items were embedded in the “beliefs” section of the survey, which also included the conspiracy theory items. Each of the statements used for this study comes from deceptive claims made by viral campaigns, typically from NaturalNews.com or other dubious websites. Two of these claims featured a conspiracy, and two made inaccurate causal claims.
VD claims of conspiracy. One item stated that “a cure for most types of cancer has already been found, but medical circles prefer to keep getting research funding from governments and keep their findings secret.” There are many myths surrounding cancer,Reference Childs49 and this one in particular combines the myth that there is a miracle cure for cancer out there and the myth that researchers, particularly those at pharmaceutical companies and government agencies, are suppressing it. The FE sample (M = 3.42, Median = likely true, SD = 0.61) more strongly endorsed this claim as true than the national sample (M = 2.09, Median = likely false, SD = 1), t(21.95) = 9.12, p < .001, Cohen’s d = 2.13, 95% CI [1.65, 2.61].
A second item stated that “agricultural biotechnology companies like Monsanto are trying to cover up the fact that genetically modified organisms (GMOs) cause cancer.” This item comes from the website thetruthaboutcancer.com, in which Jeffrey Smith, a self-described expert on genetically modified foods, charges Monsanto with covering up the “fact” that there are “two deadly poisonous ingredients found in GMOs based on proven research that causes [sic] cancerous tumors to form in rats.”Reference Bronstein, Pennycook, Bear, Rand and Cannon48 This article, which includes a video, has been shared over 31,700 times on social media. Similarly, Vani Hari, who is known as the “Food Babe,” accused Monsanto of conspiring with the Environmental Protection Agency to bury evidence that its weed killer causes cancer.Reference Childs49 That GMOs cause cancer is also “fake news”: a review by the National Academies of Sciences, Engineering, and Medicine found “no substantiated evidence of a difference in risks to human health between currently commercialized genetically engineered (GE) crops and conventionally bred crops,”50 and the Society of Toxicology51 reported that “data to date have identified no evidence of adverse health effects from commercially available GE crops or the foods obtained by them.” The FE sample (M = 3.40, Median = likely true, SD = 0.82) more strongly endorsed this headline than the national sample (M = 2.56, Median = likely true, SD = 0.87), t(22.18) = 6.01, p < .001, Cohen’s d = 1.37, 95% CI [0.92, 1.83].
VD claims about causation. In addition to the two viral and deceptive claims about science conspiracies, we examined two inaccurate causal claims. Our third item stated that “the Zika virus was caused by the genetically modified mosquito.” This claim comes from a 2016 article posted on NaturalNews.com,Reference Adams52 which can be traced back to an article posted on RT.com.53 This theory of how Zika came about is inaccurate: FactCheck.org debunked the claim one month after it first appeared.Reference Schipani54 The FE sample (M = 2.88, Median = likely true, SD = 0.72) more strongly endorsed this headline than the national sample (M = 2.09, Median = likely false, SD = 0.87), t(16.52) = 4.27, p < .001, Cohen’s d = 1.09, 95% CI [0.58, 1.59].
Lastly, we asked participants about the claim that “childhood vaccinations are unsafe and cause disorders like autism.” Despite being debunked overReference DeStefano55 and over,Reference Kalkbrenner, Schmidt and Penlesky56 many deceptive sites, including NaturalNews.com,Reference Adams57, Reference Lilley58 continue to propagate this misinformation. The FE sample (M = 2.88, Median = likely true, SD = 0.72) more strongly endorsed this claim than the national sample (M = 2.09, Median = likely false, SD = 0.87), t(19.33) = 9.02, p < .001, Cohen’s d = 2.11, 95% CI [1.63, 2.59].
Science literacy.
Science literacy was measured using a shortened version of the Ordinary Science Intelligence (OSI 2.0) scale.Reference Kahan59 Our shortened version of the OSI included six items that were chosen based on their difficulty and discriminatory power from a previous item response theory analysis with a nationally representative population. Items were scored so that correct answers received 1 point and incorrect answers (and no response) received 0 points. On average, participants answered 2.54 questions out of 6 correctly (FE sample: M = 2.24, Median = 3; national sample: M = 2.56, Median = 2). Consistent with prior research, the scale was evaluated and scored using item response theory (a 2PL model). Then, scores were centered so that the mean was 0 (SD = 0.75); the scores ranged from –1.26 to 1.35. There was no significant difference between the mean science literacy scores of the two samples, t(21.05) = 0.40, p = .696, Cohen’s d = 0.09, 95% CI [–0.35, 0.53]. Because the distributions of scores are not normal (see the supplementary materials), we also conducted the nonparametric Wilcoxon rank sum test, which similarly showed no significant difference between the two samples, W = 5423, p = .959.
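The 2PL model used for scoring takes the standard two-parameter logistic form (our notation), where θ is the latent science-literacy trait and a_i and b_i are the estimated discrimination and difficulty parameters of item i:

```latex
% Two-parameter logistic (2PL) IRT model: probability that a
% respondent with latent trait \theta answers item i correctly
P(X_i = 1 \mid \theta) = \frac{1}{1 + e^{-a_i(\theta - b_i)}}
```

Respondents’ centered trait estimates (here ranging from –1.26 to 1.35) are derived from this response model rather than from raw sum scores, which is why the two samples can have similar raw averages but distinct score distributions.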
Political party affiliation and religiosity.
As stated in the introduction, scientific literacy does not account for all of the variance in public acceptance (or rejection) of science: people’s values and worldviews are extremely influential in their acceptance or rejection of scientific findings. Therefore, we also asked about political party affiliation and religiosity.
To capture political party affiliation (i.e., party), we asked participants, “generally speaking, do you consider yourself a …” with the following response options: strong Democrat, Democrat, independent, Republican, strong Republican, other, and “I choose not to answer.” Because many of the flat Earth conference attendees, whom we interviewed in person for a separate study, vociferously rejected affiliating with any political party, we realized the importance of including unaffiliated (or no answer and refusal to answer) as a possible response option, particularly when sampling conspiracy-minded individuals who are suspicious of institutions like political parties. It is very common for research studies to use listwise deletion, analyzing only participants for whom they have complete data. However, we believe that this would lead to a loss of many of the participants with the strongest conspiracy mentality who refuse to answer the political party question. Therefore, we treated party as a categorical variable.
To reduce the number of comparison groups, we combined strong Democrat and Democrat into one response level, combined strong Republican and Republican into one response level, kept independent as one response level, and combined other (n = 30) and prefer not to answer (n = 44, including people who left the item blank) into one response level. The resulting variable was categorical with four levels: Democrat, independent, Republican, and unaffiliated/other. Among the national sample, 31% were coded as Democrat (n = 158), 33% were coded as independent (n = 172), 25% were coded as Republican (n = 130), and 11% were coded as unaffiliated/other (n = 66). Among the FE sample, 5% (n = 1) were coded as Democrat, 5% (n = 1) were coded as Republican, 14% (n = 3) were coded as independent, and 76% (n = 16) were coded as unaffiliated/other. A chi-square test of homogeneity showed that the distribution of party affiliation differed between the two samples, χ2(3) = 91.47, p < .001.
Religiosity was assessed by asking participants how much guidance faith or religion provides in their day-to-day lives (0 = not religious, 1 = none at all, 2 = a little, 3 = a moderate amount, 4 = a lot, 5 = a great deal). The median religiosity response for the FE sample was “a lot” (FE sample: M = 3.0, SD = 1.89), whereas the national sample’s median was “a moderate amount” (national sample: M = 2.61, SD = 1.69). An independent samples nonparametric test suggests that the two samples did not differ statistically in their religiosity (W = 4276, p = .362).
Conspiracy mentality.
Lastly, to measure conspiracy mentality, we used a modified version of the Conspiracy Theory Questionnaire.Reference Bruder and Manstead34 Our version of the scale consisted of seven conspiracy theories ranging from prototypical conspiracies (e.g., the Apollo program never landed on the moon) to more recent ones (e.g., Barack Obama was not born in the United States). Participants were asked to rate each item on a four-point scale (1 = definitely false, 2 = likely false, 3 = likely true, 4 = definitely true). The seven items were internally consistent (Cronbach’s alpha = 0.69, 95% CI [0.65, 0.73]). On average, the national sample rated items around “likely false” (M = 2.31, SD = 0.46) and the FE sample rated items around “likely true” (M = 3.36, SD = 0.30). We used a graded response model (using the ltm package in RReference Rizopoulos60) to calculate participants’ scores, and then we centered them. Scores ranged from –2.21 to 2.56 (M = 0, SD = 0.87), with higher numbers indicating stronger conspiracy mentality. As anticipated, the FE sample (M = 1.54, SD = 0.59) scored much higher on this measure of conspiracy mentality than the national sample (M = –0.06, SD = 0.79), t(23.06) = 11.99, p < .001, Cohen’s d = 2.67, 95% CI [2.20, 3.13].
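As a transparency check, the Welch t-test comparing the two samples’ conspiracy mentality scores can be approximately recovered from the summary statistics alone. A minimal sketch, assuming the rounded means, standard deviations, and sample sizes reported above:

```python
# Welch's t-test recomputed from the reported summary statistics
# (FE: M = 1.54, SD = 0.59, n = 21; national: M = -0.06,
# SD = 0.79, n = 513). Inputs are rounded, so the result is
# approximate.
import math

m1, s1, n1 = 1.54, 0.59, 21     # FE sample
m2, s2, n2 = -0.06, 0.79, 513   # national sample

se2_1, se2_2 = s1**2 / n1, s2**2 / n2
t = (m1 - m2) / math.sqrt(se2_1 + se2_2)
# Welch-Satterthwaite degrees of freedom
df = (se2_1 + se2_2) ** 2 / (
    se2_1**2 / (n1 - 1) + se2_2**2 / (n2 - 1)
)
print(f"t({df:.2f}) = {t:.2f}")  # close to the reported t(23.06) = 11.99
```

The small discrepancies from the reported values reflect rounding in the published means and standard deviations.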
To test our hypotheses, we merged the data from the two samples. Then we conducted general linear model (GLM) analyses (controlling for sample: FE versus national) and report significance based on type III tests. For more details about the analysis, please see the supplementary materials.
The data sets and coding used for this article are available as a component on our project page on the Open Science Framework at https://osf.io/4pa96/.
Rejection of well-supported scientific theories
Much of the literature on rejection of science has highlighted and found interactions between science literacy (or other types of reasoning abilities such as “actively open-minded thinking” or “need for cognition”) and worldviews such as political ideology for denial of climate change and religiosity for denial of human evolution. Thus, to examine the potential influence of conspiracy mentality on rejection of scientific facts, we incorporated the following as predictors for the base model: science literacy, party (referent = Democrat), religiosity, and two interactions, one between science literacy and party and one between science literacy and religiosity. Then, we ran the model a second time, adding conspiracy mentality and an interaction between conspiracy mentality and party (to test the conditional effect found by Uscinski and OlivellaReference Uscinski and Olivella27). We report the deviance between the two models and the effects from the second model (see also the supplementary materials for more detailed results).
Rejection of human-caused climate change.
As stated earlier, none of the 21 participants in the FE sample believed that climate change is real and human caused—100% rejected it. In contrast, 64% of the online panel (n = 326 of 513) accepted anthropogenic climate change, whereas only 36% rejected it. Given that the FE sample had a stronger conspiracy mentality, these frequencies appear to support findings from prior literature that conspiracy mentality is positively related to climate change denial.Reference Lewandowsky, Oberauer and Gignac25, Reference Lewandowsky, Gignac and Oberauer41 However, when the two samples were combined, conspiracy mentality did not significantly predict climate change rejection (b = 0.13, χ2 = 0.16, p = .686), and adding conspiracy mentality to the model only marginally improved its fit (deviance = 9.39, p = .052).
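The model-fit comparison above is a likelihood-ratio (deviance) test. Assuming the added block comprises four parameters (the conspiracy mentality main effect plus its three party-interaction dummies; the degrees of freedom are not stated explicitly in the text), the reported p-value follows directly from the chi-square distribution:

```python
# Likelihood-ratio (deviance) test for adding conspiracy mentality
# and its party interaction to the climate-change model. We assume
# 4 added parameters (1 main effect + 3 interaction dummies); this
# is an inference, not a figure stated in the text.
from scipy.stats import chi2

added_params = 4                    # assumed degrees of freedom
p = chi2.sf(9.39, df=added_params)  # upper-tail probability of the deviance
print(f"p = {p:.3f}")               # prints p = 0.052, as reported
```

Under the same assumption, the deviance of 13.27 reported for the evolution model corresponds to p ≈ .010, also matching the text.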
Although conspiracy mentality did not predict rejection of climate change when controlling for the other variables in the model, sample (FE versus national) did, b = –19.18, χ2 = 35.28, p < .001. We should note, however, that although there was a significant difference between Democrats and the unaffiliated/other in how conspiracy mentality relates to the rejection of climate change (b = –1.56, p = .018), simple effects tests with Bonferroni correction (adjusting the significance threshold, or cutoff p value, to 0.013 for the four comparisons) showed no significant relationship between conspiracy mentality and rejecting climate change for Democrats or for unaffiliated/other (see Figure 1).
The GLM analysis found the expected robust interaction between science literacy and party, consistent with prior research.Reference Kahan, Peters, Wittlin, Slovic, Ouellette, Braman and Mandel11 For Democrats, the probability of rejecting anthropogenic climate change decreased with increasing science literacy, whereas the opposite was true for Republicans (b = 1.25, p = .008). The odds of rejecting climate change for a Republican who scored lower on science literacy (when science literacy = –1) were 249% greater than for a Democrat with the same science literacy score. Polarization between the two political parties on climate change was even larger, however, among partisans with higher science literacy: the odds of rejecting climate change for a Republican who scored higher on science literacy (when science literacy = +1) were 4105% greater than for a Democrat with the same science literacy score (see Figure 2).
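The “percent greater odds” figures reported here are conversions of log-odds differences from the logistic GLM via (e^b − 1) × 100. A minimal sketch; the input value of 1.25 is illustrative (it is the interaction coefficient reported in the text, not necessarily the exact party contrast underlying the 249% figure):

```python
import math

def pct_greater_odds(log_odds_gap):
    """Convert a difference in log-odds (a contrast between logistic
    regression coefficients) into 'percent greater odds'."""
    return (math.exp(log_odds_gap) - 1) * 100

# Illustrative only: a party gap of 1.25 on the log-odds scale
# corresponds to odds about 249% greater.
print(round(pct_greater_odds(1.25)))  # 249
```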
Rejection of human evolution.
As with climate change, the effect of conspiracy mentality on the rejection of evolution did not reach statistical significance (b = –0.09, χ2 = 0.10, p = .076); however, this relationship was conditional on political party (χ2 = 10.33, p = .016), and the variable’s addition to the model significantly improved the model fit (deviance = 13.27, p = .010). Simple effects tests with Bonferroni correction (adjusting the cutoff p value to .013) suggest that the effect of conspiracy mentality on rejecting evolution was marginal for Republicans (b = 0.57, p = .018) and significant for the unaffiliated/other (b = 1.66, p < .001), but it was not significant for Democrats (b = 0.41, p = .067) or independents (b = 0.19, p = .350). Specifically, among those in our sample with lower conspiracy mentality (when conspiracy mentality = –1), the odds of Republicans rejecting evolution were 33% greater than those of Democrats. In contrast, among those in our sample with higher conspiracy mentality (when conspiracy mentality = +1), the odds of Republicans rejecting evolution were 856% greater than those of Democrats (see Table 1 and Figure 3).
Notes: Party is treated as a categorical variable with Democrat as the referent. Response level significance for this variable is reported based on summary output from the GLM, whereas variable level significance is reported based on type III tests. Asterisks mark statistical significance. Coefficients (b) are not standardized.
*** p < .001; ** p < .01; * p < .05; t p < .10.
Susceptibility to viral deception about science
Our second aim focused on who is susceptible to viral deception about science. We hypothesized that conspiracy mentality would predict the extent to which people endorsed the deceptive claims as true and that individuals’ priors (science literacy, political party affiliation, and religiosity) would predict endorsement of these claims after accounting for conspiracy mentality. The claims were scored like the conspiracy theories (1 = definitely false to 4 = definitely true). The results of the regression analyses are reported in Table 2. For a more detailed table (with exact p values and sums of squares), see the supplementary materials. In addition, after reporting the results of the GLM analyses, we report the results of a test of relative importance using a method devised by Lindeman, Merenda, and Gold,Reference Lindeman, Merenda and Gold61 which averages the sequential (type I) sums of squares across all orderings of the regressors with the relaimpo packageReference Grömping62 in R. This analysis allows us to determine which factors, or their interactions, explain the largest proportions of response variance or are the most “important” relative to the other predictors in the model.
Notes: Party is treated as a categorical variable with Democrat as the referent. Response level significance for this variable is reported based on summary output from the GLM, whereas variable level significance is reported based on type III tests. Coefficients (b) are not standardized.
*** p < .001; ** p < .01; * p < .05; t p < .10.
Claim that the cure for cancer is being suppressed.
A common deceptive claim is that a cure for most types of cancer has already been found but that medical circles prefer to keep receiving research funding from governments and keep their findings secret. Greater conspiracy mentality and lower science literacy predicted endorsing this claim as more likely to be true. There was also a marginal interaction effect of science literacy and political party: the relationship between science literacy and evaluating the claim as true was conditional on political party, with Democrats being marginally different from Republicans and significantly different from the unaffiliated/other. Follow-up simple effects tests show that for Democrats and independents, greater science literacy predicted evaluating the claim as more likely to be false (Democrats: b = –0.66, p < .001; independents: b = –0.50, p < .001). In contrast, science literacy did not significantly predict endorsement of the claim for the unaffiliated/other (b = 0.04, p = .809), and for Republicans, the negative relationship was marginal (b = –0.25, p = .062).
Claim that GMOs cause cancer and corporations are covering it up.
Another common deceptive claim propagated by untrustworthy websites is that GMOs cause cancer and agricultural biotechnology corporations, such as Monsanto, are covering it up. For this item, conspiracy mentality indeed predicted evaluating the claim as more likely to be true. Moreover, there was a significant interaction of conspiracy mentality and science literacy. Among those with lower conspiracy mentality, higher science literacy predicted evaluating the claim as more likely to be false. In contrast, among those with higher conspiracy mentality, higher science literacy predicted evaluating the claim as more likely to be true (see Figure 4).
Claim that the Zika virus was caused by a genetically modified mosquito.
Another deceptive claim contends that the genetically modified mosquito, which was developed at least in part to help curb the spread of diseases like Zika (see https://www.oxitec.com/friendly-mosquitoes/), is actually the underlying cause of the Zika virus. As expected, higher conspiracy mentality and lower science literacy strongly predicted believing the claim that the Zika virus is caused by the genetically modified mosquito. No other effects or interactions were significant (see Table 2).
Claim that childhood vaccines are unsafe and cause disorders like autism.
One of the most common assertions about vaccinations among deceptive websites is that childhood vaccinations are unsafe and cause disorders such as autism. As with the previous items, greater conspiracy mentality and lower science literacy significantly predicted evaluating this claim as more likely to be true. In addition, people who reported stronger religiosity also evaluated this claim as more likely to be true, which is consistent with prior work.63 This was also the only claim for which sample remained a significant predictor after accounting for other effects (see Table 2).
Relative importance of the factors.
Using the results of the GLM analyses for each of the claims, we conducted a test of relative importance of the variables. As we stated earlier, we used the relaimpo package in R and report the results of the lmg analyses, which average the sequential sums of squares over all orderings of the regressors. This analysis produces a value representing the proportion of response variance for which each of the factors accounts. Graphing these results, we can see the relative importance of each of the factors included in the model. This analysis and Figure 5 more clearly illustrate our finding from the GLM analyses: that conspiracy mentality and science literacy were the most important predictors (relative to the others included in the model). For a table with the exact values from the lmg analysis, see the supplementary materials.
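For readers who want to see the mechanics, here is a minimal Python sketch of the lmg idea (our own illustration, not the relaimpo implementation): each regressor's share is its increase in R² when it enters the model, averaged over every possible entry order, so the shares sum to the full model's R². The data below are synthetic, for demonstration only.

```python
import itertools
import numpy as np

def r_squared(X, y):
    """R^2 of an OLS fit of y on the columns of X (plus an intercept)."""
    Xd = np.column_stack([np.ones(len(y)), X])
    beta, *_ = np.linalg.lstsq(Xd, y, rcond=None)
    resid = y - Xd @ beta
    return 1 - np.sum(resid ** 2) / np.sum((y - y.mean()) ** 2)

def lmg(X, y):
    """LMG relative importance: average each regressor's sequential
    R^2 contribution over all orderings of the regressors."""
    p = X.shape[1]
    shares = np.zeros(p)
    orders = list(itertools.permutations(range(p)))
    for order in orders:
        prev_r2, included = 0.0, []
        for j in order:
            included.append(j)
            r2 = r_squared(X[:, included], y)
            shares[j] += r2 - prev_r2  # marginal contribution of j at this position
            prev_r2 = r2
    return shares / len(orders)

# Synthetic demo: y depends strongly on column 0, weakly on column 1,
# and not at all on column 2 (all values are made up for illustration).
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 3))
y = 2.0 * X[:, 0] + 1.0 * X[:, 1] + rng.normal(size=200)
shares = lmg(X, y)
print(np.isclose(shares.sum(), r_squared(X, y)))  # True: shares partition R^2
```

Because each sequential increment in R² is nonnegative, every share is nonnegative, which is what makes the decomposition readable as "proportion of response variance accounted for."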
The role of conspiracy mentality
This study primarily set out to examine the potential role of conspiracy mentality in predicting two phenomena: the rejection of well-supported scientific theories and the acceptance of viral deception about science.
We found mixed evidence that conspiracy mentality predicts rejection of science. Although conspiracy mentality was influential in the rejection of evolution, contingent on political party affiliation (i.e., the relationship was positive and marginal for Republicans and significant for the unaffiliated/other category), it did not meaningfully predict rejection of climate change. Two things are important to note here, however. First, although we found a significant interaction indicating that the degree (and/or direction) of the relationship between conspiracy mentality and rejection of climate change differs by political party affiliation (i.e., it is conditional on political party, somewhat consistent with work by Uscinski and OlivellaReference Uscinski and Olivella27), simple effects tests suggest that the relationship between conspiracy mentality and rejection of climate change is not significant for any of the party affiliations (see Figure 1).
Second, none of the Flat Earth International Conference attendees (who scored significantly higher on conspiracy mentality than the national sample) endorsed human-caused climate change as true. Thus, it is inaccurate to say that our findings are completely inconsistent with prior work that has shown relationships between conspiracy mentality and rejection of climate change. Instead, we question the robustness of the findings from prior work; certain changes to the way in which conspiracy mentality or climate change beliefs are measured may alter the strength, and even the existence, of the relationship.
Related to this point, one strength of our study is that we included a subsample of individuals with high conspiracy mentality, or so-called conspiracy theorists. Although we recognize that our subsample is not representative of all conspiracy theorists, especially as these participants are subscribers to flat Earth ideology and not all conspiracy theorists are flat Earthers, we did find that all the flat Earthers surveyed rejected the existence of anthropogenic climate change (and human evolution). It is possible, for instance, that the relationship between conspiracy mentality and climate change rejection (when measured as a continuous variable) is not linear. Future research should continue to test this hypothesis using samples of individuals with strong conspiracy mentalities (i.e., among populations of conspiracy theorists) and test whether a relationship between conspiracy mentality and rejection of climate change is a continuous relationship or one that, for the most part, appears only after crossing a certain threshold.
There are other reasons, too, why our results may differ from previous studies examining the relationship between conspiracy mentality and science denial. For one, our one-item measure of climate change acceptance is coarse and does not allow for much variance in views about climate change. However, it can be argued that a particularly robust effect of conspiracy mentality on the denial of climate change ought to be present even when simply asking participants whether they believe that climate change is occurring. For example, the robust interaction of knowledge and political ideology persists across different measurements (or operationalizations) of climate change views, science knowledge, and political party and ideology. To be clear, we do not doubt that conspiracy mentality has some effect on climate change denial; we simply question the strength and persistence of such an effect.
Believing viral deception.
Our second aim examined susceptibility to believing viral deception about science. Our hypothesis that conspiracy mentality would predict endorsement of these claims was supported, and conspiracy mentality was the most important predictor of susceptibility in our model (see Figure 5). However, we were also interested in whether individuals’ prior values and beliefs predicted acceptance of the deceptive claims even after accounting for conspiracy mentality. Indeed, even though the number of individuals with pathological levels of conspiracy mentality is arguably small, viral fake news campaigns are dangerous because even people who are not generally conspiracy oriented may be predisposed to accept conspiracies that support their worldviews.
What makes these viral deceptive claims different from typical conspiracy theories is the number of people who believe them. On average, very few people endorse most conspiracy theories (with notable exceptions, like the conspiracy theories surrounding the assassination of President John F. KennedyReference Koerth-Baker38). Many of our participants, however, believed the deceptive claims about GMOs and Zika. About 56% of our national sample said it is likely or definitely true that Monsanto is covering up the fact that GMOs cause cancer, and 32% of our national sample said that it is likely or definitely true that the Zika virus is caused by the genetically modified mosquito. Future research should measure whether believing viral deception leads to later rejection of science communication about those topics and of related policy efforts, such as blocking the release of a new Food and Drug Administration–approved genetically modified food product or protesting the release of transgenic mosquitoes in areas at high risk of Zika, dengue, or malaria.
The role of science literacy
Aside from conspiracy mentality, only one other individual factor was consistently relevant to predicting rejection of well-supported scientific theories and acceptance of viral deception about science: science literacy. First, and expectedly, we found more evidence for the robust interaction effect between science literacy (measured here using a shortened version of Kahan’s OSI scale) and political ideology on the rejection/acceptance of anthropogenic climate change. In contrast to other work that has treated political ideology as a continuous variable, we treated political party affiliation as a categorical one so as not to lose participants who choose not to affiliate. As expected, the relationship of science literacy with acceptance of the scientific consensus on climate change and evolution was conditional on political party affiliation. That is, Democrats and Republicans polarized along science literacy: with increasing science literacy, Democrats were more likely (and Republicans less likely) to accept that human-caused climate change is a real phenomenon. Interestingly, people who refused to answer the political affiliation item (or said that they do not affiliate with the listed political parties) showed a similar pattern to Republicans, and those who reported being independent showed a similar pattern to Democrats (see Figure 2).
Moreover, when it came to predicting evaluations of the deceptive claims as likely to be true, science literacy was the only factor in our model besides conspiracy mentality that appeared to meaningfully predict responses to each of the four deceptive claims. Unlike when predicting rejection of science, however, we did not consistently find conditional effects of science literacy on acceptance of the deceptive claims. While there was a marginal interaction effect of science literacy and political party affiliation on evaluating the cancer cure suppression item, simple effects tests showed that greater science literacy predicted evaluating the claim as more likely to be false among Democrats, independents, and Republicans (though the effect for Republicans was marginal). There was no relationship between science literacy and evaluation of the claim for the unaffiliated/other. For the “GMOs cause cancer” item, there was an interaction effect of science literacy and conspiracy mentality: among those with lower conspiracy mentality, greater science literacy predicted evaluating the claim as more likely to be false, whereas among those with higher conspiracy mentality, greater science literacy predicted evaluating the claim as more likely to be true. The effect of science literacy on evaluation of the other two deceptive claims (about the Zika virus and about childhood vaccines) was not conditional on any other value or identity factor that we measured. Thus, implementing interventions to increase science literacy may help prevent the proliferation of viral deception about science, at least among the majority of the population.
This study, like many others, has limitations that ought to be taken into consideration when interpreting its findings. First, although the sample is much more diverse than typical convenience samples (e.g., undergraduate students) or samples using Amazon’s Mechanical Turk, it is not nationally representative or probabilistic. Also, because this was a secondary analysis of a larger study aiming to examine the relationship between conspiracy mentality and science curiosity, the survey included only a few questions about science-related beliefs and acceptance/rejection of scientific facts, and the analysis was exploratory. These points aside, we were still able to examine some of the issues that are most prevalent in today’s media environment (climate change and evolution), and we were able to examine fake news headlines that have “gone viral” on social media. Future studies should aim to replicate our findings with different samples and should consider asking participants about a broader array of scientific beliefs, including both controversial and noncontroversial issues.
The proliferation of deceptive claims on social media has done much to normalize conspiracy theories and, to some extent, conspiratorial worldviews. We can try to dismiss conspiracy theorizing as something undertaken only by a foil-hat-wearing fringe; however, when our friends and neighbors (and sometimes we ourselves) begin to believe and share conspiracies on social media, we must acknowledge that conspiracy theorizing is much more widespread. And when it becomes commonplace to project conspiratorial motives onto scientific institutions (and not just corporate or governmental ones) merely because their information disagrees with our worldviews, we are in danger of entering a space where knowledge becomes almost completely relative, where we cannot engage in rational discussion with those with whom we disagree, and where we completely break down the division of cognitive labor on which our society relies. Although we should not be gullible—after all, there are real conspiracies—we must learn how to balance skepticism with trust.
We would like to thank the organizers of the International Flat Earth Conference for allowing us to interview conference attendees and request email addresses from attendees to participate in our online survey. In addition, we are grateful to Tim Linksvayer, Deena Weisberg, Michael Weisberg, Stephan Lewandowsky, Cam Stone, and Rosalynn Vasquez for providing feedback on earlier versions of the manuscript. Finally, we would like to thank the Science Communication and Cognition Lab team members for their help and support.
To view supplementary material for this article, please visit http://dx.doi.org/10.1017/pls.2019.9.