
Information Disorder and Global Politics

Published online by Cambridge University Press: 20 November 2025

Julia C. Morse*
Affiliation:
Department of Political Science, University of California, Santa Barbara, USA
Tyler Pratt
Affiliation:
Department of Political Science, University of North Carolina, Chapel Hill, USA
*Corresponding author: Julia C. Morse; Email: jcmorse@ucsb.edu

Abstract

Information is a key variable in International Relations, underpinning theories of foreign policy, inter-state cooperation, and civil and international conflict. Yet IR scholars have only begun to grapple with the consequences of recent shifts in the global information environment. We argue that information disorder—a media environment with low barriers to content creation, rapid spread of false or misleading material, and algorithmic amplification of sensational and fragmented narratives—will reshape the practice and study of International Relations. We identify three major implications of information disorder for international politics. First, information disorder distorts how citizens access and evaluate political information, creating effects that are particularly destabilizing for democracies. Second, it damages international cooperation by eroding shared focal points and increasing incentives for noncompliance. Finally, information disorder shifts patterns of conflict by intensifying societal cleavages, enabling foreign influence, and eroding democratic advantages in crisis bargaining. We conclude by outlining an agenda for future research.

Information

Type
Short Essay — Future IR
Creative Commons
Creative Commons License - CC BY
This is an Open Access article, distributed under the terms of the Creative Commons Attribution licence (https://creativecommons.org/licenses/by/4.0/), which permits unrestricted re-use, distribution, and reproduction in any medium, provided the original work is properly cited.
Copyright
© The Author(s), 2025. Published by Cambridge University Press on behalf of The IO Foundation

The global information environment is undergoing a profound transformation. Nontraditional news outlets, citizen reporting, and user-generated social media content increasingly compete with, and sometimes eclipse, news from legacy media. In theory, this abundance of voices could expand citizens’ access to reliable information. In practice, however, social media algorithms and platforms privilege sensationalist stories that maximize attention capture. Content that elicits anger or emotional outrage is amplified, creating openings for misinformation to spread quickly online.Footnote 1 Even traditional news outlets sometimes rely on clickbait headlines to generate interest, as many individuals consume and share stories without ever reading the underlying content.Footnote 2

This new information economy has ushered in an era of “information disorder”Footnote 3 that directly affects how citizens experience politics. Misinformation flourishes as social media amplifies false or inaccurate claims from citizens and political elites. Disinformation circulates freely, with strategic actors sharing “falsehoods spread as news stories … to advance political goals.”Footnote 4 And malinformation, true information presented in a way that misleads, becomes a pervasive feature of politics. Artificial intelligence (AI) may intensify such dynamics, making it easier to manipulate opinions through deepfakes, reinforcing information silos, and targeting inflammatory content directly to individuals’ newsfeeds.

The emergence of populist leaders like Donald Trump both reflects and accelerates these trends. Such leaders are particularly astute at using sensationalist claims to dominate the information environment. For example, Trump administration officials have exaggerated trade imbalances and lied about tariff revenue to justify tariffs on foreign countries;Footnote 5 administration officials have also parroted Russian disinformation,Footnote 6 calling Ukrainian President Zelenskyy a “dictator”Footnote 7 and refusing to acknowledge that Russia started the war.Footnote 8

US officials have also taken actions that reduce the normative and regulatory constraints on information disorder. President Trump and his allies describe fact checking as censorship of right-wing speech, prompting companies like Meta, Google, and LinkedIn to shift away from such efforts.Footnote 9 The administration has also rolled back AI safety measures and opposed efforts to regulate the tech industry.Footnote 10 Without meaningful safeguards, competition between big tech companies is likely to accelerate informational challenges.

The rise of information disorder presents a challenge for International Relations (IR) scholarship, which has long emphasized information as a key driver of political outcomes. Canonical models of inter-state conflict and cooperation emphasize information asymmetries, signaling, and the provision of credible information.Footnote 11 Theories of democratic advantages in foreign policy require citizens to observe and process information in order to hold their governments accountable.Footnote 12 But in an environment rife with digital falsehoods and message silos, these informational foundations are eroded.

This essay argues that information disorder is likely to reshape a variety of important processes in international politics. After describing the emergence of this new environment and situating it in the literature, we examine three consequences for international relations. First, by disrupting how citizens access and evaluate information, information disorder may disproportionately destabilize democratic states. Second, information disorder may undermine international cooperation by degrading shared focal points, weakening institutional legitimacy, and lowering the reputational costs of noncompliance. Finally, information disorder may increase conflict, as it intensifies social cleavages, facilitates foreign influence, and erodes democratic advantages in crisis bargaining. We conclude by outlining an agenda for future scholarship.

How Production and Consumption Dynamics Create Information Disorder

Here, we use information disorder to refer to a media environment in which barriers to producing and circulating content are minimal, false or misleading material diffuses rapidly through digital platforms, and attention-economy dynamics amplify sensational, emotionally charged, and fragmented narratives at the expense of shared factual baselines. This environment began to emerge in the 2010s as social media platforms displaced traditional information gatekeepers. It has intensified in the 2020s as algorithmic curation and generative AI accelerate the creation and spread of false or polarizing content.

While the speed and reliability of information flows have been a concern for decades, two recent structural shifts distinguish information disorder from prior eras. First, the production of media content has become radically decentralized. Smartphones, digital platforms, and AI allow a large and varied set of actors to create and disseminate information. As a result, the volume of global information flows has reached unprecedented levels.Footnote 13 Scholars have documented rising levels of misinformation and disinformation, especially on social media platforms where users create and share content via peer-to-peer networks.Footnote 14 In countries where traditional media outlets were unreliable or captured by the state, the decentralization of news may expand access to political information and strengthen democratic accountability.Footnote 15 In established democracies, however, a proliferation of media sources may expose citizens to more unreliable content than in prior eras.

A second structural shift concerns changes in media consumption patterns. Individuals increasingly use digital platforms and social media to retrieve information.Footnote 16 Rather than actively seeking information through newspapers or scheduled broadcasts, they encounter news passively through algorithmic feeds, notifications, and peer sharing.Footnote 17 Incidental exposure fragments news diets, as individuals skim headlines and snippets rather than reading full articles.Footnote 18 These habits, in turn, feed “attention-economy” dynamics that determine which stories are disseminated and seen.Footnote 19 Engagement-driven algorithms prioritize content that is novel, emotionally arousing, and polarizing.Footnote 20 In part because they elicit surprise and outrage, false or sensational claims often diffuse more widely than verified reports. At the same time, personalized algorithms curate individualized feeds, reinforcing pre-existing preferences and limiting cross-cutting exposure.Footnote 21

These changes in production and consumption create a fundamentally distinct information environment from previous eras. Individuals are bombarded by a much higher volume and variety of media content, increasing uncertainty and disengagement.Footnote 22 Rather than tuning into an evening newscast, citizens passively consume information curated by algorithms, which amplify misinformation, reward provocative content, and encourage polarized news consumption.

This dynamic has substantial implications for domestic and international politics. Citizens in countries subject to information disorder learn, perceive, and evaluate political information in new ways. While the implications of information disorder for political elites are less clear,Footnote 23 at a minimum, some leaders have learned to leverage these dynamics for political gain domestically and abroad.Footnote 24

Political elites, national governments, and IOs generally agree that these structural transformations represent a qualitatively new and potentially dangerous development. The United Nations Secretary-General has warned that threats to information integrity are “proliferating and expanding with unprecedented speed on digital platforms, supercharged by artificial intelligence technologies.”Footnote 25 Other prominent IOs, including the World Health Organization (WHO) and North Atlantic Treaty Organization (NATO), have similarly raised the alarm.Footnote 26 Disinformation is now so common that in January 2025, the World Economic Forum labeled it the biggest short-term global risk.Footnote 27 But while the prevalence of information disorder is clearly documented, scholars have only just begun to probe how it will affect key aspects of international politics.

Research on Information and International Politics

International Relations scholars have long pointed to information as a key driver of inter-state conflict and cooperation. Structural realists argued that uncertainty about others’ intentions and capabilities is a fundamental constraint on state behavior.Footnote 28 According to this view, the information-poor international environment limits cooperation and encourages competition among states, which can never fully trust one another.

Subsequent research reconceptualized information not as a fixed condition, but as a key variable. Scholars demonstrated how states can reduce uncertainty by drawing inferences from past behavior,Footnote 29 constructing international laws and institutions,Footnote 30 and leveraging costs to credibly signal their intentions to others.Footnote 31 In addition to these institutional mechanisms, scholars also highlighted how governments tie their hands to overcome information asymmetries.Footnote 32 By endogenously generating information, states can reduce conflict and sustain cooperation.

The emergence of the Internet and digital communication technologies in the 1990s transformed international politics from an information-poor to an information-abundant environment. Research identified how this shift empowers actors and institutions that can cut through the noise. Transnational activist networks, for example, mobilize pressure on target states by quickly collecting, distilling, and leveraging information.Footnote 33 States and IOs leverage performance indicators, rankings, and scorecards to focus international attention and drive policy change.Footnote 34 This scholarship highlighted the importance of messages that capture the attention of key audiences.

The contemporary environment of information disorder reflects a new phase in this evolution, demanding renewed scholarly engagement. International politics today is characterized not merely by abundance, but by the proliferation of information sources, a dramatic rise in persuasive misinformation and disinformation, and a shift to passive and fragmented news consumption patterns. IR scholars have only begun to explore the consequences of this transformed landscape. Recent empirical work documents how state actors wield information and secrecy as a foreign policy toolFootnote 35 and suggests governments can evade accountability by discrediting undesirable facts as misinformation.Footnote 36 Other work illuminates how misinformation shapes public attitudes and trustFootnote 37 and documents variation in the ability of fact-checking interventions to rebut these effects.Footnote 38 Comparatively less attention has been paid to how these dynamics might shape central questions, including the stability and vulnerability of different regime types, multilateral cooperation, and inter-state conflict. We develop our argument about these issues in the next three sections.

Regime Type and Asymmetric Vulnerability to Information Disorder

We argue that information disorder is likely to produce asymmetric effects both across and within countries. Democratic governance depends on the public learning about a leader’s policy choices, assessing consequences, and rewarding or punishing accordingly. This accountability system is central to both domestic and international politics; IR theory often points to a “democratic advantage” whereby democratic regimes enjoy more credibility, consistency, or restraint in foreign affairs precisely because they face domestic oversight.Footnote 39

When citizens are inundated with misinformation, fragmented narratives, and competing realities, the informational foundation of these mechanisms begins to erode. To sanction a leader for misbehavior, citizens must be exposed to new, potentially counterattitudinal information. Yet if individuals rely on social media as their primary news source, they are far more likely to view content that reinforces existing beliefs instead.Footnote 40 Exposure to misinformation may also distort how citizens understand government policies, performance, and levels of popular support,Footnote 41 although the size of this impact continues to be a subject of debate.Footnote 42 Rising uncertainty, confusion, and misperception can weaken the political constraints that citizens impose on democratic leaders and create opportunities for elites to manipulate information.

Within democracies, populist politicians are particularly well suited to leverage information disorder for political gain.Footnote 43 Populists are often inherently skeptical of traditional media outlets and well versed in exploiting alternative messaging, and social media makes it easy to spread antiestablishment narratives.Footnote 44 Such dynamics may increase the political prospects for populist leaders, compared to traditional candidates.

While information disorder poses challenges for all democracies, its effects will depend on a country’s pre-existing media environment. If the previous media infrastructure was heavily siloed or censored, a flood of new media sources could enhance accountability. Access to social media platforms could also help citizens coordinate behavior and communicate with elected officials.Footnote 45 For democracies with a more transparent and mature pre-existing media environment, however, information disorder may destabilize traditional accountability mechanisms.

Autocratic countries are comparatively resilient to such dynamics. Countries like Russia and China already control the flow of information to their populations, filtering out negative content and reinforcing government narratives.Footnote 46 While the Internet and social media may increase exposure to alternative sources, autocratic governments may also be particularly adept at deploying such tools to cement control. The Chinese government, for example, uses regime-directed social media posts to distract the public from political issues,Footnote 47 and has directed Chinese AI developers to build systems that espouse government lines on issues like the Tiananmen Square Massacre and Taiwan.Footnote 48

Authoritarian states may also have offensive advantages in weaponizing information disorder against other countries. Domestic content tools can easily be turned transnational, used to intervene in other states and counter negative information abroad. China has used disinformation throughout the Asia-Pacific region, working to discredit Taiwanese politicians who support independence and to drive a wedge in US–Philippine relations.Footnote 49 Russia has targeted former Soviet states and Balkan countries for disinformation campaigns,Footnote 50 as well as the United States. These types of foreign information campaigns are easier to launch against democracies, where transparency and freedom of speech make it more challenging to protect against such content.

Institutionalized Cooperation Amid Information Disorder

A second major implication of information disorder pertains to international cooperation between states. We focus on three potential pathways that may affect cooperation: diminished focal points, increased incentives for noncompliance, and undermined trust and legitimacy.

Information disorder may weaken the ability of member states to negotiate international rules and standards that provide focal points for state behavior. Many international institutions are designed to coordinate expectations and reduce information asymmetries.Footnote 51 International rules provide legal and normative guideposts that influence not only government policies but also how citizens understand what constitutes acceptable government behavior.Footnote 52 But establishing these rules requires reaching a broad consensus about the nature of a specific problem and the requisite policy solution, and information disorder has the potential to undermine this process. It increases the viability of populist candidates, who may withhold or misreport scientific information and make elite-level cooperation more difficult.Footnote 53 Information disorder may also fragment how citizens in different countries understand shared challenges and threats, making it more difficult for governments to pursue cooperative solutions on salient issues.Footnote 54

Consider the case of climate change, a cooperation problem where misinformation and misunderstandings have long been obstacles to collective action. While the Intergovernmental Panel on Climate Change (IPCC) has worked to build consensus on the science of climate change, its 2022 report notes that rising flows of misinformation are undermining the transmission of climate science to general populations.Footnote 55 For many years, fossil fuel companies and a handful of anti-climate scientists used misinformation to sow doubt, engender ideological polarization, and impede policy action;Footnote 56 however, with the advent of social media, they can now reach considerably more people. In 2016, for example, one of the most popular climate stories on social media falsely alleged that tens of thousands of scientists had declared global warming a hoax.Footnote 57 In a prior information environment, the story may have struggled to gain traction; in 2016, it was shared more than half a million times. US polling data across time is also suggestive of information silos: although more Americans now believe there is a scientific consensus on global warming, Republican voters have changed their views considerably less than independents or nonpartisans.Footnote 58 Because citizen support for curbing emissions is crucial for policy action in democracies, escalating flows of misinformation hurt both domestic and international progress.

A second way that information disorder may impact international cooperation is by reducing a state’s incentives to comply with international rules. States are motivated to meet international commitments, in part, because of the political and reputational backlash that accompanies noncompliance.Footnote 59 Information disorder allows leaders to fabricate circumstances that justify violations. Bolstering such claims with false news stories, doctored videos, and misleading facts may help shirk political costs that otherwise accompany rule-breaking behavior. Recent research shows that governments can mitigate domestic and international backlash by inventing persuasive pretexts for violations.Footnote 60 By undermining shared factual baselines, information disorder diminishes the normative and reputational constraints that encourage compliance.

Finally, information disorder may erode public trust in IOs, weakening their ability to operate in politically sensitive or contested arenas. Many IOs rely on public trust and legitimacy to deliver programmatic services. Examples include global health initiatives that surveil and treat infectious diseases, election observation missions that assess the fairness of democratic processes, and peacekeeping missions designed to stabilize postconflict settings. These functions require that the IO be perceived as credible and impartial in order to acquire local support and secure participation from member states.

Information disorder threatens these core sources of legitimacy. When misinformation and disinformation circulate widely—casting doubt on IO motives, spreading conspiracy theories, or misrepresenting IO activities—they undermine an IO’s credibility. Even false claims, when repeated or amplified in fragmented media environments, can sow suspicion and mistrust. As a result, IOs face more resistance from host governments, struggle to build local cooperation, and find their messaging dismissed due to perceived bias.

IO peacekeeping missions offer a clear illustration of these dynamics. Peace operations are deployed in highly contested environments where trust is scarce, making “local legitimacy” a crucial resource for IOs.Footnote 61 In addition to monitoring ceasefires and protecting civilians, peacekeeping missions must help local actors overcome commitment problems and mutual suspicion toward each other.Footnote 62 These efforts are increasingly undermined by disinformation campaigns designed to delegitimize IO interventions and deepen divisions between parties.Footnote 63 False allegations suggesting that UN peacekeepers are trafficking weapons, supporting terrorists, and exploiting natural resources have damaged UN peacekeeping missions in the Central African Republic, Mali, and the Democratic Republic of the Congo.Footnote 64

Implications for Conflict Emergence and Escalation

Information disorder may also have implications for conflict emergence and escalation. We focus specifically on mechanisms related to public attitudes and perceptions. A significant body of scholarship links public opinion and conflict, suggesting three key outcomes that could be affected by information disorder: intensified societal cleavages, enhanced opportunities for foreign influence campaigns, and eroded democratic advantages in crisis bargaining.

First, information disorder could increase the likelihood of civil conflicts by inflaming hostility and making it easier for strategic actors to sow internal discord. Research on political violence and mass atrocities finds that falsehoods, selective facts, and deliberate mischaracterizations of events are often used to inspire hate and fear, creating an environment more conducive to violence.Footnote 65 Social media platforms are likely to exacerbate such dynamics: more active social media use is associated with an increased number and severity of civil conflicts,Footnote 66 while exposure to hate speech online is associated with higher levels of hate crimes.Footnote 67 Governments looking to target an ethnic minority or marginalized population can exploit unconventional media channels to sow the seeds for violence. In Myanmar, for example, disinformation on Facebook played a significant role in state-sponsored violence against the Rohingya,Footnote 68 and pro-military actors continue to weaponize social media platforms to silence dissent.Footnote 69 Internal conflicts can easily spill over into neighboring states, leading to broader political crises.Footnote 70

Second, foreign governments can capitalize on information disorder to destabilize or sow discord. The digital technology revolution of the 1990s and 2000s introduced cyber-attacks as a form of nontraditional warfare, allowing cross-border interventions with little attribution. Directed disinformation campaigns are another such development. Technological advances like generative AI make it easier for foreign governments to create persuasive disinformation, and the social media ecosystem facilitates the viral spread of these messages. While online disinformation campaigns may not directly change foreign policy alignments,Footnote 71 they can affect public opinion,Footnote 72 exacerbate social and political cleavages, and heighten overall cynicism about politics.Footnote 73

Third, the erosion of democratic accountability discussed before may weaken democratic advantages in international conflict and crisis bargaining. Democratic peace theorists have long argued that democracies are less likely to fight wars with each other, in part due to electoral accountability.Footnote 74 Scholars also theorize that democratic states are able to send more credible signals in crisis bargaining because leaders face domestic costs from backing downFootnote 75 or from threatening force in the first place.Footnote 76 If information disorder disrupts domestic accountability in democratic states, we are likely to see these advantages in conflict resolution and crisis bargaining deteriorate.

Charting a Research Agenda

We conclude by outlining a forward-looking research agenda for International Relations scholars. We focus in particular on unresolved questions that will shape the scope, intensity, and trajectory of information disorder and international politics in the years to come.

A key area of uncertainty is how information disorder affects the beliefs and attitudes of political elites. Elites generally have higher levels of education, news literacy, and political knowledge, which should help them identify falsehoods and protect against misperceptions.Footnote 77 Political leaders have access to intelligence reporting and other formal vetting channels, which could insulate them from false beliefs. In practice, however, leaders sometimes cling to misinformed beliefs that contradict intelligence community findings.Footnote 78 Strong partisan attachments among elites may also offset knowledge and access to information, as individuals engage in motivated reasoning.Footnote 79 Future research should more precisely identify the conditions under which elite perceptions are shaped by information disorder dynamics and examine the extent to which this emerging information environment impacts how leaders make foreign policy decisions.

More broadly, scholars should investigate how susceptibility to information disorder varies across ideological, institutional, and geographic dimensions. Studies find that right-wing actors spread more misinformation than left-wing counterparts, though whether this reflects demand, supply, or underlying psychology remains contested.Footnote 80 Globally, the impact of disinformation also often diverges. For example, Russian narratives about Ukraine fell flat in much of the West but resonated in parts of the Global South.Footnote 81 Differences in political, social, and media systems likely condition how information disorder affects foreign policy. Future research should disentangle the asymmetries between left and right, Global North and Global South, or open and closed systems to better explain why some societies are more vulnerable than others.

Another set of questions concerns the durability of this information environment. Shifts in media production and consumption could alter the expectations described before, as could further advancements in AI. On the one hand, synthetic media (for example, deepfake videos) is likely to accelerate many of the processes described earlier. But AI is also shaping information consumption in ways that could generate unanticipated effects. Chatbots that deliver news might offer a bulwark against misinformation, since they tend to summarize the conventional wisdom. Alternatively, chatbots’ tendencies toward sycophancy could exacerbate media fragmentation and entrench misperceptions, as systems deliver content that conforms with users’ pre-existing biases.

Scholars should also consider how institutions and norms endogenously adapt to information disorder. Structural shifts in the information environment may prompt unanticipated changes in social norms, behavioral patterns, or even cognitive capacity.Footnote 82 Concerns about the political impacts of information disorder could trigger countermovements to combat these effects. Recent AI regulation initiatives in fora like the EU and G7 suggest the issue is entering the global governance agenda.Footnote 83 Similarly, domestic political procedures could emerge to slow or even reverse the dynamics described before.

Finally, there is a clear need for more empirical testing of the effects of information disorder on international politics. Scholars have made substantial progress tracking the spread of mis- and disinformation, as well as analyzing their effects on perceptions of the mass public. We have comparatively less evidence about how these dynamics filter into foreign policy decisions, inter-state bargaining, global governance, or other important processes in IR. Observational studies can leverage temporal shifts in the information environment as well as variation across political systems, states, and leaders to investigate these effects.

As they embark on these efforts, scholars should consider their own roles and responsibilities in a fragmented information landscape. The rise of information disorder is not just a scholarly puzzle but also a political crisis. Understanding how the modern information environment shapes international outcomes is a first step, but we must also identify institutions, norms, and strategies resilient enough to withstand misinformation and disinformation. Finally, scholars must consider how they communicate their findings if they are to leverage their expertise and break through the noise in a world of information disorder.

Acknowledgments

We thank Ashley Anderson, Caitlin Andrews-Lee, Bruce Bimber, Jeff Colgan, Michael Goldfien, Brian Guay, Amanda Kennard, Junghyun Lim, Neil O’Brian, Duane Morse, and Jakob Wiedekind, as well as anonymous reviewers and the editors of International Organization for valuable feedback on this paper.

Footnotes

1 McLoughlin and Brady 2024.

3 Wardle and Derakhshan 2017 use the term “information disorder” to describe three types of informational challenges: misinformation, disinformation, and malinformation (the public release of private information). We use the term more broadly to include a media environment with low barriers to content creation, rapid spread of false or misleading material, and algorithmic amplification of sensational narratives.

4 Bennett and Livingston Reference Lance and Livingston2018, 123.

7 O’Grady, Stern, and Morgunov Reference O’Grady, Stern and Morgunov2025.

11 See, for example, Keohane Reference Keohane1984; Fearon Reference Fearon1995.

12 Putnam Reference Putnam1988; Schutlz 1998; Martin Reference Martin2000.

13 Data from the global DE-CIX Internet exchange indicates global data traffic more than doubled since 2020; aggregate estimates suggest over 400 million terabytes of new data are generated daily (Intelligent CIO North America 2025).

17 Hermida Reference Hermida2010.

23 Existing studies find that political elites, like the mass public, engage in motivated reasoning and can hold systematic misperceptions, though higher levels of knowledge can sometimes reduce susceptibility to misinformation (Broockman and Skovron Reference Broockman and Skovron2018; Baekgaard et al. 2019; Christensen and Moynihan Reference Christensen and Moynihan2024; Pennycook and Rand Reference Pennycook and Rand2019).

25 IISD 2024.

26 World Health Organization 2025; NATO 2024.

27 World Economic Forum 2025.

33 Keck and Sikkink Reference Keck and Sikkink1998.

34 Kelley and Simmons Reference Kelley and Simmons2015, Reference Kelley and Simmons2019; Cooley and Snyder Reference Cooley and Snyder2015; Kelley Reference Kelley2017; Morse Reference Morse2019, Reference Morse2022. Such indicators may also be subject to bias and political manipulation (for example, Colgan Reference Colgan2019).

35 See, for example, Guess et al. Reference Guess, Lyons, Montgomery, Nyhan and Reifler2019a; Carnegie and Carson Reference Carnegie and Carson2020.

36 Schiff, Schiff, and Bueno Reference Schiff, Schiff and Bueno2025.

43 Guriev and Papaioannou Reference Guriev and Papaioannou2022.

45 Larreguy and Raffler Reference Larreguy and Raffler2025.

46 King, Pan, and Roberts Reference King, Pan and Roberts2013.

47 King, Pan, and Roberts Reference King, Pan and Roberts2017.

50 Thomas and Franca Reference Thomas and Franca2025.

51 Keohane Reference Keohane1984.

52 Chilton Reference Chilton2014; Kuzushima, McElwain, and Shiraito Reference Kuzushima, McElwain and Shiraito2023.

53 Carnegie, Clark, and Zucker Reference Carnegie, Clark and Zucker2024.

55 IPCC 2022, 58.

57 Lewandowsky, Ecker, and Cook Reference Lewandowsky, Ecker and Cook2017.

58 Data from the Yale Program on Climate Change Communication, available at <https://climatecommunication.yale.edu/visualizations-data/americans-climate-views/>.

62 Ruggeri, Gizelis, and Dorussen Reference Ruggeri, Gizelis and Dorussen2013.

64 Trithart Reference Trithart2022.

65 Badar and Florijančič Reference Badar and Florijančič2020; Albader Reference Albader2022.

66 Hunter and Biglaiser Reference Hunter and Biglaiser2022.

67 Wahlström, Törnberg, and Ekbrand Reference Wahlström, Törnberg and Ekbrand2021.

68 United Nations Human Rights Council 2018.

69 Advox 2023.

70 Salehyan and Gleditsch Reference Salehyan and Skrede Gleditsch2006.

71 Lanoszka Reference Lanoszka2019.

72 For example, a 2025 YouGov survey found that more than half of US respondents believe at least some Kremlin-originated disinformation about Ukraine (Woollacott Reference Woollacott2025).

74 Russett Reference Russett1994.

75 See, for example, Fearon Reference Fearon1994; Schultz Reference Schultz1998; Smith Reference Smith1998; Tomz Reference Tomz2007b, among others.

76 Kertzer and Brutger Reference Kertzer and Brutger2016.

78 For example, President Trump has openly questioned conclusion of the US intelligence community on Russian interference in the 2016 election, COVID-19 origins, and Saudi Arabia’s assassination of Jamal Khashoggi.

82 Shanmugasundaram and Tamilarasu Reference Shanmugasundaram and Tamilarasu2023.

83 Cupać and Sienknecht Reference Cupać and Sienknecht2024.

References

Advox (Global Voices). 2023. How Military Supporters Are Using Telegram Channels to Suppress Dissent in Myanmar. Global Voices Advox, 30 January. Available at <https://advox.globalvoices.org/2023/01/30/how-military-supporters-are-using-telegram-channels-to-suppress-dissent-in-myanmar/>.
Albader, Fatemah. 2022. Disinformation on Social Media: A Risk Factor for Mass Atrocity Occurrence. ILSA Journal of International and Comparative Law 29:117.
Allcott, Hunt, Braghieri, Luca, Eichmeyer, Sarah, and Gentzkow, Matthew. 2020. The Welfare Effects of Social Media. American Economic Review 110 (3):629–76.
Ashley, Seth, Craft, Stephanie, Maksl, Adam, Tully, Melissa, and Vraga, Emily K. 2023. Can News Literacy Help Reduce Belief in COVID Misinformation? Mass Communication and Society 26 (4):695–719.
Axelrod, Robert. 1981. The Emergence of Cooperation Among Egoists. American Political Science Review 75 (2):306–18.
Badar, Mohamed, and Florijančič, Polona. 2020. Assessing Incitement to Hatred As a Crime Against Humanity of Persecution. International Journal of Human Rights 24 (5):656–87.
Bækgaard, Martin, Christensen, Julian, Dahlmann, Casper Mondrup, Mathiasen, Asbjørn Lanng, and Petersen, Niels Bjørn Grund. 2019. The Role of Evidence in Politics: Motivated Reasoning and Persuasion Among Politicians. British Journal of Political Science 49 (3):1117–40.
Bakshy, Eytan, Messing, Solomon, and Adamic, Lada A. 2015. Exposure to Ideologically Diverse News and Opinion on Facebook. Science 348 (6239):1130–32.
Baum, Matthew A., and Potter, Philip B.K. 2015. War and Democratic Constraint: How the Public Influences Foreign Policy. Princeton University Press.
Bennett, W. Lance, and Livingston, Steven. 2018. The Disinformation Order: Disruptive Communication and the Decline of Democratic Institutions. European Journal of Communication 33 (2):122–39.
Bowles, Jeremy, Croke, Kevin, Larreguy, Horacio, Liu, Shelley, and Marshall, John. 2025. Sustaining Exposure to Fact-Checks: Misinformation Discernment, Media Consumption, and Its Political Implications. American Political Science Review, published online.
Brady, William J., Wills, Julian A., Jost, John T., Tucker, Joshua A., and Van Bavel, Jay J. 2017. Emotion Shapes the Diffusion of Moralized Content in Social Networks. Proceedings of the National Academy of Sciences 114 (28):7313–18.
Broockman, David E., and Skovron, Christopher. 2018. Bias in Perceptions of Public Opinion Among Political Elites. American Political Science Review 112 (3):542–63.
Brulle, Robert J. 2014. Institutionalizing Delay: Foundation Funding and the Creation of US Climate Change Counter-Movement Organizations. Climatic Change 122 (4):681–94.
Carnegie, Allison, and Carson, Austin. 2020. Secrets in Global Governance: Disclosure Dilemmas and the Challenge of International Cooperation. Cambridge Studies in International Relations, Vol. 154. Cambridge University Press.
Carnegie, Allison, Clark, Richard, and Zucker, Noah. 2024. Global Governance under Populism: The Challenge of Information Suppression. World Politics 76 (4):639–66.
Chilton, Adam S. 2014. The Influence of International Human Rights Agreements on Public Opinion: An Experimental Study. Chicago Journal of International Law 15 (1).
Christensen, Julian, and Moynihan, Donald P. 2024. Motivated Reasoning and Policy Information: Politicians Are More Resistant to Debiasing Interventions than the General Public. Behavioural Public Policy 8 (1):47–68.
Ciampaglia, Giovanni Luca, Flammini, Alessandro, and Menczer, Filippo. 2015. The Production of Information in the Attention Economy. Scientific Reports 5:9452.
Colgan, Jeff D. 2019. American Bias in Global Security Studies Data. Journal of Global Security Studies 4 (3):358–71.
Cooley, Alexander, and Snyder, Jack, eds. 2015. Ranking the World: Grading States As a Tool of Global Governance. Cambridge University Press.
Cupać, Jelena, and Sienknecht, Mitja. 2024. Regulate Against the Machine: How the EU Mitigates AI Harm to Democracy. Democratization 31 (5):1067–90.
Dunlap, Riley E., and McCright, Aaron M. 2011. Organized Climate Change Denial. In The Oxford Handbook of Climate Change and Society, edited by Dryzek, John S., Norgaard, Richard B., and Schlosberg, David, 144–60. Oxford University Press.
Eady, Gregory, Paskhalis, Tom, Zilinsky, Jan, Bonneau, Richard, Nagler, Jonathan, and Tucker, Joshua A. 2023. Exposure to the Russian Internet Research Agency Foreign Influence Campaign on Twitter in the 2016 US Election and Its Relationship to Attitudes and Voting Behavior. Nature Communications 14:62.
Farrell, Justin. 2016. Corporate Funding and Ideological Polarization about Climate Change. Proceedings of the National Academy of Sciences 113 (1):92–97.
Fearon, James D. 1994. Domestic Political Audiences and the Escalation of International Disputes. American Political Science Review 88 (3):577–92.
Fearon, James D. 1995. Rationalist Explanations for War. International Organization 49 (3):379–414.
Fearon, James D. 1997. Signaling Foreign Policy Interests: Tying Hands Versus Sinking Costs. Journal of Conflict Resolution 41 (1):68–90.
Gould, Joe. 2025. Defense Deputy Secretary Pick Avoids Saying Russia Invaded Ukraine. Politico, 25 February. Available at <https://www.politico.com/news/2025/02/25/pentagon-defense-russia-ukraine-00206008>.
Greene, Kevin T. 2024. Partisan Differences in the Sharing of Low-Quality News Sources by US Political Elites. Political Communication 41 (3):373–92.
Guay, Brian, Berinsky, Adam J., Pennycook, Gordon, and Rand, David G. 2023. How to Think About Whether Misinformation Interventions Work. Nature Human Behaviour 7:1231–33.
Guay, Brian, Berinsky, Adam J., Pennycook, Gordon, and Rand, David G. 2025. Examining Partisan Asymmetries in Fake News Sharing and the Efficacy of Accuracy Prompt Interventions. Journal of Politics (forthcoming). Preprint available at <https://osf.io/preprints/psyarxiv/y762k_v1>.
Guess, Andrew M., Lyons, Benjamin A., Montgomery, Jacob M., Nyhan, Brendan, and Reifler, Jason. 2019a. Fake News, Facebook Ads, and Misperceptions: Assessing Information Quality in the 2018 US Midterm Election Campaign. Public report, February. Available at <https://sites.dartmouth.edu/nyhan/files/2021/03/fake-news-2018.pdf>.
Guess, Andrew, Nagler, Jonathan, and Tucker, Joshua. 2019b. Less Than You Think: Prevalence and Predictors of Fake News Dissemination on Facebook. Science Advances 5 (1).
Guess, Andrew, Nyhan, Brendan, and Reifler, Jason. 2020. Exposure to Untrustworthy Websites in the 2016 US Election. Nature Human Behaviour 4:472–80.
Guess, A.M., Malhotra, N., Pan, J., Barberá, P., Allcott, H., Brown, T., Crespo-Tenorio, A., Dimmery, D., Freelon, D., Gentzkow, M., and González-Bailón, S. 2023a. How Do Social Media Feed Algorithms Affect Attitudes and Behavior in an Election Campaign? Science 381 (6656):398–404.
Guess, A.M., Malhotra, N., Pan, J., Barberá, P., Allcott, H., Brown, T., Crespo-Tenorio, A., Dimmery, D., Freelon, D., Gentzkow, M., and González-Bailón, S. 2023b. Reshares on Social Media Amplify Political News But Do Not Detectably Affect Beliefs or Opinions. Science 381 (6656):404–408.
Guriev, Sergei, Melnikov, Nikita, and Zhuravskaya, Ekaterina. 2021. 3G Internet and Confidence in Government. The Quarterly Journal of Economics 136 (4):2533–613.
Guriev, Sergei, and Papaioannou, Elias. 2022. The Political Economy of Populism. Journal of Economic Literature 60 (3):753–832.
Guzman, Andrew T. 2008. How International Law Works: A Rational Choice Theory. Oxford University Press.
Hameleers, Michael, Brosius, Anna, Marquart, Franziska, Goldberg, Andreas C., van Elsas, Erika, and de Vreese, Claes H. 2022. Mistake or Manipulation? Conceptualizing Perceived Mis- and Disinformation Among News Consumers in Ten European Countries. Communication Research 49 (7):919–41.
Hermida, Alfred. 2010. Twittering the News: The Emergence of Ambient Journalism. Journalism Practice 4 (3):297–308.
Hunter, Lance Y., and Biglaiser, Glen. 2022. The Effects of Social Media, Elites, and Political Polarization on Civil Conflict. Studies in Conflict and Terrorism 48 (10):1127–64.
Huszár, Ferenc, Ktena, Sofia Ira, O’Brien, Conor, Belli, Luca, Schlaikjer, Andrew, and Hardt, Moritz. 2022. Algorithmic Amplification of Politics on Twitter. Proceedings of the National Academy of Sciences 119 (1):e2025334119.
Huth, Paul K., Croco, Sarah E., and Appel, Benjamin J. 2013. Bringing Law to the Table: Legal Claims, Focal Points, and the Settlement of Territorial Disputes Since 1945. American Journal of Political Science 57 (1):90–103.
IISD. 2024. UN Launches Global Principles for Information Integrity. IISD SDG Knowledge Hub, 12 July. Available at <https://sdg.iisd.org/news/un-launches-global-principles-for-information-integrity/>.
Intelligent CIO North America. 2025. Global Data Traffic Volume Hits New Record-Breaking High at Internet Exchanges. 22 January. Available at <https://www.intelligentcio.com/north-america/2025/01/22/global-data-traffic-volume-hits-new-record-breaking-high-at-internet-exchanges/>.
Intergovernmental Panel on Climate Change (IPCC). 2022. Climate Change 2022: Mitigation of Climate Change. Technical Summary. Contribution of Working Group III to the Sixth Assessment Report of the Intergovernmental Panel on Climate Change. Cambridge University Press. Available at <https://www.ipcc.ch/report/ar6/wg3/downloads/report/IPCC_AR6_WGIII_TechnicalSummary.pdf>.
Jerit, Jennifer, and Zhao, Yangzi. 2020. Political Misinformation. Annual Review of Political Science 23 (1):77–94.
Jones-Jang, S. Mo, Mortensen, Tara, and Liu, Jingjing. 2021. Does Media Literacy Help Identification of Fake News? Information Literacy Helps, but Other Literacies Don’t. American Behavioral Scientist 65 (2):371–88.
Kapstein, Ethan B. 1994. Governing the Global Economy: International Finance and the State. Harvard University Press.
Keck, Margaret E., and Sikkink, Kathryn. 1998. Activists Beyond Borders: Advocacy Networks in International Politics. Cornell University Press.
Kelley, Judith G. 2017. Scorecard Diplomacy: Grading States to Influence Their Reputation and Behavior. Cambridge University Press.
Kelley, Judith G., and Simmons, Beth A. 2015. Politics by Number: Indicators as Social Pressure in International Relations. American Journal of Political Science 59 (1):55–70.
Kelley, Judith G., and Simmons, Beth A. 2019. Introduction: The Power of Global Performance Indicators. International Organization 73 (3):491–510.
Keohane, Robert O. 1984. After Hegemony: Cooperation and Discord in the World Political Economy. Princeton University Press.
Keohane, Robert O. 1986. Reciprocity in International Relations. International Organization 40 (1):1–27.
Kertzer, Joshua D., and Brutger, Ryan. 2016. Decomposing Audience Costs: Bringing the Audience Back into Audience Cost Theory. American Journal of Political Science 60 (1):234–49.
King, Gary, Pan, Jennifer, and Roberts, Margaret E. 2013. How Censorship in China Allows Government Criticism but Silences Collective Expression. American Political Science Review 107 (2):326–43.
King, Gary, Pan, Jennifer, and Roberts, Margaret E. 2017. How the Chinese Government Fabricates Social Media Posts for Strategic Distraction, Not Engaged Argument. American Political Science Review 111 (3):484–501.
Kuzushima, Saki, McElwain, Kenneth Mori, and Shiraito, Yuki. 2024. Public Preferences for International Law Compliance: Respecting Legal Obligations or Conforming to Common Practices? The Review of International Organizations 19 (1):63–93.
Kyrychenko, Yara, Koo, Hyunjin J., Maertens, Rakoen, Roozenbeek, Jon, van der Linden, Sander, and Götz, Friedrich M. 2025. Profiling Misinformation Susceptibility. Personality and Individual Differences 241:113–77.
Lanoszka, Alexander. 2019. Disinformation in International Politics. European Journal of International Security 4 (2):227–48.
Larreguy, Horacio, and Raffler, Pia J. 2025. Accountability in Developing Democracies: The Impact of the Internet, Social Media, and Polarization. Annual Review of Political Science 28 (1):413–34.
Lasser, Jana, Aroyehun, Segun Taofeek, Simchon, Almog, Carrella, Fabio, Garcia, David, and Lewandowsky, Stephan. 2022. Social Media Sharing of Low-Quality News Sources by Political Elites. PNAS Nexus 1 (4):186.
Levy, Ro’ee. 2021. Social Media, News Consumption, and Polarization: Evidence from a Field Experiment. American Economic Review 111 (3):831–70.
Lewandowsky, Stephan, Ecker, Ullrich K.H., and Cook, John. 2017. Beyond Misinformation: Understanding and Coping with the “Post-Truth” Era. Journal of Applied Research in Memory and Cognition 6 (4):353–69.
Martin, Lisa L. 2000. Democratic Commitments: Legislatures and International Cooperation. Princeton University Press.
McLoughlin, Killian L., and Brady, William J. 2024. Human-Algorithm Interactions Help Explain the Spread of Misinformation. Current Opinion in Psychology 56:101770.
Morgenthau, Hans J. 1948. Politics Among Nations: The Struggle for Power and Peace. Alfred A. Knopf.
Morse, Julia C. 2019. Blacklists, Market Enforcement, and the Global Regime to Combat Terrorist Financing. International Organization 73 (3):511–45.
Morse, Julia C. 2022. The Bankers’ Blacklist: Unofficial Market Enforcement and the Global Fight Against Illicit Financing. Cornell University Press.
Morse, Julia C. 2025. Information Fragmentation and Global Governance in Hard Times. Ethics and International Affairs 39 (2):159–72.
Morse, Julia C., and Pratt, Tyler. 2022. Strategies of Contestation: International Law, Domestic Audiences, and Image Management. Journal of Politics 84 (4):2080–93.
Morse, Julia C., and Pratt, Tyler. 2025. Smoke and Mirrors: Strategic Messaging and the Politics of Noncompliance. American Political Science Review, First View.
Mosleh, Mohsen, and Rand, David G. 2022. Measuring Exposure to Misinformation from Political Elites on Twitter. Nature Communications 13:7144.
Muhammed, Sadiq T., and Mathew, Saji K. 2022. The Disaster of Misinformation: A Review of Research in Social Media. International Journal of Data Science and Analytics 13 (4):271–85.
NATO. 2024. Setting the Record Straight: De-bunking Russian Disinformation on NATO. 24 October. Available at <https://www.nato.int/cps/en/natohq/115204.htm>.
Newman, Nic, Fletcher, Richard, Robertson, Craig T., Arguedas, Amy Ross, and Kleis Nielsen, Rasmus. 2024. Reuters Institute Digital News Report 2024. Reuters Institute for the Study of Journalism.
Nomikos, William G. 2025. Local Peace, International Builders: How UN Peacekeeping Builds Peace from the Bottom Up. Cambridge University Press.
Nyhan, Brendan, Porter, Ethan, Reifler, Jason, and Wood, Thomas J. 2020. Taking Fact-Checks Literally But Not Seriously? The Effects of Journalistic Fact-Checking on Factual Beliefs and Candidate Favorability. Political Behavior 42 (2):939–60.
Nyhan, Brendan, and Reifler, Jason. 2010. When Corrections Fail: The Persistence of Political Misperceptions. Political Behavior 32 (2):303–30.
Ognyanova, Katherine, Lazer, David, Robertson, Ronald E., and Wilson, Christo. 2020. Misinformation in Action: Fake News Exposure Is Linked to Lower Trust in Media, Higher Trust in Government When Your Side Is in Power. Harvard Kennedy School Misinformation Review 1 (4).
O’Grady, Siobhán, Stern, David L., and Morgunov, Serhiy. 2025. Echoing Kremlin, Trump Calls Zelensky a Dictator, Angering Ukrainians. The Washington Post, 19 February. Available at <https://www.washingtonpost.com/world/2025/02/19/ukraine-russia-trump-elections-zelensky/>.
Oreskes, Naomi, and Conway, Erik M. 2010. Merchants of Doubt: How a Handful of Scientists Obscured the Truth on Issues from Tobacco Smoke to Global Warming. Bloomsbury Press.
Paletz, Susannah B.F., Johns, Michael A., Murauskaite, Egle E., Golonka, Ewa M., Pandža, Nick B., Rytting, C. Anton, Buntain, Cody, and Ellis, Devin. 2023. Emotional Content and Sharing on Facebook: A Theory Cage Match. Science Advances 9 (39):eade9231.
Partington, Richard. 2025. Fact Check: Are US Tariffs Really Bringing in $2b a Day as Trump Claims? The Guardian, 9 April. Available at <https://www.theguardian.com/us-news/2025/apr/09/fact-check-donald-trump-tariffs-revenue>.
Pennycook, Gordon, and Rand, David G. 2019. Lazy, Not Biased: Susceptibility to Partisan Fake News Is Better Explained by Lack of Reasoning Than by Motivated Reasoning. Cognition 188:39–50.
Peterson, Erik, and Kagalwala, Ali. 2021. When Unfamiliarity Breeds Contempt: How Partisan Selective Exposure Sustains Oppositional Media Hostility. American Political Science Review 115 (2):585–98.
Pollet, Mathieu. 2025. Fact-Checkers Under Fire As Big Tech Pulls Back. Politico Europe, 28 January. Available at <https://www.politico.eu/article/fact-checkers-under-fire-meta-big-tech-censorship-mark-zuckerberg-donald-trump/>.
Powell, Emilia Justyna, and Mitchell, Sara McLaughlin. 2007. The International Court of Justice and the World’s Three Legal Systems. Journal of Politics 69 (2):397–415.
Presl, Dominik. 2024. Russia Is Winning the Global Information War. RUSI Commentary, 7 May. Available at <https://www.rusi.org/explore-our-research/publications/commentary/russia-winning-global-information-war>.
Putnam, Robert D. 1988. Diplomacy and Domestic Politics: The Logic of Two-Level Games. International Organization 42 (3):427–60.
Qiu, Linda. 2025. Trump Has Long Misled with Claims About Global Trade and Tariffs. New York Times, 11 April. Available at <https://www.nytimes.com/article/trump-tariffs-trade-economy-fact-check.html>.
Ruggeri, Andrea, Gizelis, Theodora-Ismene, and Dorussen, Han. 2013. Managing Mistrust: An Analysis of Cooperation with UN Peacekeeping in Africa. Journal of Conflict Resolution 57 (3):387–409.
Russett, Bruce. 1994. Grasping the Democratic Peace: Principles for a Post–Cold War World. Princeton University Press.
Rydén, Pernilla, Laine, Sanni, Ahlfors, Susanna, Pylyser, Benoît, and Alberoth, Jonas. 2023. Tackling Mis- and Disinformation: Seven Insights for UN Peace Operations. SIPRI WritePeace Blog, 4 October. Available at <https://www.sipri.org/commentary/blog/2023/tackling-mis-and-disinformation-seven-insights-un-peace-operations>.
Salehyan, Idean, and Skrede Gleditsch, Kristian. 2006. Refugees and the Spread of Civil War. International Organization 60 (2):335–66.
Schelling, Thomas C. 1960. The Strategy of Conflict. Harvard University Press.
Schiff, Kaylyn Jackson, Schiff, Daniel S., and Bueno, Natália S. 2024. The Liar’s Dividend: Can Politicians Claim Misinformation to Evade Accountability? American Political Science Review 119 (1):71–90.
Schultz, Kenneth A. 1998. Domestic Opposition and Signaling in International Crises. American Political Science Review 92 (4):829–44.
Shanmugasundaram, Mathura, and Tamilarasu, Arunkumar. 2023. The Impact of Digital Technology, Social Media, and Artificial Intelligence on Cognitive Functions: A Review. Frontiers in Cognition 2:1203077.
Simmons, Beth A. 2000. International Law and State Behavior: Commitment and Compliance in International Monetary Affairs. American Political Science Review 94 (4):819–35.
Smith, Alastair. 1998. International Crises and Domestic Politics. American Political Science Review 92 (3):623–38.
Smith, Adam. 2025. How Are Trump’s Policies Affecting Global AI Safety Laws? Context News, 5 March. Available at <https://www.context.news/ai/how-are-trumps-policies-affecting-global-ai-safety-laws>.
Sprick, Daniel. 2025. Aligning AI with China’s Authoritarian Value System. The Diplomat, 3 February. Available at <https://thediplomat.com/2025/02/aligning-ai-with-chinas-authoritarian-value-system/>.
Sundar, S. Shyam, Lin, Ruoyun, and Kang, Hyunjin. 2024. News Sharing Without Reading: Prevalence and Consequences on Facebook. Nature Human Behaviour 8:1024–35.
Sundar, S. Shyam, Snyder, Eugene Cho, Liao, Mengqi, Yin, Junjun, Wang, Jinping, and Chi, Guangqing. 2025. Sharing without Clicking on News in Social Media. Nature Human Behaviour 9:156–68.
Sunstein, Cass R. 2017. #Republic: Divided Democracy in the Age of Social Media. Princeton University Press.
Thomas, James, and Franca, Talyta. 2025. Which European Countries Are Most Exposed to Russian Disinformation? Euronews, 18 April. Available at <https://www.euronews.com/my-europe/2025/04/18/which-european-countries-are-most-exposed-to-russian-disinformation>.
Tomz, Michael. 2007a. Reputation and International Cooperation: Sovereign Debt Across Three Centuries. Princeton University Press.
Tomz, Michael. 2007b. Domestic Audience Costs in International Relations: An Experimental Approach. International Organization 61 (4):821–40.
Törnberg, Petter, and Chueri, Juliana. 2025. When Do Parties Lie? Misinformation and Radical-Right Populism Across Twenty-six Countries. The International Journal of Press/Politics, advance online publication. Available at <doi:10.1177/19401612241311886>.
Trithart, Albert. 2022. Disinformation against UN Peacekeeping Operations. International Peace Institute Issue Brief, November. Available at <https://www.ipinst.org/2022/11/disinformation-against-un-peacekeeping-operations>.
United Nations Human Rights Council. 2018. Report of the Independent International Fact-Finding Mission on Myanmar. A/HRC/39/64, 12 September. Geneva: United Nations.
Vaccari, Cristian, and Chadwick, Andrew. 2020. Deepfakes and Disinformation: Exploring the Impact of Synthetic Political Video on Deception, Uncertainty, and Trust in News. Social Media + Society 6 (1):1–13.
Vázquez-Herrero, Jorge, Negreira-Rey, María-Cruz, and Sixto-García, José. 2022. Mind the Gap! Journalism on Social Media and News Consumption Among Young Audiences. International Journal of Communication 16:4424–44.
Ventura, Tiago, Majumdar, Rajeshwari, Nagler, Jonathan, and Tucker, Joshua. Forthcoming. Misinformation Beyond Traditional Feeds: Evidence from a WhatsApp Deactivation Experiment in Brazil. Journal of Politics. Available at <https://www.journals.uchicago.edu/doi/10.1086/737172>.
Voo, Julia. 2024. Driving Wedges: China’s Disinformation Campaigns in the Asia-Pacific. In Asia-Pacific Regional Security Assessment 2024, edited by the International Institute for Strategic Studies (IISS), chap. 5. IISS/Routledge. Available at <https://www.iiss.org/publications/strategic-dossiers/asia-pacific-regional-security-assessment-2024/chapter-5/>.
Vosoughi, Soroush, Roy, Deb, and Aral, Sinan. 2018. The Spread of True and False News Online. Science 359 (6380):1146–51.
Wahlström, Mattias, Törnberg, Anton, and Ekbrand, Hans. 2021. Dynamics of Violent and Dehumanizing Rhetoric in Far-Right Social Media. New Media and Society 23 (11):3290–311.
Waltz, Kenneth N. 1979. Theory of International Politics. Addison-Wesley.
Wardle, Claire, and Derakhshan, Hossein. 2017. Information Disorder: Toward an Interdisciplinary Framework for Research and Policymaking. Strasbourg: Council of Europe.
Whalan, Jeni. 2017. The Local Legitimacy of Peacekeepers. Journal of Intervention and Statebuilding 11 (3):306–20.
Williams, Emily L., Bartone, Sydney A., Swanson, Emma K., and Stokes, Leah C. 2022. The American Electric Utility Industry’s Role in Promoting Climate Denial, Doubt, and Delay. Environmental Research Letters 17 (10):104026.
Wolf, Zachary B. 2025. Zelensky Invades Trump’s Disinformation Space—and It Could Really Hurt Ukraine. CNN, 1 March. Available at <https://edition.cnn.com/2025/02/28/politics/zelensky-trump-russia-ukraine-war>.
Woollacott, Emma. 2025. Americans Believe Russian Disinformation “To Alarming Degree.” Forbes, 24 April. Available at <https://www.forbes.com/sites/emmawoollacott/2025/04/22/americans-believe-russian-disinformation-to-alarming-degree/>.
World Economic Forum. 2025. Global Risks Report 2025: Conflict, Environment and Disinformation Top Threats. World Economic Forum press release, 15 January. Available at <https://www.weforum.org/press/2025/01/global-risks-report-2025-conflict-environment-and-disinformation-top-threats/>.
World Health Organization. 2025. Combatting Misinformation Online. WHO Digital Health and Innovation Team, World Health Organization. Available at <https://www.who.int/teams/digital-health-and-innovation/digital-channels/combatting-misinformation-online>.
Xu, Jinghong, He, Ziyu, Guo, Difan, and Ding, Ying. 2024. Relationship Between News Overload and News Avoidance: A Meta-Analysis. Journalism, advance online publication. Available at <doi:10.1177/14648849241299667>.
Zhuravskaya, Ekaterina, Petrova, Maria, and Enikolopov, Ruben. 2020. Political Effects of the Internet and Social Media. Annual Review of Economics 12:415–38.