
Neurotechnology and international security

Predicting commercial and military adoption of brain-computer interfaces (BCIs) in the United States and China

Published online by Cambridge University Press:  08 February 2022

Margaret Kosal*
Georgia Institute of Technology, USA
Joy Putney
Georgia Institute of Technology, USA
Corresponding author: Margaret Kosal, Sam Nunn School of International Affairs, Georgia Institute of Technology, Atlanta, Georgia, USA. Email:


In the past decade, international actors have launched “brain projects” or “brain initiatives.” One of the emerging technologies enabled by these publicly funded programs is brain-computer interfaces (BCIs), which are devices that allow communication between the brain and external devices like a prosthetic arm or a keyboard. BCIs are poised to have significant impacts on public health, society, and national security. This research presents the first analytical framework that attempts to predict the dissemination of neurotechnologies to both the commercial and military sectors in the United States and China. While China started its project later with less funding, we find that it has other advantages that make earlier adoption more likely. We also articulate national security risks implicit in later adoption, including the inability to set international ethical and legal norms for BCI use, especially in wartime operating environments, and data privacy risks for citizens who use technology developed by foreign actors.

Research Article
Creative Commons
This is an Open Access article, distributed under the terms of the Creative Commons Attribution licence, which permits unrestricted re-use, distribution, and reproduction in any medium, provided the original work is properly cited.
© The Author(s), 2022. Published by Cambridge University Press on behalf of the Association for Politics and the Life Sciences


The human brain, with its approximately 86 billion interconnected neurons, enables our higher cognitive abilities without being a physiological outlier among primate species (Herculano-Houzel, 2012). This incredibly complex organ can also host debilitating clinical disorders, including seizures, depression and anxiety, Parkinson’s disease, and many others. Unlocking the secrets of the human brain is one of the largest scientific challenges ever undertaken. Advances in neuroscience research have furthered our understanding of the brain and enabled technologies to read and write brain activity, with implications for understanding behavior and decision-making.

In the past decade, the United States and the People’s Republic of China have begun large neuroscience research projects, along with several other international actors, including Canada, Korea, Japan, Australia, and the European Union (EU) (International Brain Initiative, n.d.). These brain initiatives have ambitious goals that are only now possible with advances in genetic tools and imaging techniques (National Institutes of Health, n.d.c). The U.S. BRAIN (Brain Research through Advancing Innovative Neurotechnologies) Initiative has a stated goal of “accelerating the development and application of innovative technologies … to produce a revolutionary new dynamic picture of the brain that … shows how individual cells and complex neural circuits interact in both time and space” (National Institutes of Health, n.d.b). Additionally, the BRAIN Initiative was highlighted as a major component of the United States’ innovation strategy for breakthrough technologies (Office of Science and Technology Policy, 2015). The China Brain Project’s framework focuses on developing technologies both for diagnosing and treating brain disorders and for mimicking human intelligence and connecting humans and machines (Poo et al., 2016).

While the initial public focus of these brain projects has been clinical technologies to cure disease and restore faculties, such research is likely to enable augmentation and other uses for healthy people in both the commercial and military sectors. The U.S. BRAIN Initiative and the China Brain Project emphasize the importance of neuroscience research to treat and prevent brain disorders in aging populations, but they also highlight additional goals: accelerating basic research in the United States and applied research in human-machine teaming in China (National Institutes of Health, n.d.b; Poo et al., 2016). The dual-use technologies made possible by all seven brain projects are likely to have profound implications for society, public health, and national security. From a national security perspective, it would be valuable to be able to predict the dissemination of neurotechnologies to both the commercial and military sectors. This research presents the first analytical framework that attempts to do so.

Like past scientific and technological breakthroughs, cognitive sciences research is purported to have an impact on future security policies. Since the cognitive sciences focus on humans, they are intricately tied to the study of social processes, including the realms of politics and security. International scientific bodies, including the U.S. National Academy of Sciences and the United Kingdom’s Royal Society, have also engaged in discussions on the field’s policy relevance (National Research Council, 2008, 2009; Royal Society, 2012). Experts involved in developing NATO’s New Strategic Concept released in 2010 noted that “less predictable is the possibility that research breakthroughs will transform the technological battlefield,” and that “allies and partners should be alert for potentially disruptive developments in such dynamic areas as information and communications technology, cognitive and biological sciences, robotics, and nanotechnology” (NATO, 2010, p. 15; emphasis added). Interest in neurosciences is not limited to any specific nation, and the potential for new technology to affect conflict and cooperation has been recognized.

Security-related research in neuroscience and the operationalization of scientific discoveries into commercial or deployable neurotechnologies present concerns for policymakers and scholars. Not only do the effects and use of military applications of cognitive science and neuroscience research require their attention, but the growing understanding of cognitive processes also continues to provide new perspectives on how we understand policy and politics.

Consideration of the challenges to international security and policymaking as a result of scientific and technological advancements is not novel. Anticipating and responding to potential emerging threats to security and understanding disruptive technologies are intrinsic to the security dilemma. Consideration of the relationship between technology and conflict has a substantial and deep history across the social sciences as a determinant of global power (Turner, 1943). The classical realist thinker Hans Morgenthau (1972) critiqued the role of modern technology, technocrats, and the resultant “technological revolutions” in international politics and asserted that a rethinking of traditional international relations theory and government structures was required as a result. Bridging sociology and political science, William Ogburn (1949) was one of the first academics to focus on “inventions” or innovation, in the context of origins, diffusion, and effects, and to systematically study the social effects of innovation on international politics. More recent work by scholars such as Susan Strange (1996) has explored how technology and technologically enabled systems have affected concepts of geographically demarcated state sovereignty and autonomy.

The importance and security implications of how well a state is able to translate basic research into applied and commercial technology is another area of research in security studies that is particularly relevant to this work, as it explores how concepts from basic neuroscience can be developed and applied to innovations in a security context (Dombrowski & Gholz, 2006; Skolnikoff, 1993). One of the most notable examples in the study of technology’s impact on state interactions is the invention of nuclear weapons and the reconfiguration of strategic logic to deterrence (Gaddis, 1989; Herz, 1959; Meyer, 1984; Sagan & Waltz, 2012). The mutual assured destruction logic underlying nuclear deterrence constrains a state’s choice of strategy. Within security studies, there is a rich literature theorizing and empirically exploring the intersection of science and technology with the outcomes of armed conflict (Arquilla, 2002; Biddle, 2004; Chin, 2019; Kosal, 2019; O’Hanlon, 2009; Rosen, 1991; Skolnikoff, 1993; Solingen, 1994).

For strategists and scholars of the revolution in military affairs, which focuses on emerging technologies and posits that military technological transformations, together with the accompanying organizational and doctrinal adaptations, can lead to new forms of warfare, the nexus between technology and military affairs bears directly on the propensity for conflict and the outcomes of war, as well as the efficacy of security cooperation and coercive statecraft (Bernstein & Libicki, 1998; Blank, 1984; Cohen, 1996; Herspring, 1987; Krepinevich, 1994; Mahnken, 2010; McKitrick et al., 1995; Nofi, 2006). These discussions underpin the concept of network-centric warfare: operations that link combatants and military platforms to each other to facilitate information sharing, made possible by progress in information technologies (Arquilla, 2002).

In the late twentieth century, one predominant model for understanding the conditions under which conflict and cooperation are likely, and how technology can contribute to increasing or decreasing instability in the international system, was the offense-defense model (Jervis, 1978). This theory asserts that more complete information on the intentions of rivals allows both sides to manage a spiraling arms race. Awareness of aggression allows for coalition building and diplomatic action to preemptively quell belligerence. An advantage of defensive technology over offensive technology worsens the attacker’s cost-benefit calculus and, in the words of Clausewitz (2013), “tame[s] the elementary impetuosity of War.” Offensive ascendancy, conversely, creates a sense of urgency for states to develop greater offensive capabilities and seek out alliances, further increasing tensions. Emerging technologies, such as nanotechnologically enabled metamaterials, biotechnology, and neurotechnology, may problematize offense-defense theory by challenging the distinction between offensive and defensive weapons (Kosal & Stayton, 2019). This work does not seek to resolve that issue but furthers the scholarly debate about how emerging technologies affect international security.

Another area of scholarly work has considered and problematized the ways emerging technologies may challenge existing laws, including the law of armed conflict, international environmental law, and arms control treaties, and the need to govern the introduction, implementation, and use of emerging technologies as a means or method of warfare (Leins, 2021; Nasu & McLaughlin, 2014).

This research focuses on analyzing one emerging technology that has been identified as having security significance: the commercial and military adoption of brain-computer interfaces (BCIs), a nascent neurotechnology, by the United States and China. These two nations are the focus because they are among the largest spenders on brain projects, and they are increasingly seen as peer economic and military competitors, with particular recent emphasis on technological competitiveness between the two (Dobbins et al., 2018; Ferchen, 2020; Lewis, 2018; O’Rourke, 2020; Rasser, 2020; Wray, 2020). The 2018 U.S. National Defense Strategy highlighted “long-term, strategic competition” with China as a top priority in the context of great power competition (Campbell & Ratner, 2018; Colby & Mitchell, 2020; Jones, 2020; Tellis, 2020; Wu, 2020). This competition will naturally include vying for “technological advantage,” especially with emerging technologies like those enabled by the brain projects, in order to avoid technological surprise (U.S. Department of Defense, 2018). Future work may consider other nations.

Adoption of truly emerging technologies, like neurotechnologies, in both the commercial and military sectors is likely to have important implications for the strategic competition between nations, making them important case studies for predicting neurotechnology adoption likelihood. In economics, the advantages and disadvantages of being a “first mover” or a “fast follower” in new markets have been discussed (Kerin et al., 1992; Wunker, 2012). Most of these studies highlight very large advantages for first movers in emerging industries, while also acknowledging that fast followers occasionally benefit by learning from any missteps that a first mover makes or by capitalizing on the high development costs that first movers incur. This terminology has also been applied to nations’ strategies toward emerging technologies. Notably, the U.S. National Security Commission on Artificial Intelligence highlighted in its 2021 report “the first-mover advantage of developing and deploying technologies like microelectronics, biotechnology, and quantum computing” (National Security Commission on Artificial Intelligence, 2021). Being a first mover would enable a nation’s private industry to capitalize on a growing market for neurotechnologies. It may also allow that first-mover nation to play a lead role in setting the ethical and legal norms internationally for these devices. Therefore, being able to anticipate which nation is most likely to be a first mover and see widespread technology development and dissemination is crucial to understanding the national and international security landscape. Additionally, the pursuit of new capabilities, including first acquisition and deployment, has been studied as part of the arms race literature (Evangelista, 1989; Gray, 1974; Hundley, 1999).

This research looks at public funding for neuroscience research and development (R&D) through the U.S. BRAIN Initiative and the China Brain Project. Though other sources of funding for neuroscience are present in both nations, we chose to analyze these brain projects because of their importance as an articulation of national strategy for neuroscience research that could enable earlier adoption. Their stated goals and directed funding, which involves stakeholders from government, academia, military, and industry, can be viewed as a cohesive national strategy for identifying top-priority areas of neuroscience research and determining how findings from this research will be translated into new technologies. Each project also has components to enable translational research—that is, moving from basic research through applied R&D. Additionally, the brain projects serve as a diplomatic interface for international collaboration by fostering data inventorying and sharing, as well as promoting consensus on international norms for ethical use of these technologies (International Brain Initiative, n.d.).

Focusing on one emerging neurotechnology, instead of a broad investigation of many neurotechnologies, enables more robust analysis of the sociocultural, governmental, and economic influences that could drive adoption of a specific dual-use technology. BCIs are a dual-use technology directly enabled by brain project funding that both the United States and China have a stated interest in developing for commercial and military applications. These devices have the potential for high adoption by healthy people for both civilian and military purposes, and they are already available on the market (Emondi, n.d.; Farnsworth, 2017). Additionally, these devices may raise significant or even profound ethical concerns involving data privacy and individual autonomy (Global Neuroethics Summit Delegates et al., 2018; Moreno, 2003). Likely for these reasons, the U.S. Congressional Research Service (2021) identified BCIs as an emerging technology that should be considered for export controls to nations like China. BCIs fit the definition of dual-use technology in two ways: BCIs can be used for both civilian and military purposes, and BCI technologies intended for civilian use could be co-opted for malicious or deleterious misuse.

Two hypotheses are proposed and tested in order to better understand how this emerging technology is likely to be operationalized:

H1: The United States will adopt BCI technologies for commercial and military use before China if national innovation systems, amount of brain project funding, and current BCI market share are more predictive of BCI technology adoption.

H2: China will adopt BCI technologies for commercial and military use before the United States if government structure, brain project and military goals, sociocultural norms, and research monkey resources are more predictive of BCI technology adoption.

Literature review

While other papers have addressed the potential commercial and military applications of BCIs, none has attempted to predict or compare the likelihood of adoption of these devices. The ability to make predictions about adoption likelihood in the United States and other countries has been identified as a priority by the U.S. Army Combat Capabilities Development Command (DEVCOM), since so few studies have been conducted to assess attitudes toward BCIs in both the public and military sectors (Emanuel et al., 2019). These devices, which can connect human and machine intelligence, have the potential to shape society and change the nature of warfare.

In the last decade or so, a small but significant body of literature has emerged on the intersection of advances in the cognitive and neurosciences and conflict across multiple disciplines. A significant portion of the studies and policy literature on cognitive sciences research is concerned with the ethics of such research. The ethical concerns raised largely fall into two areas of debate—(1) human enhancement and (2) thought privacy and autonomy—both of which are relevant to the current security research on cognition. The issue of cognitive enhancement has been a contested area of research (Parens, 2006). While some embrace the potential of “neuropharmaceuticals” and advocate industry’s self-regulation, others have raised concerns about the potential for political inequality and the possible disruption of natural physiological processes (Fukuyama, 2003; Gazzaniga, 2005; Hitchens, 2021; Naam, 2005). The issue of thought privacy often emanates from advancements in noninvasive imaging and stimulation techniques used for neurological research. Many of the concerns relate to how such advancements could be used for lie detection and interrogation, including applications for domestic or foreign intelligence (Canli et al., 2007; Wild, 2005).

These ethical discussions have also extended to military R&D in the cognitive sciences. What role, if any, neuroscience research should play in national security and how its impact should be understood have been hotly debated (Evans, 2021; Krishnan, 2018; Marcus, 2002; Moreno, 2006; Munyon, 2018; Tracey & Flower, 2014). Some have advocated against the inclusion and use of neuroscience techniques for national security purposes, while others justify the defense and intelligence community’s involvement as a means of maintaining the superpower status of the United States (Giordano et al., 2010; Rippon & Senior, 2010; Rosenberg & Gehrie, 2007). Still, some have contended that neuroethics must be considered in national security discussions, while others have advocated that the security use of neuroscience research is best framed as a human rights concern (Justo & Erazun, 2007; Lunstroth & Goldman, 2007; Marks, 2010). In discussing classified research on brain imaging, an open dialogue between scientists and government officials has been called for (Resnik, 2007). The place and role of neuroscience research and neurotechnology in security policy remain contested, and there is as yet no security or ethical framework, acceptable to all parties, under which such research can be analyzed.

Another major area of social science work, often empirical and positivist in nature (rather than normative), approaches neuroscience and neurotechnology through technology surveys or qualitative case studies to probe its implications, including national security implications and approaches to reducing risk (Binnendijk et al., 2020; DeFranco et al., 2019; Huang & Kosal, 2008; Rachna & Agrawal, 2018; Royal Society, 2012). Some such studies are more descriptive and others more systematic, depending on the researcher and the scholarly discipline, ranging from political science and international relations to public policy and science and technology studies. Many, but not all, consider neurotechnologies in the context of chemical and biological weapons, their respective international legal bodies, and biosecurity policy (Dando, 2007; DiEuliis & Giordano, 2017; Nixdorff et al., 2018).

In addition to the earlier discussion on the theoretical approaches to the relationships between technology and conflict more generally, there are specific theoretical questions that apply to the realm of neuroscience in the context of military technology, strategic decision-making, and conflict. How advances in the cognitive neurosciences may undermine rational actor theory, a core component of nuclear deterrence theory, has been theorized (Stein, 2009). Other work has considered how scientists and neuroscientists, the very people whose research is in the spotlight, understand and view the security risks and potential consequences of such research and its implications for international security and governance (Kosal & Huang, 2015). This work adds to that systematic body of literature in the social sciences.

Technical background

BCIs can be categorized by their capabilities and by their applications (see Figure 1). BCIs can be distinguished in their capabilities by whether they can “read” brain activity, “write” brain activity, or both, and by whether they are invasive or noninvasive. Three applications are discussed here: brain-computer, brain-computer-device, and brain-computer-brain. Each of these applications can be invasive or noninvasive. While read functions are all that is necessary for brain-computer-device and brain-computer technologies, write functions are necessary for brain-computer-brain technologies.

Figure 1. Three dual-use applications of BCIs (brain-computer, brain-computer-device, and brain-computer-brain technologies) can be categorized by their level of invasiveness and their functional ability to read or write brain activity. The symbols for each technology are shown in the category that corresponds to the theoretical minimum functionality and invasiveness required to use them, while the shaded areas demonstrate the other categories these technologies fall under. Greater invasiveness generally leads to greater data fidelity/interpretability. The ability to write brain activity noninvasively is an active area of research.



The ability to read brain activity involves being able to capture the electrical activity of the brain and interpret the information contained in that electrical activity. Capturing the electrical activity of the brain involves electrodes and can occur on different spatial (centimeters to micrometers) and temporal (seconds to sub-milliseconds) scales (Nicolas-Alonso & Gomez-Gil, 2012). Electrically active cells in the brain called neurons are the fundamental unit of brain computation; they produce action potentials, or “spikes,” in electrical activity, which carry information (Gerstner et al., 1997). Electrical activity is interpreted using some form of model or decoder to extract the information carried, such as recurrent neural networks that can predict intended movements from neural activity (Pandarinath et al., 2018). Different regions of the brain serve known functions, so placing a device in a specific brain region obtains different types of information. For example, the motor cortex is a region of the brain that contains information about intended movements, and most cortical neuroprosthetic devices record from this region (Bensmaia & Miller, 2014).
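As a concrete illustration of the data-driven decoding described above, the following sketch fits a simple linear decoder on synthetic firing rates. The tuning model, neuron counts, and noise levels are all hypothetical, chosen only for illustration; real BCI decoders, such as the recurrent networks cited above, are far more sophisticated.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical setup: 40 motor-cortex neurons, each with a random "preferred
# direction," so that its firing rate varies with intended 2-D movement.
n_neurons, n_trials = 40, 500
preferred = rng.standard_normal((n_neurons, 2))    # tuning weights per neuron
intended = rng.standard_normal((n_trials, 2))      # intended (x, y) velocities
rates = intended @ preferred.T + 0.1 * rng.standard_normal((n_trials, n_neurons))

# Fit a linear decoder (least squares) mapping firing rates -> intended movement.
W, *_ = np.linalg.lstsq(rates, intended, rcond=None)

# Decode a new observation: the estimate should be close to the true velocity.
true_vel = np.array([1.0, -0.5])
obs = true_vel @ preferred.T + 0.1 * rng.standard_normal(n_neurons)
decoded = obs @ W
print(decoded)
```

The same read-out structure (observed activity multiplied by a fitted weight matrix) underlies many practical decoders, with the linear fit replaced by more expressive models when movements are complex.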

The ability to write brain activity involves evoking electrical activity in neurons. This can be done with a variety of techniques, including magnetic, optical, and electrical stimulation paradigms (DARPA, 2019). Most techniques currently used in research and clinical settings involve electrical stimulation via electrodes (Lozano et al., 2019). Electrodes are bidirectional in that they serve as an electrical contact between the brain and computer. Electrodes can both read activity by recording neurons and write activity by generating an electrical current that stimulates the neurons to produce spikes. Because we have limited principled understanding of how the brain structures information in single neurons and circuits of connected neurons, writing activity that translates into the desired outcomes of the stimulation is much more difficult than reading and interpreting activity, which can be accomplished using data-driven approaches (Jonas & Kording, 2017).

Level of invasiveness

BCIs can be either invasive or noninvasive. Invasive devices require surgery to implant beneath the skull (intracranially). They utilize electrodes to read and write brain activity at finer scales than noninvasive devices (Ramadan & Vasilakos, 2017). Invasive BCI devices include the following:

  • Single-unit recordings obtained from micro-electrodes read spikes from many single neurons simultaneously. These devices typically use the spike rate (the number of spikes in a given window of time) of many related neurons to determine meaningful information on what the brain is computing. To obtain recordings from single neurons, the electrodes used for these devices must penetrate the cortex. These devices are considered the best for obtaining high information quality, but their weakness is a lack of long-term viability due to degradation of the recording quality over time via neural scarring (Salatino et al., 2017).

  • Depth local field potential (LFP) recordings are obtained using larger electrodes. Instead of reading spikes, they capture the sum of electrical activity from many neurons on a coarser temporal scale. These devices also penetrate the cortex, but they do not require isolating single neurons. One of the benefits of these devices is that they are less prone to day-to-day variability in recording quality than single-unit recordings (Heldman & Moran, 2020). However, their invasiveness still makes them prone to degradation around the recording site.

  • Electrocorticography recordings are obtained from the surface of the brain (the cortex) and capture broad electrical activity from many neurons at once. In contrast with single-unit recordings or depth LFPs, these do not capture individual spikes from single neurons, but rather the LFPs generated at the surface of the brain, with a temporal resolution too coarse to capture individual spikes. The spatial resolution of these devices is also coarser-grained, but they can capture a larger region of the brain. Finally, they are less invasive than single-unit recordings or depth LFPs because they do not penetrate the cortex, and they may be more stable over longer periods (Sauter-Starace et al., 2019). These electrodes can also be used to write brain activity and are currently used for this purpose in clinical settings (Caldwell et al., 2019).
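The spike rate used by single-unit devices (the number of spikes in a given window of time) can be illustrated with a short binning computation; the spike times below are invented for illustration.

```python
import numpy as np

# Hypothetical spike times (seconds) from one neuron over a 1-second recording.
spike_times = np.array([0.013, 0.074, 0.112, 0.340, 0.355, 0.602, 0.871, 0.940])

# Bin into 100 ms windows and convert counts to rates (spikes per second).
bin_width = 0.1
edges = np.arange(0.0, 1.0 + bin_width, bin_width)
counts, _ = np.histogram(spike_times, bins=edges)
rates = counts / bin_width

print(counts.tolist())   # spikes in each 100 ms bin
print(rates.mean())      # overall firing rate in Hz
```

A decoder would typically take such binned rates from many neurons at once as its input, as in the decoding sketch earlier in this section.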

Currently, only noninvasive devices are available commercially for nonmedical purposes. These devices sacrifice information quality for ease of use. They are packaged in the form of electroencephalography (EEG) headsets with a few large electrodes that record information from the surface of the scalp. They require no surgery and can be removed easily. However, they are incredibly noisy, with signals that are corrupted by head movements and eye blinks, making it difficult to read brain activity. Despite these shortcomings, EEG is also used in clinical and therapeutic settings for important procedures such as diagnosing epilepsy (Tatum et al., 2018). Medical EEG devices tend to have better data quality than commercial EEG devices (Ratti et al., 2017). Additionally, EEG headsets available on the market do not currently provide the ability to write brain activity. The ability to write brain activity with a noninvasive device remains an active area of research (Polania et al., 2018). These noninvasive write devices could include the use of acoustic, optical, and electromagnetic techniques that induce electrical activity in localized areas of the brain from outside the skull (DARPA, 2019).
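To illustrate why raw EEG is hard to read and why artifact removal matters, the sketch below band-pass filters a synthetic single-channel signal in which a 10 Hz alpha rhythm is buried under a large slow drift (a stand-in for head-movement artifact). All signal parameters are invented for illustration, not taken from any real device.

```python
import numpy as np
from scipy.signal import butter, sosfiltfilt

fs = 250                                   # sampling rate (Hz)
t = np.arange(0, 4, 1 / fs)

# Synthetic single-channel "EEG": a 10 Hz alpha rhythm buried under a large
# slow drift (movement artifact stand-in) and broadband noise.
alpha = 1.0 * np.sin(2 * np.pi * 10 * t)
drift = 20.0 * np.sin(2 * np.pi * 0.3 * t)
noise = 0.5 * np.random.default_rng(1).standard_normal(t.size)
raw = alpha + drift + noise

# 4th-order Butterworth band-pass for the 8-13 Hz alpha band, applied forward
# and backward (zero phase distortion) in numerically stable sos form.
sos = butter(4, [8, 13], btype="bandpass", fs=fs, output="sos")
cleaned = sosfiltfilt(sos, raw)

# The drift dominates the raw trace but is largely removed after filtering.
print(raw.std(), cleaned.std())
```

This kind of band-limited filtering is a standard first step in EEG pipelines; it cannot, however, recover the fine-grained single-neuron information that invasive recordings provide.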


To begin a discussion of the applications of BCI technologies described earlier, we present two illustrative scenarios: one that considers how a relevant past military engagement may have been different if BCIs had been utilized, and another that considers a relevant future scenario in which the dual-use properties of BCIs could cause a security incident. Construction and presentation of formalized scenarios are contemporary tools widely utilized in the military, business, government policy planning processes, and international relations (Barma et al., Reference Barma, Durbin, Lorber and Whitlark2016; Bishop et al., Reference Bishop, Hines and Collins2007; Schwartz, Reference Schwartz1996). Scenarios are not intended to be predictive but are used to illustrate potential futures for planning or other analytical purposes.

Scenario #1

In the mid-2000s, BCIs are already a fully developed, deployable technology used by military personnel. There is a major offensive against a densely populated city controlled by insurgents. The goal of the operation is to take control of the city from the insurgents while minimizing civilian casualties. Military units with enhanced personnel lead an assault on the city. Personnel with visual enhancement BCIs discriminate between insurgents and civilians using artificial intelligence (AI) identification that presents an overlayed sensory indicator on their visual field. Other personnel use auditory-enhancement BCIs that allow them to receive real-time translations of the languages being used by insurgents and civilians, providing actionable intelligence. The insurgents have left improvised explosive devices and other incendiary traps at key locations in the city. Personnel with BCIs defuse these devices using a highly dexterous robotic arm that can be controlled remotely, avoiding both the risk of detonating a device near the unit and location detection by the insurgents, while also allowing for simultaneous control of a firearm. Finally, these enhanced units can communicate telepathically with each other via brain-to-brain communication enabled by BCIs, relaying important battlefield information silently. These enhanced units can conduct reconnaissance, clear deadly devices from the streets, and effectively minimize civilian casualties. This results in fewer personnel casualties during the overall offensive.

Scenario #2

The year is 2040. BCIs are widely used by both civilians and military personnel for routine computing and control tasks in developed nations. BCIs are used for piloting drones by air forces, providing a competitive advantage in reaction time and sensory processing speed over nations that use manual controls. These BCIs are noninvasive and have strict cybersecurity and data privacy protocols to ensure they are not prone to cyberattack. Additionally, invasive BCIs used in a civilian context can control devices as well as alter mood and focus, but they have fewer security measures in place than military BCIs. Two great-power competitors are involved in a territorial dispute over an island nation with rare earth metal deposits and an advanced semiconductor manufacturer. Nation A has a military base on the island and a treaty with the island nation, while Nation B seeks regional hegemony and a greater sphere of influence through control of the island. Drones routinely engage in air-to-air combat over airspace disputes around the island, potentially endangering Nation A’s crewed planes carrying personnel and supplies to the island. Nation A has more advanced BCIs and is able to protect its crewed planes and cause significant financial loss to Nation B by destroying its drones in air-to-air combat. However, during a recent engagement, Nation A’s drone pilots report nausea, drowsiness, inattention, and heightened stress. The mental state of the pilots causes significant drone losses and the downing of a crewed plane, sparking an international incident, since air force casualties in combat are now a rarity. It is later discovered that Nation B exploited the weaker security protocols of civilian BCI devices used by Nation A’s military personnel and conducted a targeted attack on the drone pilots’ mental state and mood during the air-to-air engagement through these devices.

Applications of BCIs

BCI devices can be used for many different purposes, but they fall into three broad categories of applications, which are illustrated in these two scenarios.

The potential civilian and military applications of BCI for human enhancement will affect the international security environment. A more thorough treatment of the security implications of dual-use BCIs can be found in DEVCOM’s Cyborg Soldier 2050 study; in brief, these technologies can serve as both measures and countermeasures in military contexts, as well as complicate the national security landscape in civilian contexts (Emanuel et al., Reference Emanuel, Walper, DiEuliis, Klein, Petro and Giordano2019). BCIs could be used by military personnel to enhance visual processing, reaction time, attention, and mental states, all of which would provide a significant cognitive advantage over competitors without BCIs or with less advanced BCIs. However, BCIs are also vulnerable to countermeasures like cyberattacks, raising concerns about data privacy or even the ability to attack the brain through BCIs with write capabilities. BCIs used in civilian contexts are also vulnerable to countermeasures, a vulnerability that carries broader security implications than even their military use entails.

Additionally, BCIs have proposed applications within the intelligence community. For example, a BCI that can read brain activity could be utilized to detect whether a person is being deceptive during interrogation, while also monitoring other brain states like attention and stress (Gherman & Zander, Reference Gherman and Zander2021; Lin et al., Reference Lin, Sai and Yuan2018). Proponents of this use say that it would provide a unique advantage over other deception detection methods like the polygraph in both accuracy and multifunctionality. Defensive counterintelligence measures would also be necessary if BCIs are utilized in military and intelligence operations, or if foreign intelligence services seek to attack BCI vulnerabilities in the manner described earlier.


Independent variables associated with BCI technology adoption were identified that may indicate whether the United States or China will adopt BCI technologies first. These variables address separate factors that affect adoption of emerging technologies (Bussell, Reference Bussell2011; Corrales & Westhoff, Reference Corrales and Westhoff2006; Milner, Reference Milner2006; Rotolo et al., Reference Rotolo, Hicks and Martin2015; Schmid & Huang, Reference Schmid and Huang2017; Taylor, Reference Taylor2016): (1) the technological capacity of a state and (2) cultural and social characteristics.

Most variables examined here can be measured quantitatively, though important qualitative variables include government structure, national innovation systems, and stated brain project and military goals. One natural consequence of attempting to understand a future event is that a model of how indicators map onto the dependent variable of widespread BCI adoption technology cannot be constructed based on data. To address the question of individuals’ likelihood of using a technology, previous studies have mapped sociocultural scores and other indicators onto technology adoption likelihood, but these studies investigated other technologies such as mobile phones and IT technologies that have some differences from BCIs and other neurotechnologies (LaBrie et al., Reference LaBrie, Steinke, Li and Cazier2017; Lee et al., Reference Lee, Trimi and Kim2013).

To support a hypothesis of earlier adoption in either the United States or China based on the identified indicators, reports of BCI use in both commercial and military settings in the United States and China were identified as early proxies for adoption likelihood. Therefore, the methodology of this article can be divided into (1) developing hypotheses that would support earlier adoption in either the United States or China and (2) using the real-world examples of BCI use as a dependent variable to determine which hypothesis is more likely and therefore which indicators are more important for BCI technology adoption. In this section, definitions are provided for each of the indicators of BCI adoption identified and analyzed. We also discuss why each indicator is an important factor to consider and why these variables were included while others were not.

If government structure, sociocultural norms, and stated goals are more predictive of BCI adoption, China will be the first adopter of BCIs in both commercial and military contexts. However, if BCI adoption is predicted by public funding and market share, then the United States will be the first adopter of BCIs. Here, we support the hypothesis that China will be the first widespread adopter of BCIs.

Qualitative variables

Government structure

Government structure, whether democratic-republic, autocratic rule, or another form, can affect BCI adoption because of its effect on the ties between the military, industry, academia, and government. Government structure can also strongly affect how new technologies are implemented for defense purposes.

National innovation systems

National innovation systems have been defined as “the network of institutions in the public and private sectors whose activities and interactions initiate, import, modify, and diffuse new technologies” (Freeman, Reference Freeman1987, p. 17). This definition emphasizes the linkages between the many important institutions within a nation that drive innovation and technology development. This is an important indicator of BCI adoption because of the way it affects support for R&D and for the companies that will market BCIs. While government structure can strongly affect national innovation systems, it is not the only determining factor. Additionally, government’s role in technology adoption can be separate from its role in technology innovation. Therefore, government structure and national innovation systems are considered separately here. A previous analysis of the national innovation systems in the United States and China by Melaas and Zhang (Reference Melaas and Zhang2016) is used here as an indicator of BCI technology adoption.

Brain initiative and military goals

This study emphasizes government investment in the major brain research projects rather than investment in general or other neuroscience research, because the goals of the U.S. BRAIN Initiative and the China Brain Project are a coherent articulation of a national strategy for neuroscience research that involves all major stakeholders in government, academia, military, and industry. The overall goals of these brain projects are related to the way they affect BCI research and adoption, and we look for specific mentions of BCI technologies. In addition to the emphasis on brain project rhetoric, military documents from both countries were reviewed to provide additional insight into perceived military use cases for BCIs.

Quantitative variables

Sociocultural norms

Sociocultural norms can be treated as quantitative variables, and various frameworks have been proposed for doing so. One of the most widely used is Hofstede’s cultural dimensions (see Table 1 and Figure 2, adapted from LaBrie et al., Reference LaBrie, Steinke, Li and Cazier2017) (Hofstede & Bond, Reference Hofstede and Bond1984; LaBrie et al., Reference LaBrie, Steinke, Li and Cazier2017). This framework gives a quantitative approach to describing culture and is based on surveys of IBM employees and other populations (Hofstede et al., Reference Hofstede, Hofstede and Minkov2010). Additionally, these variables have been used by others to predict and describe technology adoption of information systems, mobile devices, and big data analytics (LaBrie et al., Reference LaBrie, Steinke, Li and Cazier2017; Lee et al., Reference Lee, Trimi and Kim2013; Srite, Reference Srite2006).

Table 1. Hofstede’s cultural dimensions.

Figure 2. A comparison of Hofstede’s cultural dimensions for the United States and China; adapted from LaBrie et al. (Reference LaBrie, Steinke, Li and Cazier2017).

Brain initiative funding

The budgets for both the United States’ and China’s brain projects are available online. While this variable does not include breakdowns for money specifically targeted at BCI development between the two countries, it gives a general sense of the level of investment in neuroscience and neurotechnologies. Additionally, this variable quantifies the actual investment of each nation to achieve its stated goals and national strategy for neuroscience research. In the context of the stated goals for each brain project, this variable demonstrates whether a gap exists between rhetoric and real investment.

Access to data on the budget of the U.S. BRAIN Initiative and the projects it has funded was much more open than data on the China Brain Project. One of the difficulties was an inability to compare funding within these projects specifically dedicated to BCIs. While these data were available for the U.S. BRAIN Initiative, they were not available for the China Brain Project (Dimensions, 2021).

Funding for neuroscience research outside the brain projects is not included here as another variable. The brain projects are an articulation of a national strategy for neuroscience research and are meant to propel the neurotechnology sector of both nations far more than disparate research funding through other sources. While military funding for neuroscience research has been significant in the United States, estimated in the hundreds of millions of dollars across multiple funders, including the Defense Advanced Research Projects Agency (DARPA) and the military service branches, it is still believed that most military adoption of BCIs will occur after, not prior to, civilian adoption, which would be supported more by the clinical and commercial neurotechnologies whose development is made possible through the brain projects (Emanuel et al., Reference Emanuel, Walper, DiEuliis, Klein, Petro and Giordano2019; Kosal & Huang, Reference Kosal and Huang2015).

Number of patents

The number of BCI patents filed in each country since 2010 was determined using Google Patents, specifically only including patents with the keyword “brain-computer interface.” Patents are one way to measure innovation in an industry, but differences in the requirements for patent filing in different countries can complicate interpretations of this variable (Shambaugh et al., Reference Shambaugh, Nunn and Portman2017).

Number of research monkeys

Access to research monkeys is an important and novel variable for predicting BCI adoption because nonhuman primates are used extensively in BCI R&D. While noninvasive devices like EEG headsets with read-only capabilities can skip animal trials and proceed to human trials, any device that is invasive or that tests write capabilities must use an animal model (Li & Zhao, Reference Li and Zhao2019). Additionally, one of the lead scientists of the China Brain Project has placed a strong emphasis on developing research monkey colonies (Poo, Reference Poo2016).

Access to and use of research monkeys was assessed using statistics from the U.S. Department of Agriculture, the Chinese Experimental Monkey Breeding Association, and the European Commission on the number of monkeys in colonies, the number used in biomedical research, and the number imported for research purposes. It was difficult to find standard ways of assessing the number of monkeys used for research and the number of monkeys used specifically for BCI research, but the numbers reported give vital information about the supply of this necessary resource.

BCI market share

Market research was used to obtain information about BCI market share between North America and the Asia-Pacific region (Grand View Research, 2020). Detailed information regarding China was not available publicly.

Results and discussion

H1: The United States will adopt BCI technologies for commercial and military use before China if national innovation systems, amount of brain project funding, and current BCI market share are more predictive of BCI technology adoption

According to Melaas and Zhang (Reference Melaas and Zhang2016), the United States and China share some similarities in their national innovation systems, but the United States has a greater capacity to support the basic R&D necessary to generate new technologies like BCIs. The United States’ national innovation system is more “fully integrated,” but also more decentralized than the Chinese national innovation system. Notably, China is a transition economy without “mature private capital markets”; therefore, R&D for technologies is mostly funded through the public sector. This results in a strategy in which China has sought to import technologies developed abroad and focused on developing the manufacturing capability to produce them cheaply. These characteristics support the hypothesis that the United States will be the first to develop marketable BCI technologies, though China may later co-opt these technologies, which is supported by current market share reports discussed later.

The amount of projected funding for the U.S. BRAIN Initiative is much larger than the projected funding for the China Brain Project (Table 2). This demonstrates a larger financial commitment by the government toward basic R&D necessary for BCI neurotechnologies. This again reflects a difference in the robustness of public funding for basic R&D that is demonstrated by the two countries’ national innovation systems. The amount of funding for either project may change. To date, the U.S. government has funded US$1.9 billion in research grants under the U.S. BRAIN Initiative, which aligns with the budget proposed when the initiative was first conceived (Dimensions, 2021; National Institutes of Health, 2014). Approximately US$21.2 million of these research grants are directly related to BCIs or brain-machine interfaces (BMIs). Obtaining data on how much has been spent to date on the China Brain Project is more difficult. Therefore, a direct comparison of research grants for BCI or BMI technologies under the U.S. BRAIN Initiative and China Brain Project cannot be made with detailed precision. In 2018, the China Institute for Brain Research was officially opened with plans to support 150 principal investigators as one of the “first concrete developments” of the China Brain Project (Cyranoski, Reference Cyranoski2018).

Table 2. Summary of qualitative and quantitative indicators of BCI adoption.

The U.S. BRAIN Initiative’s budget of US$6 billion during its lifetime represents only 5% of the NIH’s total budget for brain-related research (National Institutes of Health, 2014). Additional funding for brain research in the United States comes directly from defense agencies and military services. It was difficult to find numbers to indicate China’s total investment in brain-related research, but the US$1 billion expected to be invested in the China Brain Project is still considerable. Additionally, China is likely still in the ramp-up stage of its project. Whereas the U.S. BRAIN Initiative started in 2013, the China Brain Project was first formalized in 2016, three years later. Other sources of funding like private investment are also not considered here but are likely to be important.

North America currently has the largest market share for BCIs. Around 40% of the total revenue from BCI technologies globally was generated in North America in 2019 (Grand View Research, 2020). Most of this revenue was generated by medical applications of BCI technologies, but significant portions of the market were driven by commercial and military technologies. When combined, commercial and military BCI technologies accounted for more than half the global market share, surpassing medical BCI technologies. However, in the next decade, market research suggests that the BCI market will see the most growth in the Asia-Pacific region because of “low-cost manufacturing sites and favorable taxation policies” (Grand View Research, 2020).

The current large market share held by North America is indicative of a robust private sector that will aid commercial adoption of BCI technologies. Many of the products marketed by the top companies cited in the market report are noninvasive, read-only technologies. Acceptance and adoption of these technologies will likely occur before the marketing and adoption of technologies that allow write capabilities or brain-to-brain communication because of a continued need for R&D of these more complex technologies and the need to overcome general distrust of BCI technologies (Emanuel et al., Reference Emanuel, Walper, DiEuliis, Klein, Petro and Giordano2019; Google, n.d.).

Surprisingly, although the United States has better R&D capabilities and North America has a larger market share, the numbers of patents related to BCI technologies filed in the United States and China since 2010 are remarkably similar, with a gap of approximately 5,000 patents favoring the United States (Table 2). However, it is difficult to assess the quality of these patents in the two countries. It is possible that better-quality patents are filed in the United States since there is better support for R&D there and a stronger market for BCI in North America. Additionally, of the top five BCI companies identified by market research, four are headquartered in the United States. The only top BCI company not headquartered in the United States is in Australia (Grand View Research, 2020). None of these key players are in China, though some Chinese state-owned enterprises have developed BCI technologies for general public use, such as the “Brain Talker,” which is a computer chip designed for reading brain activity to interpret “mental intent” (Yin, Reference Yin2019).

It is important to note that the numbers of patents filed in the United States and China are considerably higher than in other nations with brain projects. The EU’s patent office has the third-highest number of patents filed for any international actor with a brain project, and the number is nearly a third of that filed in the United States or China (27,720 patents) (Google, n.d.). The EU is projected to spend US$1.2 billion for its Human Brain Project, which is similar to China’s projected spending, but China far outpaces the EU in the number of patents filed (Global Neuroethics Summit Delegates et al., Reference Rommelfanger, Jeong, Ema, Fukushi, Kasai and Singh2018).

The indicators that favor the United States’ earlier adoption of BCIs over China—national innovation systems, brain project funding, and BCI market share—all address the R&D capacity of both nations. Because the United States has a more robust national innovation system that supports the generation of new technologies, higher public brain project funding, and a greater share of the current market, the United States could generate BCI technologies faster than China and market them domestically, facilitating widespread adoption first in the commercial and then in the military sector. One important caveat to note here is that China is beginning to rapidly ramp up its public sector R&D funding to better compete with the United States (Hourihan, Reference Hourihan2020). While the United States has the advantage currently, it is possible this advantage may begin to erode in the next decade.

H2: China will adopt BCI technologies for commercial and military use before the United States if government structure, brain project and military goals, sociocultural norms, and research monkey resources are more predictive of BCI technology adoption

The United States and China are obvious foils in their government structure, with one being a federal republic and the other being a Communist party-led state. One consequence of this is that China benefits from a continuity of objectives both in statecraft and national defense since power does not change hands with elections. Additionally, China blurs the lines between its civilian and military sectors, which could eventually allow for faster defense acquisition of dual-use technologies (U.S. Department of Defense, 2020). It is also easier for China to mandate the research, development, and adoption of technologies. The U.S. BRAIN Initiative receives funding from and works in partnership with national security-focused agencies like DARPA and the Intelligence Advanced Research Projects Agency (IARPA), while the initiation of the China Brain Project was also closely tied to the government’s overall “Five-Year Plan” for 2016–2020 (Central Compilation and Translation Bureau, 2016).

The China Brain Project’s stated goals place a greater emphasis on brain-machine technologies like BCIs than those of the U.S. BRAIN Initiative. The U.S. BRAIN Initiative’s seven major goals only relate to understanding the brain and improving treatment of brain disorders, and they are focused on developing technologies that enable basic research and clinical applications (National Institutes of Health, 2014). The China Brain Project’s structure is envisioned as “one body two wings”: a core body of understanding the brain, with an equal emphasis on its two wings of application, treating brain disorders and developing brain-machine intelligence technologies (Poo et al., Reference Poo, Du, Ip, Xiong, Xu and Tan2016).

The China Brain Project puts an equal emphasis on clinical and nonclinical applications of brain research, and it specifically emphasizes integrating brain and machine intelligence much more than the messaging from the U.S. BRAIN Initiative. Dr. Mu-Ming Poo, a leading scientist of the China Brain Project, has written about his belief that a better understanding of the brain will revolutionize AI technologies and his expectation that China will accelerate “development of next-generation AI with human-like intelligence and brain-machine interface technology” (Poo, Reference Poo2018). This stance favors BCI technology dissemination and adoption in China because the China Brain Project has placed a much greater emphasis on BCIs as a top priority.

China also exhibits greater alignment between the stated goals of its brain project and stated military goals. The goals of the U.S. BRAIN Initiative and China Brain Project can be viewed as a high-level articulation of a national strategy for neuroscience research, giving insight into how BCI technologies may be disseminated from clinical research to both commercial and military applications. Rhetoric and stated goals in national defense policy demonstrate whether BCIs are being prioritized specifically for defense or military applications. Greater alignment between the national strategy for brain research as articulated by the brain projects and defense emphasis on BCIs will likely enable quicker BCI adoption in the military sector.

Striking similarities emerge when examining the rhetoric of the Chinese People’s Liberation Army (PLA) and the China Brain Project. The director of the Central Military Commission Science and Technology Commission in China has said “[t]he combination of artificial intelligence and human intelligence can achieve the optimum, and human-machine hybrid intelligence will be the highest form of future intelligence” (Kania, Reference Kania2020). The China Brain Project has identified its two important applications of basic brain research (in its “one body two wings” framework): medical applications for treating brain disorders and brain-machine intelligence technologies like BCIs (Poo et al., Reference Poo, Du, Ip, Xiong, Xu and Tan2016). Both PLA strategists and the heads of the China Brain Project cite artificial intelligence, biological intelligence, and hybrid intelligence as key areas to promote technology development.

This contrasts strongly with the U.S. BRAIN Initiative, in which the high-level goals are limited to basic research and clinical outcomes (translational research) and do not include an emphasis on applications for the commercial and military sectors, though those may be outcomes. The U.S. BRAIN Initiative is also not devoid of ties to the U.S. defense and intelligence communities, since it partners with both DARPA and IARPA (National Institutes of Health, n.d.a). Both the PLA in China and the Department of Defense in the United States have emphasized AI and human-machine teaming as important technologies for future warfare (Binnendijk et al., Reference Binnendijk, Marler and Bartels2020; Kania, Reference Kania2020). The U.S. military has invested in the development of neurotechnologies, and BCIs specifically, through programs at DARPA and through the different military services (DARPA, n.d.; Kosal & Huang, Reference Kosal and Huang2015). However, the lack of goals within the U.S. BRAIN Initiative directly supporting BCIs and commercial applications of neurotechnologies demonstrates less coordination and coherence between a defense strategy for neurotechnology and the U.S. BRAIN Initiative, which represents the most aggressive neuroscience research drive in the United States. A more aligned articulation of technology dissemination goals exists between the China Brain Project and the country’s defense arm—the PLA—in contrast with the mainly medically focused goals of the U.S. BRAIN Initiative.

Differences in sociocultural norms in the United States and China also favor earlier BCI adoption in China, because a lower emphasis on individualism and a greater responsiveness to social pressure are strong drivers of technology adoption. Hofstede’s cultural dimensions have been used to predict technology adoption for other computer technologies, mobile phones, and mobile commerce. While the United States and China have comparable masculinity (MAS) and uncertainty avoidance (UAI) scores, differences are found in their individualism (IDV), power distance (PDI), long-term orientation (LTO), and indulgence (IND) scores (Hofstede Insights, Reference Insightsn.d.). The United States is an individualistic culture, whereas China is a collectivist culture. Additionally, China scores high on acceptance of power differentials in society (PDI) and has a longer-term outlook that makes individuals more adaptable and pragmatic, supporting structures like the government that provide stability (LTO). Finally, individuals in the United States tend to be more indulgent, while individuals in China show more restraint (IND). While IND scores have not been strongly tied to technology adoption trends, differences in IDV, PDI, and LTO scores could drive earlier BCI adoption in China.

One of the potential barriers to the adoption of BCIs identified by U.S. military experts was distrust by service members (Binnendijk et al., Reference Binnendijk, Marler and Bartels2020). In civilian populations, 69% of U.S. respondents to a Pew Research Center survey on human enhancements said they are worried by the idea of BCI technologies, and 66% of those surveyed claimed they would not want to use BCI technologies to enhance their brain (Funk et al., Reference Funk, Kennedy and Sciupac2016). This survey reported similar results for other human enhancements like synthetic blood for improved stamina and gene editing to reduce disease risk. This wariness is predicted by Hofstede’s cultural scores. For mobile devices, it has been shown that during the development of new technologies, cultures similar to Chinese culture (low IDV, high LTO) see more rapid widespread adoption, and cultures like American culture (high IDV, low LTO) will lag behind in adoption (Lee et al., Reference Lee, Trimi and Kim2013). These findings could generalize to other IT technologies, and more broadly to neurotechnologies.

A survey of American and Chinese respondents examined attitudes toward big data technologies, with hypotheses driven by Hofstede’s cultural framework (LaBrie et al., 2017). While this survey did not address BCI technologies directly, it offers key insights into U.S. attitudes toward new technologies based on cultural scores. Additionally, BCI technologies will likely use components of big data technologies, such as machine learning and dimensionality reduction, so they could be considered a nascent big data technology (Frégnac, 2017). U.S. respondents were less likely than Chinese respondents to approve of technologies that involve data collection from individuals, likely because they place a higher value on individual identity and privacy, as reflected in their high IDV scores in Hofstede’s framework (LaBrie et al., 2017). U.S. respondents were also strongly averse to the use of big data analytics by the government, whereas Chinese respondents were mostly favorable to government use. U.S. respondents were more favorable toward big data analytics than their Chinese counterparts only when data could be anonymized and used by businesses to improve performance. BCI technologies by necessity collect data from individuals and can even affect brain activity, which is closely tied to identity and privacy. High individualism and lower long-term orientation make technologies that collect individual data, and government use of those technologies, more suspect to U.S. respondents. A lower emphasis on individualism could lower barriers to BCI technology adoption in China. Additionally, the high LTO scores of Chinese respondents point to higher acceptance of government use of technologies that collect personal data, like BCIs, since high LTO scores have been suggested to translate into greater government support.

U.S. individuals are likely to put a higher value on the perceived usefulness of a technology when deciding whether to adopt it, while Chinese individuals are more influenced by subjective norms or imitation of peers (Srite, 2006). U.S. individuals may perceive BCI technologies as less useful, especially in the forms currently available on the market. Noninvasive EEG headsets are prone to data errors and poor accuracy, but these concerns may matter less to Chinese individuals than to American individuals, especially if the technology is promoted by the government, authorities, or peers. These values are reflected in high PDI, high LTO, and low IDV scores.

Hofstede’s cultural dimensions have been used for other technologies to demonstrate that Chinese citizens have fewer reservations about emerging technologies that collect individual data, even when used by the government. In contrast, U.S. individuals are wary of human enhancement technologies like BCIs, and these technologies would need to demonstrate high usefulness to overcome initial distrust. A U.S. Army–funded report suggested that media and cultural images of BCI technologies could be altered to include “more accurate depiction of technology and its applications” to change public perceptions and wariness (Emanuel et al., 2019). The authors of the same report also suggest that commercial development and dissemination of BCIs will drive the military’s ability to use BCI technologies. Because China can mandate technology use and because its citizens are more likely to approve of BCI technologies, sociocultural norms favor earlier adoption of BCIs in China than in the United States.

The final indicator that supports earlier adoption of BCIs in China is that country’s investment in research monkeys. Research monkeys are a vital resource for the development of BCIs of all types, but especially for invasive BCI technologies. A nation’s supply of research monkeys is a robust, novel indicator of whether invasive BCI technologies will be developed for both clinical and nonclinical use, since any BCI that writes to or changes brain activity likely requires R&D using research monkeys (nonhuman primates).

China has a significant material advantage over the United States in the size of its research monkey colonies. Its larger overall supply of research monkeys will aid the longevity of its BCI R&D programs. In 2017, the United States used 75,000 monkeys for biomedical research purposes (Grimm, 2018). Around 60% to 80% of those research monkeys were imported from China (Boggan, 2021; Newburger, 2019), because U.S. colonies of research monkeys are an order of magnitude smaller than those in China. One of the lead scientists of the China Brain Project, Mu-Ming Poo, reported that there are nearly 300,000 research monkeys in dedicated breeding colonies in China, compared to around 40,000 in the United States (Animal and Plant Health Inspection Service, 2021; Poo, 2016). Monkeys used in research, such as rhesus macaques, are native to China; this fact, combined with the China Brain Project’s focus on enlarging research monkey colonies, gives China a considerable advantage over the United States. In fact, in 2020, China stopped exports of research monkeys because of the COVID-19 pandemic (Zhang, 2020), which has hampered some U.S. biomedical research efforts, including COVID-19 vaccine trials.

The ease of access to research monkeys is one of China’s strategies for attracting foreign research talent. In contrast to both the United States and China, the EU has severely limited the use of monkeys in research, with only 6,000 monkeys used in 2011 (SCHEER, 2017). The cultural norms driving EU primate research decisions illustrate how nonmaterial ideas can directly impact R&D. China has begun to attract researchers who use monkeys “by offering fully equipped labs with state-of-the-art technology, competitive salaries, ample funding for primate studies, and co-appointments at Chinese institutions for European and American investigators” (Zimmer, 2018). China’s investment in its research monkey colonies and in attracting research talent demonstrates a long-term commitment to its stated research goals of developing brain-machine intelligence and technologies.

Additionally, China has demonstrated a willingness to limit the supply of research monkeys to the United States. While research monkeys are not necessary for developing noninvasive, read-only BCIs, China’s ability to develop advanced BCI technologies will be enabled by its investment in research monkeys. Research monkeys therefore serve as an indicator of invasive BCI adoption.

BCI adoption indicators that address the broader sociocultural and governmental context necessary to encourage adoption favor earlier adoption of BCI technologies by China. The collectivist culture of China leads to fewer concerns about data privacy and wider acceptance of government use of technologies, like BCIs, that collect individual data. The Chinese government also places a more coordinated emphasis on developing and disseminating BCI technologies than the United States, including an emphasis on developing and maintaining research monkey colonies as a vital resource for biotechnology research generally and for testing BCIs before human trials specifically. While China’s R&D capability for developing BCIs is not currently on a par with that of the United States, these factors affecting the likelihood of individuals adopting BCIs (whether elective or mandated) favor earlier adoption in China.

Current BCI use in both nations supports the hypothesis (H2) that China will adopt BCI technologies before the United States

Two hypotheses have been presented here supporting earlier widespread adoption of BCIs in either the United States or China. If BCI adoption is mostly determined by indicators that address R&D capability, the United States will be the first adopter. However, if BCI adoption is mostly determined by indicators that address government structure and sociocultural norms, China will be the first adopter. To adjudicate between these hypotheses, early indicators of BCI use in commercial and military settings in both countries were identified and assessed.

Currently, BCIs are available on the market for both clinical and commercial applications. However, their use is not widespread in either the United States or China. Here, current BCI use is taken to be a dependent variable, or a proxy, for later widespread BCI adoption. We assume that current levels of BCI use will continue to increase rapidly in countries with higher use rates. This assumption is reasonable and consistent with established technology adoption models; namely, early users can have a large effect on technology adoption both by driving greater awareness of the innovativeness of the technology and by promoting imitation among peers (Atkin et al., 2015; Calantone et al., 2006; Lee et al., 2013). Both the innovation and imitation factors are driven by early users, and they are prerequisites for widespread technology adoption.

Early reports of BCI use support the hypothesis that China will be the first adopter of BCI technologies in both the commercial and military sectors. There are media reports of mandatory BCI use by companies in China, whereas there have been no similar reports in the United States. Both the United States and China have had noninvasive EEG headsets with read capabilities used in school settings, usually in pilot studies for devices designed to measure focus and attention (Johnson, 2017; Shen, 2019). However, Chinese state-owned companies that run power plants and rail operations have already reported using this same kind of headset to monitor workers’ attention or sleep/wake states (Chen, 2018). This application is also advertised by companies operating in the United States, but it is not known whether any commercial entities are using it (EMOTIV, n.d.). While the efficacy of these headsets is potentially low because of the difficulty of interpreting brain activity from EEG, their use signals that Chinese state-owned companies are more likely to use these types of devices to monitor workers, potentially driving BCI adoption for commercial purposes (Winick, 2018).


Though the United States started its brain project earlier, has spent more money on it, and has a more robust innovation system that has led to better R&D capabilities in both the private and public sectors, China is more likely to be the first adopter of BCI technologies in both the commercial and military sectors because of its government structure, sociocultural norms, and greater alignment of brain project goals with military goals. Its coordinated national focus has driven investment in brain-machine intelligence and BCI technologies. While the U.S. military has also indicated strong interest in human-machine teaming and the United States has invested a large sum of money in brain-related research, there is a disconnect between the stated military goals for brain research and the basic research goals driving the U.S. BRAIN Initiative. Additionally, the cultural values of the United States, including an emphasis on individual identity and a distrust of new technologies that have not proven their usefulness, will hinder BCI adoption both commercially and militarily.

China’s early adoption of BCIs could have important implications for U.S. national security. These technologies, though nascent, may give China a military advantage through cognitive enhancement of warfighters and improved human-machine teaming once they mature. Additionally, China’s native supply of research monkeys, coupled with the absence of ideational pressures against primate research, will lower barriers to developing commercially and militarily viable invasive BCIs that can read and write brain activity with better accuracy and precision.

Security risks also exist for commercial applications of BCIs. If China becomes the most viable place to produce and market BCI technologies, it may have a large influence on the supply of BCIs that will eventually be used in the United States and other countries. This could lead to privacy issues for deeply personal data: the activity of a human brain and interpretations of that activity that give insight into mental state and mood. Finally, being a first mover on BCI technology may allow China to set ethical norms for BCI use. Understanding the brain to treat disease is a noble cause and should be pursued. However, the technologies enabled by this heavy investment in brain research have clear dual-use capabilities that will shape society and warfare.

The effects described here for the specific relationship between China and the United States reflect the broader impact of BCIs and neurotechnologies on the international security landscape. Early innovators and adopters of BCIs may have the opportunity to set international norms for their use in both civilian and military contexts for human enhancement. BCIs have both offensive and defensive capabilities in military contexts, while also possessing clear clinical and therapeutic uses that will promote social good, complicating their categorization and treatment in the international community. Existing international treaties and conventions on weapons do not cover neurotechnologies, and it is unclear whether they could neatly and efficiently do so. It will be necessary for both nations and the international community at large to grapple with the ethical, legal, and social implications of BCIs as they begin to see widespread use by civilians and military personnel.

Finally, it is important to note that in the specific case of China, we have much more limited access to information about research spending and outputs than we do for the United States. Additionally, data and reports from authoritarian regimes are known to be unreliable (Ahram & Goode, 2016). For example, we were unable to find information on the amount of grant funding associated with the China Brain Project (Dimensions, 2021). We support our hypothesis of China’s earlier adoption of BCIs with this potentially limited information, but it is possible that further information on China’s research and policy goals, as well as its actual implementation of those goals, would change our conclusions. More broadly, this analytical framework can be applied to other authoritarian regimes, as long as healthy skepticism is used when scrutinizing information from them.


Adewole, D. O., Serruya, M. D., Harris, J. P., Burrell, J. C., Petrov, D., Chen, H. I., … Cullen, D. K. (2017). The evolution of neuroprosthetic interfaces. Critical Reviews in Biomedical Engineering, 44(1–2), 123152.CrossRefGoogle Scholar
Ahram, A. I., & Goode, J. P. (2016). Researching authoritarianism in the discipline of democracy. Social Science Quarterly, 97(4), 834849.CrossRefGoogle Scholar
Animal and Plant Health Inspection Service. (2021). 2019 annual report: Animal usage by fiscal year—Column B. U.S. Department of Agriculture. Retrieved May 3, 2021, from Google Scholar
Arquilla, J. (2002). Networks and netwars: The future of terror, crime, and militancy. RAND Corporation. Google Scholar
Atkin, D. J., Hunt, D. S., & Lin, C. A. (2015). Diffusion theory in the new media environment: Toward an integrated technology adoption model. Mass Communication and Society, 18(5), 623650.CrossRefGoogle Scholar
Barma, N. H., Durbin, B., Lorber, E., & Whitlark, R. E. (2016). “Imagine a world in which”: Using scenarios in political science. International Studies Perspectives, 17(2), 117135.Google Scholar
Bensmaia, S. J., & Miller, L. E. (2014). Restoring sensorimotor function through intracortical interfaces: Progress and looming challenges. Nature Reviews Neuroscience, 15(5), 313325.CrossRefGoogle ScholarPubMed
Bernstein, A., & Libicki, M. (1998). High-tech: The future face of war? A debate. Commentary, 105(1), 2831.Google Scholar
Biddle, S. (2004). Military power: Explaining victory and defeat in modern battle. Princeton University Press.CrossRefGoogle Scholar
Binnendijk, A., Marler, T., & Bartels, E. M. (2020). Brain-computer interfaces: U.S. military applications and implications, an initial assessment. RAND Corporation. Google Scholar
Bishop, P., Hines, A., & Collins, T. (2007). The current state of scenario development: An overview of techniques. Foresight, 9(1), 525.CrossRefGoogle Scholar
Blank, S. (1984). The soviet strategic view: Ogarkov on the revolution in military technology. Strategic Review, 12(3), 390.Google Scholar
Boggan, S. (2021, February 8). China’s plan for medical domination. UnHerd. Google Scholar
Bussell, J. (2011). Explaining cross-national variation in government adoption of new technologies. International Studies Quarterly, 55(1), 267280.CrossRefGoogle Scholar
Calantone, R. J., Griffith, D. A., & Goksel, Y. (2006). An emprical examination of a technology adoption model for the context of China. Journal of International Marketing, 14(4), 127.CrossRefGoogle Scholar
Caldwell, D. J., Ojemann, J. G., & Rao, R. P. (2019). Direct electrical stimulation in electrocorticographic brain-computer interfaces: Enabling technologies for input to cortex. Frontiers in Neuroscience, 13, 804.CrossRefGoogle ScholarPubMed
Campbell, K. M., & Ratner, E. (2018). The China reckoning: How Beijing defied American expectations. Foreign Affairs, 97(2), 6070.Google Scholar
Canli, T., Brandon, S., Casebeer, W., Crowley, P. J., DuRousseau, D., Greely, H., & Pascual-Leone, A. (2007). Neuroethics and national security. American Journal of Bioethics, 7(5), 313.CrossRefGoogle ScholarPubMed
Central Compliation and Translation Beureau. (2016). The 13th five-year plan for economic and social development of the People’s Republic of China, 2016–2020. Central Committee of the Communist Party of China. Retrieved December 8, 2020, from Google Scholar
Chen, S. (2018, April 29). “Forget the Facebook leak”: China is mining data directly from workers’ brains on an industrial scale. South China Morning Post. Google Scholar
Chin, W. (2019, July). Technology, war and the state: Past, present and future. International Affairs, 95(4), 765783.CrossRefGoogle Scholar
CIA (Central Intelligence Agency). (2021a, March 22). China. The World Factbook. Retrieved March 30, 2021, from Google Scholar
CIA (Central Intelligence Agency). (2021b, March 17). United States. The World Factbook. Retrieved March 30, 2021, from Google Scholar
Clausewitz, C. v. (2013). The Essential Clausewitz: Selections from On War. Greene, J (Ed.) New York: Dover Publications, 55.Google Scholar
Cohen, E. (1996). A revolution in warfare. Foreign Affairs, 75(2), 3754.CrossRefGoogle Scholar
Colby, E. A., & Mitchell, A. W. (2020). The age of great-power competition: How the Trump administration refashioned American strategy. Foreign Affairs, 99(1), 118130.Google Scholar
Congressional Research Service. (2021, January 14). Export controls: Key challenges. Google Scholar
Corrales, J., & Westhoff, F. (2006). Information technology adoption and political regimes. International Studies Quarterly, 50(5), 911933.CrossRefGoogle Scholar
Cyranoski, D. (2018). Beijing launches pioneering brain-science centre. Nature, 556(7700), 157158.CrossRefGoogle ScholarPubMed
Dando, M. (2007). Preventing the future military misuse of neuroscience. In Rappert, B. (Ed.), Technology and security: Governing threats in the new millennium (pp. 155170). Palgrave Macmillan.CrossRefGoogle Scholar
DARPA (Defense Advanced Research Projects Agency). (2019, May 20). Six paths to the nonsurgical future of brain machine interfaces. Google Scholar
DARPA (Defense Advanced Research Projects Agency). (n.d.). DARPA and the BRAIN Initiative. Retrieved March 30, 2021, from Google Scholar
DeFranco, J., DiEuliis, D., & Giordano, J. (2019). Redefining neuroweapons: Emerging capabilities in neuroscience and neurotechnology. PRISM, 8(3), 4863.Google Scholar
DiEuliis, D., & Giordano, J. (2017). Why gene editors like CRISPR/Cas may be a game-changer for neuroweapons. Health Security, 15(3), 296302.CrossRefGoogle ScholarPubMed
Dimensions. (2021, March 25). A global inventory of brain projects. Google Scholar
Dobbins, J., Shatz, H. J., & Wyne, A. (2018). Russia is a rogue, not a peer; China is a peer, not a rogue: Different challenges, different responses. RAND Corporation. Google Scholar
Dombrowski, P., & Gholz, E. (2006). Buying military transformation: Technological innovation and the defense industry. Columbia University Press.CrossRefGoogle Scholar
Emanuel, P., Walper, S., DiEuliis, D., Klein, N., Petro, J. B., & Giordano, J. (2019). Cyborg soldier 2050: Human/machine fusion and the implications for the future of the DOD. U.S. Army Combat Capabilities Development Command, Chemical Biological Center.Google Scholar
Emondi, A. (n.d.). Next-generation nonsurgical neurotechnology. Retrieved March 30, 2021, from Google Scholar
EMOTIV. (n.d.). Enterprise neurotechnology solutions. Retrieved December 6, 2020, from Google Scholar
Emotivstation. (2017, August 18). EMOTIV x Rodrigo Hubner Mendes—Driving F1 car just by thinking. Google Scholar
Evangelista, M. (1989). Innovation and the arms race: How the United States and the Soviet Union develop new military technologies. Cornell University Press.Google Scholar
Evans, N. G. (2021). The ethics of neuroscience and national security. Routledge.CrossRefGoogle Scholar
Farnsworth, B. (2017, April 11). Top 14 EEG hardware companies [ranked]. Google Scholar
Ferchen, M. (2020, February). Towards “extreme competition”: Mapping the contours of US-China relations under the Biden administration. Merics China Monitor. Google Scholar
Fisher, C. E. (2010). Brain stimulation and national security: Considering the narratives of neuromodulation. AJOB Neuroscience, 1(2), 2224.CrossRefGoogle Scholar
Freeman, C. (1987). Technology, policy, and economic performance: lessons from Japan. Pinter Publishers, 17.Google Scholar
Frégnac, Y. (2017). Big data and the industrialization of neuroscience: A safe roadmap for understanding the brain? Science, 358(6362), 470477.CrossRefGoogle ScholarPubMed
Fukuyama, F. (2003). Our posthuman future: Consequences of the biotechnology revolution. Picador.Google Scholar
Funk, C., Kennedy, B., & Sciupac, E. P. (2016, July 26). U. S. public wary of biomedical technologies to “enhance” human abilities . Pew Research Center. Google Scholar
Gaddis, J. L. (1989). The long peace: Inquiries into the history of the Cold War. Oxford University Press.Google Scholar
Gazzaniga, M. S. (2005). The ethical brain. Dana Press.Google Scholar
Gerstner, W., Kreiter, A. K., Markram, H., & Herz, A. V. (1997). Neural codes: Firing rates and beyond. Proceedings of the National Academy of Sciences, 94(24), 1274012741.CrossRefGoogle ScholarPubMed
Gherman, D.-E., & Zander, T. O. (2021, September 11–16). An ethical perspective on passive BCI: Can we detect information from the human mind that is intended to be hidden? Neuroergonomics Conference. Google Scholar
Giordano, J., Forsythe, C., & Olds, J. (2010). Neuroscience, neurotechnology, and national security: The need for preparedness and an ethics of responsible action. AJOB Neuroscience, 1(2), 3536.CrossRefGoogle Scholar
Global Neuroethics Summit Delegates, Rommelfanger, K. S., Jeong, S.-J., Ema, A., Fukushi, T., Kasai, K., … Singh, I. (2018). Neuroethics questions to guide ethical research in the international brain initiatives. Neuron, 100(1), 1936.CrossRefGoogle ScholarPubMed
Google. (a) (n.d.). Retrieved November 11, 2020, from Google Patents: Google Scholar
Google. (b) (n.d.). Retrieved November 11, 2020, from Google Patents: Google Scholar
Grand View Research. (2020). Brain computer interface market size, share & trends analysis report by product (invasive, partially invasive, non-invasive), by application (healthcare, communication & control), by end-use, and segment forecasts, 2020–2027. Retrieved December 7, 2020, from Google Scholar
Gray, C. (1974). The urge to compete: Rationales for arms racing. World Politics, 26(2), 207233.CrossRefGoogle Scholar
Grimm, D. (2018, November 2). Record number of monkeys being used in U.S. research. Science. CrossRefGoogle Scholar
Heldman, D. A., & Moran, D. W. (2020). Local field potentials for BCI control. In Ramsey, N. F. & Millán, J. del R. (Eds.), Handbook of Clinical Neurology (vol. 168, pp. 279288). Elsevier.Google Scholar
Herculano-Houzel, S. (2012). The remarkable, yet not extraordinary, human brain as a scaled-up primate brain and its associated cost. Proceedings of the National Academy of the Sciences, 109(S1), 1066110668.CrossRefGoogle Scholar
Herspring, D. (1987). Nikolay Ogarkov and the scientific-technical revolution in Soviet military affairs. Comparative Strategy, 6(1), 2959.CrossRefGoogle Scholar
Herz, J. H. (1959). International politics in the atomic age. Columbia University Press.CrossRefGoogle Scholar
Hitchens, T. (2021, June 29). SOCOM to test anti-aging pill next year. Breaking Defense. Google Scholar
Insights, Hofstede. (n.d.). Country comparison. Retrieved December 2, 2020, from,the-usa/ Google Scholar
Hofstede, G., & Bond, M. H. (1984). Hofstede’s cultural dimensions: an independent validation using Rokeach’s value survey. Journal of Cross-Cultural Psychology, 15(4), 417433.CrossRefGoogle Scholar
Hofstede, G., Hofstede, G. J., & Minkov, M. (2010). Cultures and organizations: Software of the mind. McGraw-Hill Education.Google Scholar
Hourihan, M. (2020, October 22). A snapshot of U.S. R&D competitiveness: 2020 update. American Association for the Advancement of Science. Google Scholar
Huang, J. Y., & Kosal, M. E. (2008, June 20). Security implications of cognitive science research. Bulletin of the Atomic Scientists. Google Scholar
Hundley, R. O. (1999). Past revolutions, future transformations: What can the history of revolutions in military affairs tell us about transforming the US military? RAND Corporation. Google Scholar
International Brain Initiative. (n.d.). International Brain Initiative. Retrieved November 29, 2020, from Google Scholar
Jervis, R. (1978). Cooperation under the security dilemma. World Politics, 30(2), 167214.CrossRefGoogle Scholar
Jiang, L., Stocco, A., Losey, D. M., Abernethy, J. A., Prat, C. S., & Rao, R. P. (2019). BrainNet: A multi-person brain-to-brain interface for direct collaboration between brains. Scientific Reports, 9, 6115.CrossRefGoogle ScholarPubMed
Johnson, S. (2017, November 14). Brainwave headsets are making their way into classrooms - for meditation and discipline. EdSurge. Google Scholar
Jonas, E., & Kording, K. P. (2017). Could a neuroscientist understand a microprocessor? PLoS Computational Biology, 13(1), e1005268.CrossRefGoogle ScholarPubMed
Jones, B. (2020, February). China and the return of great power strategic competition. Brookings Institution. Google Scholar
Justo, L., & Erazun, F. (2007). Neuroethics and human rights. American Journal of Bioethics, 7(5), 1617.CrossRefGoogle ScholarPubMed
Kania, E. B. (2020). Minds at war: China’s pursuit of military advantage through cognitive science and biotechnology. Prism, 8, 83101.Google Scholar
Kerin, R. A., Varadarajan, P. R., & Peterson, R. A. (1992). First-mover advantage: A synthesis, conceptual framework, and research propositions. Journal of Marketing, 56(4), 3352.CrossRefGoogle Scholar
Kosal, M. (Ed.). (2019). Disruptive and game changing technologies in modern warfare: Development, use, and proliferation. Springer Academic.Google Scholar
Kosal, M. E., & Huang, J. H. (2015). Security implications and governance of cognitive neuroscience research: Results from an ethnographic survey of researchers. Politics and the Life Sciences, 34(1), 93108.CrossRefGoogle Scholar
Kosal, M., & Stayton, J. (2019). Meta-materials: threat to the global status quo? In Kosal, M. (Ed.), Disruptive and game changing technologies in modern warfare: Development, use, and proliferation (pp. 135154). Springer Academic.Google Scholar
Krepinevich, A. (1994). Cavalry to computer: The pattern of military revolutions. The National Interest, 37, 3042.Google Scholar
Krishnan, A. (2018). Military neuroscience and the coming age of neurowarfare. Routledge.Google Scholar
LaBrie, R. C., Steinke, G. H., Li, X., & Cazier, J. A. (2017). Big data analytics sentiment: US-China reaction to data collection by business and government. Technological Forecasting & Social Change, 130, 4555.CrossRefGoogle Scholar
Lee, S.-G., Trimi, S., & Kim, C. (2013). The impact of cultural differences on technology adoption. Journal of World Business, 48, 2029.CrossRefGoogle Scholar
Leins, K. (2021). New war technologies and international law: The legal limits to weaponising nanomaterials. Cambridge University Press.Google Scholar
Lewis, J. A. (2018, November 30). Technolocial competition and China. Center for Strategic and International Studies. Google Scholar
Li, C., & Zhao, W. (2019). Progress in the brain-computer interface: An interview with Bin He. National Science Review, 7, 480483.CrossRefGoogle ScholarPubMed
Lin, X., Sai, L., & Yuan, Z. (2018). Detecting concealed information with fused electroencephalography and functional near-infrared spectroscopy. Neuroscience, 386, 284294.CrossRefGoogle ScholarPubMed
Lozano, A. M., Lipsman, N., Bergman, H., Brown, P., Chabardes, S., Chang, J. W., … Krauss, J. K. (2019). Deep brain stimultaion: Current challenges and future directions. Nature Reviews Neurology, 15(3), 148160.CrossRefGoogle Scholar
Lunstroth, J., & Goldman, J. (2007). Ethical intelligence from neuroscience: Is it possible? American Journal of Bioethics, 7(5), 18–20.
Mahnken, T. (2010). Technology and the American way of war since 1945. Columbia University Press.
Maksimenko, V., Lüttjohann, A., van Heukelum, S., Kelderhuis, J., Makarov, V., Hramov, A., … van Luijtelaar, G. (2020). Brain-computer interface for the epileptic seizures prediction and prevention. IEEE 8th International Winter Conference on Brain-Computer Interface.
Marcus, S. J. (Ed.). (2002). Neuroethics: Mapping the field. Dana Press.
Marks, J. H. (2010). A neuroskeptic’s guide to neuroethics and national security. AJOB Neuroscience, 1(2), 4–12.
McKitrick, J., Blackwell, J., Littlepage, F., Kraus, G., Blanchfield, R., & Hill, D. (1995). The revolution in military affairs. In Schneider, B., & Grinter, L. (Eds.), Battlefield of the future: 21st century warfare issues (pp. 65–98). Air University Press.
Melaas, A., & Zhang, F. (2016, March). National innovation systems in the United States and China. Center for International Environment and Resource Policy.
Meyer, S. (1984). The dynamics of nuclear proliferation. University of Chicago Press.
Milner, H. V. (2006). The digital divide: The role of political institutions in technology diffusion. Comparative Political Studies, 39(2), 176–199.
Moreno, J. D. (2003). Neuroethics: An agenda for neuroscience and society. Nature Reviews: Neuroscience, 4, 149–153.
Moreno, J. D. (2006). Mind wars: Brain research and national defense. Dana Press.
Morgenthau, H. J. (1972). Science: Servant or master? Meridian Books.
Munyon, C. N. (2018). Neuroethics of non-primary brain computer interface: Focus on potential military applications. Frontiers in Neuroscience, 12, 696.
Naam, R. (2005). More than human: Embracing the promise of biological enhancement. Broadway Books.
Nasu, H., & McLaughlin, R. (Eds.). (2014). New technologies and the law of armed conflict. T.M.C. Asser Press.
National Institutes of Health. (2014). BRAIN 2025: A scientific vision.
National Institutes of Health. (n.d.a). The Alliance. Retrieved March 30, 2021, from BRAIN Initiative web site:
National Institutes of Health. (n.d.b). Brain Initiative. Retrieved November 10, 2020, from
National Institutes of Health. (n.d.c). Overview. Retrieved November 29, 2020, from Brain Initiative:
National Research Council. (2008). Emerging cognitive neuroscience and related technologies. National Academies Press.
National Research Council. (2009). Opportunities in neuroscience for future army applications. National Academies Press.
National Security Commission on Artificial Intelligence. (2021). Final report.
NATO (North Atlantic Treaty Organization). (2010). NATO 2020: Assured security; dynamic engagement. NATO Public Diplomacy Division.
Newburger, E. (2019, August 24). Trump’s tariffs on monkeys could “severely damage” US medical research and send labs to China. CNBC.
Nicolas-Alonso, L. F., & Gomez-Gil, J. (2012). Brain computer interfaces, a review. Sensors, 12(2), 1211–1279.
Nixdorff, K., Borisova, T., Komisarenko, S., & Dando, M. (2018). Dual-use nano-neurotechnology: An assessment of the implications of trends in science and technology. Politics and the Life Sciences, 37(2), 180–202.
Nofi, A. (2006). Recent trends in thinking about warfare. CNA Corporation.
Nuyujukian, P., Sanabria, J. A., Saab, J., Pandarinath, C., Jarosiewicz, B., Blabe, C. H., … Henderson, J. M. (2018). Cortical control of a tablet computer by people with paralysis. PLOS ONE, 13, e0204566.
Office of Science and Technology Policy. (2015). A strategy for American innovation.
Ogburn, W. (1949). Technology and international relations. University of Chicago Press.
O’Hanlon, M. (2009). The science of war. Princeton University Press.
O’Rourke, R. (2020). Great power competition: implications for defense—Issues for Congress. Congressional Research Service.
Pandarinath, C., Ames, K. C., Russo, A. A., Farschchian, A., Miller, L. E., Dyer, E. L., & Kao, J. C. (2018). Latent factors and dynamics in motor cortex and their application to brain-machine interfaces. Journal of Neuroscience, 38(44), 9390–9401.
Parens, E. (2006). Creativity, gratitude, and the enhancement debate. In Illes, J. (Ed.), Neuroethics: Defining the issues in theory, practice, and policy (pp. 75–86). Oxford University Press.
Polania, R., Nitsche, M. A., & Ruff, C. C. (2018). Studying and modifying brain function with non-invasive brain stimulation. Nature Neuroscience, 21(2), 174–187.
Poo, M.-M. (2016, May 26–27). China Brain Project and non-human primate research in China. The Brain Forum.
Poo, M.-M. (2018). Toward brain-inspired artificial intelligence. National Science Review, 5(6), 785.
Poo, M.-M., Du, J.-L., Ip, N. Y., Xiong, Z.-Q., Xu, B., & Tan, T. (2016). China Brain Project: basic neuroscience, brain diseases, and brain-inspired computing. Neuron, 92, 591–596.
Rachna, M., & Agrawal, R. (2018). Emerging threats to security and privacy in brain computer interface. International Journal of Advanced Studies of Scientific Research, 3(12).
Ramadan, R. A., & Vasilakos, A. V. (2017). Brain computer interface: Control signals review. Neurocomputing, 223, 26–44.
Rasser, M. (2020, July 22). U.S.-China: Winning the economic competition. Testimony before the U.S. Senate Banking Committee, Subcommittee on Economic Policy.
Ratti, E., Waninger, S., Berka, C., Ruffini, G., & Verma, A. (2017). Comparison of medical and consumer wireless EEG systems for use in clinical trials. Frontiers in Human Neuroscience, 11, 398.
Resnik, D. (2007). Neuroethics, national security, and secrecy. American Journal of Bioethics, 7(5), 1–5.
Rippon, G., & Senior, C. (2010). Neuroscience has no role in national security. AJOB Neuroscience, 1(2), 37–38.
Rosen, S. (1991). Winning the next war: Innovation and the modern military. Cornell University Press.
Rosenberg, L., & Gehrie, E. (2007). Against the use of medical technologies for military or national security interests. American Journal of Bioethics, 7(5), 22–24.
Rotolo, D., Hicks, D., & Martin, B. R. (2015). What is an emerging technology? Research Policy, 44(10), 1827–1843.
Royal Society. (2012). Brain waves: Neuroscience, conflict and security. London: Royal Society.
Sagan, S., & Waltz, K. (2012). The spread of nuclear weapons: An enduring debate. W.W. Norton.
Salatino, J. W., Ludwig, K. A., Kozai, T. D., & Purcell, E. K. (2017). Glial responses to implanted electrodes in the brain. Nature Biomedical Engineering, 1, 862–877.
Sauter-Starace, F., Ratel, D., Cretallaz, C., Foerster, M., Lambert, A., Gaude, C., … Torres-Martinez, N. (2019). Long-term sheep implantation of WIMAGINE, a wireless 64-channel electrocorticogram recorder. Frontiers in Neuroscience, 13, 847.
SCHEER (Scientific Committee on Health, Environmental, and Emerging Risks). (2017). Final opinion on “the need for non-human primates in biomedical research, production and testing of products and devices” (update 2017). European Commission.
Schmid, J., & Huang, J. (2017). State adoption of transformative technology: Early railroad adoption in China and Japan. International Studies Quarterly, 61(3), 570–583.
Schwartz, P. (1996). The art of the long view: Planning for the future in an uncertain world. Doubleday.
Shambaugh, J., Nunn, R., & Portman, B. (2017). Eleven facts about innovation and patents. The Hamilton Project.
Shen, X. (2019, April 5). Can brainwave-monitoring headbands help students focus? South China Morning Post.
Skolnikoff, E. B. (1993). The elusive transformation: Science, technology, and the evolution of international affairs. Princeton University Press.
Solingen, E. (1994). Scientists and the state: Domestic structures and the international context. University of Michigan Press.
Srite, M. (2006). Culture as an explanation of technology acceptance differences: An empirical investigation of Chinese and US users. Australasian Journal of Information Systems, 14, 5–26.
Stein, J. G. (2009). Rational deterrence against “irrational” adversaries? No common knowledge. In Paul, T., & Morgan, P. W. (Eds.), Complex deterrence (pp. 58–82). University of Chicago Press.
Stockton, N. (2015, March 5). Woman controls a fighter jet sim using only her mind. Wired.
Strange, S. (1996). The retreat of the state: The diffusion of power in the world economy. Cambridge University Press.
Tatum, W. O., Rubboli, G., Kaplan, P. W., Mirsatari, S. M., Radhakrishnan, K., Gloss, D., … Beniczky, S. (2018). Clinical utility of EEG in diagnosing and monitoring epilepsy in adults. Clinical Neurophysiology, 129(5), 1056–1082.
Taylor, M. Z. (2016). The politics of innovation: Why some countries are better than others at science and technology. Oxford University Press.
Tellis, A. J. (2020). The return of U.S.-China strategic competition. In Tellis, A. J., Szalwinski, A., & Wills, M. (Eds.), Strategic Asia 2020: U.S.-China competition for global influence (pp. 3–43). National Bureau of Asian Research.
Tracey, I., & Flower, R. (2014). The warrior in the machine: Neuroscience goes to war. Nature Reviews: Neuroscience, 15(12), 825–834.
Trimper, J. B., Wolpe, P. R., & Rommelfanger, K. S. (2014). When “I” becomes “we”: Ethical implications of emerging brain-to-brain interfacing technologies. Frontiers in Neuroengineering, 7, 4.
Turner, R. (1943). Technology and geopolitics. Military Affairs, 7(1), 5–15.
U.S. Department of Defense. (2018). Summary of the 2018 national defense strategy of the United States of America.
U.S. Department of Defense. (2020). Annual report to Congress: Military and security developments regarding the People’s Republic of China.
Vahle, M. W. (2020, July 27). Opportunities and implications of brain-computer interface technology (Wright Flyer Paper 75). Air University Press.
Wild, J. (2005, September 22). Brain-imaging ready to detect terrorists, say neuroscientists. Nature, 437(7058), 457.
Winick, E. (2018, April 30). With brain-scanning hats, China signals it has no interest in workers’ privacy. MIT Technology Review.
Wray, C. (2020, February 6). FBI Director Christopher Wray’s opening remarks: China Initiative Conference. Center for Strategic and International Studies.
Wu, X. (2020). Technology, power, and uncontrolled great power strategic competition between China and the United States. China International Strategy Review, 2, 99–119.
Wunker, S. (2012). Better growth decisions: Early mover, fast follower, or late follower? Strategy & Leadership, 40(2), 43–48.
Yin, E. (2019). Brain talker makes “mind reading” possible—Tianjin creates the world’s first brain-computer codec chip. Retrieved March 31, 2021, from
Zhang, S. (2020, August 31). America is running low on a crucial resource for COVID-19 vaccines. The Atlantic.
Zimmer, K. (2018, August 21). As primate research drops in Europe, overseas options appeal. The Scientist.
Figure 1. Three dual-use applications of BCIs (brain-computer, brain-computer-device, and brain-computer-brain technologies) can be categorized by their level of invasiveness and their functional ability to read or write brain activity. The symbols for each technology are shown in the category that corresponds to the theoretical minimum functionality and invasiveness required to use them, while the shaded areas demonstrate the other categories these technologies fall under. Greater invasiveness generally leads to greater data fidelity/interpretability. The ability to write brain activity noninvasively is an active area of research.

Table 1. Hofstede’s cultural dimensions.

Figure 2. A comparison of Hofstede’s cultural dimensions for the United States and China; adapted from LaBrie et al. (2017).

Table 2. Summary of qualitative and quantitative indicators of BCI adoption.