
Science advising and democracy: Governing gene editing technologies

Published online by Cambridge University Press:  02 December 2025

Amber Knight*
Affiliation:
UNC Charlotte, Political Science and Public Administration, Charlotte, NC, USA

Abstract

Policymakers often consult university scientists as they design and administer policies to address issues facing contemporary democracies, including climate change and global pandemics. Subsequently, democratic theorists have become interested in how science advising generates both challenges and opportunities for democratic governance. As a work of applied political theory, this article contributes to current debates over the political implications of scientific expertise—and engages with Zeynep Pamuk’s writings, in particular—through a sustained focus on impending regulatory decisions surrounding novel gene editing technologies. I show how political decisions have created incentives for university scientists to commercialize research and develop partnerships with biotechnology corporations. In turn, the academic-industrial complex has both compromised the integrity of scientific research and also impaired scientists’ capacities to offer disinterested advice in the public interest. I conclude by recommending the development of a more robust and expansive regulatory environment that can restore public trust in scientific expertise.

Information

Type
Research Article
Creative Commons
This is an Open Access article, distributed under the terms of the Creative Commons Attribution licence (http://creativecommons.org/licenses/by/4.0), which permits unrestricted re-use, distribution and reproduction, provided the original article is properly cited.
Copyright
© The Author(s), 2025. Published by Cambridge University Press on behalf of The Association for Politics and the Life Sciences

We are living in the midst of a scientific revolution, one that was set in motion by the discovery and development of a gene editing tool known as Clustered Regularly Interspaced Short Palindromic Repeats (CRISPR)–Cas9.Footnote 1 First recognized for its potential to modify genes roughly a decade ago, this technology acts as a pair of “molecular scissors” to modify sequences of DNA. CRISPR is relatively cheap and easy to use, and it outperforms previous tools. In 2015, Science named CRISPR its “Breakthrough of the Year,” describing the technology as a “molecular marvel.” In 2020, Jennifer Doudna and Emmanuelle Charpentier received the Nobel Prize in Chemistry for their roles in pioneering CRISPR technology. Reflecting on this achievement, history professor Walter Isaacson (Reference Isaacson2020) characterized CRISPR as the most important scientific advance since the discovery of the DNA double helix, even going so far as to herald it as “an invention that will transform the human race.”

While the potential clinical applications are seemingly endless for modern medicine, they are also vexed. CRISPR holds therapeutic promise for treating genetic diseases, like cystic fibrosis and cancer, to the extent that scientists discover how to safely and efficiently target and repair genes in somatic cells. However, editing somatic cells is remarkably different from editing germline cells found in sperm, ova, or the DNA of an embryo. Gene therapies for somatic cells remain confined to the person receiving treatment, meaning that such modifications are not inherited by future generations. By contrast, changes to the nuclei of germline cells affect all cells of the future person’s body, and the modifications are then inherited by their offspring in perpetuity. Germline modifications also raise the specter of so-called “designer babies,” wherein embryos could be genetically modified to produce children with socially desirable traits like heightened strength, increased intelligence, and even resistance to pain (Greely, Reference Greely2021). Hence, the somatic/germline distinction is scientifically, ethically, and politically salient.

Clinical trials for somatic gene therapies are well underway, but germline gene editing in humans is currently restricted in the United States. It is not banned outright, but an informal moratorium has been imposed under the guidelines of the National Institutes of Health (NIH) and in accordance with a funding prohibition that bars the Food and Drug Administration (FDA) from using federal money to review clinical studies “in which a human embryo is intentionally created or modified to include a heritable genetic modification” (Matthews & Morali, Reference Matthews and Morali2022, p. 8). In effect, the prohibition limits clinical research since scientists must submit an investigational new drug application to the FDA before they can use edited cells in humans, even experimentally.

Many stakeholders are eager to lift this temporary ban. Hence, the political question of whether scientists should be allowed to modify the human germline in basic research and/or clinical applications (and, if so, with what limits on research and/or medical practice) is one that democratic societies must urgently confront. To date, molecular biologists, biochemists, geneticists, and scientists in related fields have been afforded a great deal of influence over public opinion about the kind of research we ought to pursue and how we should, or should not, go about regulating it. A recent study of news coverage about CRISPR found that university scientists were far and away the most heavily represented voices, having been cited or paraphrased in the bulk of news articles on the topic, even when the piece addressed social and ethical (rather than technical) implications (Thiel, Reference Thiel2021). In addition, many university scientists have given high-profile public lectures, provided testimony before legislatures, and written policy briefs. As a group, they have virtually monopolized the discussion over which hopes about germline gene editing should remain in the realm of fantasyland and which ones should enter the domain of political possibility.

In recent years, political scientists and democratic theorists have become increasingly interested in how scientific expertise and advising generate both challenges and opportunities for contemporary democracies (Brown, Reference Brown2009; Douglas, Reference Douglas, Hannon and de Ridder2021; Moore, Reference Moore2017; Pamuk, Reference Pamuk2021; Pielke, Reference Pielke2007). This article contributes to current debates over the proper role of scientific expertise in democratic decision-making through a sustained focus on impending regulatory decisions surrounding gene editing technologies. As a work of applied political theory, my central objective is to illuminate how public policies and legal decisions have created incentives for universities and their faculty to commercialize CRISPR research and develop partnerships with corporations to fund their programs. Leading high-profile university scientists working on the development of gene editing technologies have also founded biotech companies and/or hold significant shares in startup companies. Scientists and the elite academic institutions that employ them stand to make a king’s fortune—hundreds of millions of dollars in royalties—from the commercialization of these products. Hence, academic scientists do not give interviews to news outlets or advise elected officials as disinterested parties. To the contrary, they have strong incentives to promote the benefits of CRISPR technology and downplay the need for increased regulations that might threaten commercialization. If disinterested counsel in the public interest is a democratic ideal that should guide science advising, university-corporate ties make the ideal hard to put into practice.

The article proceeds in four parts. The first section reviews the political theory literature on the role of science advising in democracies, paying particular attention to Zeynep Pamuk’s timely and insightful book, Politics and Expertise: How to Use Science in a Democratic Society (2021). The second section applies Pamuk’s innovative recommendations to democratize science to the current landscape of genetic research, arguing that it cannot sufficiently address how the commercialization of academic research has compromised the integrity of science advising. Next, the third section identifies and explains how legal and political developments around patenting law transformed the relationship between academia and industry. This section demonstrates how the boundaries between universities and biotechnology companies have blurred over time, resulting in an unprecedented rise in financial conflicts of interest that can impair scientists’ capacities to offer unbiased, public-minded advice as disinterested parties. The final section suggests that existing conflict of interest (COI) policies cannot neutralize the prestige and financial stakes at play and recommends some additional changes to the academic-industrial complex.

To be clear, my goal is not to vilify individual scientists. I assume that most university scientists are well-meaning people who work in good faith to understand how the natural world works in order to make it a better place. Rather, the primary aim of this analysis is to show how a number of problematic macro-level forces compromise the epistemic and moral integrity of scientific advising to the detriment of democratic decision-making. In doing so, I portray the matter as a structural political problem in the funding and organization of academic science as an institution, rather than as a matter of personal intent.

Science advising and democratic decision-making

Addressing the major issues facing contemporary democratic societies—including climate change and global pandemics—often requires expert knowledge from credentialed scientists who are rigorously trained in the methods and practices of inquiry in their respective fields. Using precise, controlled, systematic methods of measuring and testing knowledge, scientists analyze the source and scope of these problems and recommend potential solutions to political officials. It is nearly impossible to design and administer effective policies to reduce carbon emissions or curb the spread of the coronavirus, for example, without taking advice from scientific experts into consideration. The integral role that scientific expertise plays in democratic governance is evident in the proliferation of scientific advisory committees, which have become so prevalent that Sheila Jasanoff (Reference Jasanoff1990) describes them as a “fifth branch” of government.

Yet, political theorists have long questioned the proper role of experts in political rule. As early as Plato’s Republic (c. 375 BC), political thinkers have identified a tension between democracy and expertise.Footnote 2 Democracy connotes equality and emphasizes the rule of the people. Expertise signals an epistemic asymmetry between specialists and laypeople, and suggests that some may be better qualified than others to provide input or make certain decisions on various topics. As Alfred Moore (Reference Moore2016) explains, “The idea that people ought to have an equal opportunity to contribute to deliberation on matters that affect them seems to be short-circuited by the inequalities in knowledge that are necessary for the effective analysis, regulation and management of complex social and technological problems” (p. 192).

Political theorists have navigated the tension between democracy and expertise in various ways. For instance, John Dewey (Reference Dewey1910) theorized the place of science in the project of democratic governance. Scientific knowledge, according to Dewey, is “systematized” through a process of “active experimenting and testing,” and, as such, it is distinctive from “mere opinion or guess-work or dogma” (p. 125). Dewey esteemed science because it was useful to the democratic state. That is, Dewey did not want knowledge created by the production of “pure science” to remain separate from political values and judgments. He wanted science to inform them. Not only did Dewey believe in the power of science education to create informed publics capable of self-governance, but he also suggested that well-tested knowledge developed through scientific research is often instrumental to the development of good policies and favorable political outcomes. In Dewey’s words, “genuinely public policy cannot be generated unless it be informed by knowledge, and this knowledge does not exist except when there is systematic, thorough, and well-equipped search and record” (Dewey, Reference Dewey1927, p. 178). In sum, Dewey believed that scientific knowledge could enhance the quality of political decisions and policy design.

That said, the professionalization of science changed the relationship between science and democracy, introducing a new set of political concerns. The professionalization of science followed the proliferation of modern research universities with salaried positions, the establishment of scientific societies (such as the American Association for the Advancement of Science), and the increase in both government and industrial investment in research in the early twentieth century. Moreover, two critical junctures expedited the growth and professionalization of science in America: World War II (and the Manhattan Project’s development of the atomic bomb) and the Cold War (and the National Aeronautics and Space Administration’s so-called “space race”). By the late twentieth century, “big science” projects received massive sums of government funding, which increased the scope and power of scientific research by emphasizing its importance for national security and economic growth. During this era, influential science policy scholars and administrators—such as Vannevar Bush (Reference Bush1945) and Michael Polanyi (Reference Polanyi1962)—believed that scientists’ objectivity and neutrality made them apolitical, earning them a special authority to set their own research agendas, establish professional standards for good laboratory and clinical practice, and self-regulate. In effect, postwar scientists became a credible and socially esteemed expert class.

The growing gap in epistemic authority between professionalized scientific experts and laypersons raised concerns about technocratic rule and excessive scientific management of evidence-based policymaking. Jürgen Habermas (Reference Habermas1971), for example, warned about the undemocratic implications of giving scientific experts too much control, arguing that it effectively leads to an uneven distribution of power between unelected experts, on the one hand, and ordinary citizens and elected politicians, on the other. Habermas cautioned that the technocratic “scientization of politics” effectively “severs the criteria for justifying the organization of social life from any normative regulation of interaction, thus depoliticizing them” and suppressing the need for democratic deliberation (1971, p. 112). Instead of residing in collective democratic self-determination, politics gives way to the technocratic administration of public affairs in the hands of small groups of experts, leaving ordinary citizens on the sidelines. By the same token, basing policy on expert scientific knowledge turns the elected politician into “a mere agent of a scientific intelligentsia,” leaving them with “fictitious decision-making power” (1971, pp. 62–63).

In the last few years, however, fears of technocracy have given way to worries over the opposite: the rise of right-wing populism’s vehement anti-intellectualism, the erosion of trust in scientific expertise, and the emergence of “post-truth” politics. In reaction, some contemporary political philosophers, like Jason Brennan (Reference Brennan2016), have questioned the virtues of democratic decision-making. Brennan suggests that democracies tend to produce poor policy outcomes because voters are generally uninformed, apathetic, and biased. He contends that the people have a right to competent government, arguing that we should not leave political decisions in the hands of the “ignorant, irrational, misinformed nationalists” of general elections (2016, p. 23). In making his case, Brennan elevates the role of scientists in democratic decision-making by advocating for an epistocracy, wherein more informed citizens have more power than less informed ones. Brennan even goes so far as to support the revival of literacy tests for voting.

This dynamic—wherein populism triggers a backlash to technocracy, and technocracy elicits a backlash to populism—animates current debates about the proper role of scientific expertise in democratic governance. Yet, neither extreme is ideal. Populist decision-making uncoupled from facts and evidence-based solutions is less than desirable, as is technocratic decision-making divorced from transparency and popular accountability. Given the downsides to each extreme, the relevant question remains: how can democracies use scientific expertise to create policies that promote the public interest while also ensuring that decisions are not delegated to unelected experts with no accountability?

Zeynep Pamuk’s timely and thought-provoking book, Politics and Expertise: How to Use Science in a Democratic Society (Reference Pamuk2021), grapples with this question. From the start, Pamuk acknowledges that science itself is almost always uncertain and subject to disagreement. Scientists advising policymakers are inevitably impeded by “the limits and uncertainty of scientific knowledge and disagreements over values,” so they must make both implicit and explicit judgments about which knowledge to pursue, which hypotheses to test, which models to construct, and more (2021, p. 21). She argues that these judgments should not be left to scientific experts alone, but should be open to democratic scrutiny and control. Pamuk thus argues in favor of expanding the scope of public engagement at all stages of the scientific process, even during the funding and research design stages, arguing that “the conceptual, methodological, and theoretical choices of scientists must be open to democratic scrutiny when science is used as the basis for political decisions” (2021, p. 61). She explains why, writing,

Democracy requires citizens to have an equal opportunity to influence and contest the decisions to which they are subject. But if the values and aims presupposed in scientific research shape decisions, then an important determinant of the policy process is shielded from citizen input and scrutiny… This creates an inequality in opportunities for political influence… It follows that the public must have a meaningful chance to engage with the science and examine the ends that scientists have presupposed.

(2021, pp. 50–51)

To that end, Pamuk proposes a variety of tactics to enable direct interaction between experts and nonexperts, strengthen scientific literacy among the public, and design institutions that facilitate the exercise of political judgment in scientific matters. For instance, she advocates the development of a new institution, which she dubs a “science court,” wherein a random sample of citizens is chosen to comprise a citizen jury. Scientific experts are invited to argue competing views on a scientific question before the jury, which would then question the experts, deliberate, and deliver a decision. In turn, the ruling would inform public debates and serve an advisory role in the policy-making process (2021, pp. 112–123).

Likewise, Pamuk also reimagines procedures for funding science. Rather than allow the scientific community to control the purse strings via peer review, she recommends that a certain portion of research funds should be distributed through lotteries. She also recommends earmarking funds for novel and innovative projects by early career researchers (2021, pp. 157–159). Why? Scientists themselves can too easily gatekeep and deny funding to new research projects that could potentially propose radical new approaches or challenge existing paradigms on which established scientists have built their careers.

All in all, Pamuk’s approach imagines how to design institutions to provide more opportunities for increased interactions between citizens and experts to make scientists more accountable to the lay public. Interaction breaks down the insularity and autonomy of the scientific community, which Pamuk considers to be the primary threat to democracy. In her words, “the main tensions animating the relationship between democracy and science are due to the existence and social authority of an independent, autonomous scientific community, which is free to pursue knowledge in areas of its own choosing” without allowing citizens to have a say over the direction and use of research they support with their taxes (2021, p. 23). Pamuk’s argument persuasively illustrates how any idealization of science as being unencumbered by values is an illusion, since scientific knowledge and the political context of its use are imperfect.

Democratizing science: Beyond public engagement

How does Pamuk’s analysis apply to the current landscape of genetic biomedical research? Pamuk would undoubtedly bemoan ongoing efforts to protect scientific autonomy at the expense of more substantial public engagement. Given the widespread scientific illiteracy of citizens and politicians, scientists have for decades diligently guarded against what they perceive as excessive populist intervention into the pace and direction of genetic research. The 1975 Asilomar Conference on Recombinant DNA Research is touted as a success story in scientific self-regulation. Funded by the NIH and convened by the National Academy of Sciences, the meeting brought experts together to establish consensus on how to conduct responsible recombinant DNA research, which involves cutting DNA at specified places and bringing together genetic material from multiple sources (e.g., fungi, humans, and bacteria) to produce a new genetic sequence. The attendees developed a set of principles to review all funding requests, guide research in laboratory practice, and develop containment strategies. In doing so, scientists at Asilomar positioned themselves as “custodians of the public good as well as their own sovereign territory” (Hurlbut, Reference Hurlbut2025, p. 469). Much of the scientific community applauds this historic meeting for containing controversy and building scientific consensus about how to responsibly move forward. As science and technology scholars have observed, the Asilomar Conference effectively set a precedent as a governance model for biotechnology research that has extended into the CRISPR era: if safety concerns to people and ecosystems are adequately addressed, it is assumed that regulation can reasonably be left to expert judgment among scientists themselves (Jasanoff, Reference Jasanoff2019; Parthasarathy, Reference Parthasarathy2015).

Scientists working on CRISPR technologies have largely followed in Asilomar’s footsteps. In 2015, a group of scientists met in Napa, California, to consider the implications of CRISPR for germline gene editing. Afterward, they published a statement in Science, suggesting that it would be irresponsible to go ahead with clinical research on germline editing until safety issues had been resolved. Rather than call for a formal moratorium, the statement proposed a prudent path forward for genomic engineering and germline gene modification (Baltimore et al., Reference Baltimore, Berg, Botchan, Carroll, Charo, Church, Corn, Daley, Doudna, Fenner, Greely, Jinek, Martin, Penhoet, Puck, Sterberg, Weissman and Yamamoto2015). That same year, the first International Summit on Human Gene Editing was convened in Washington, DC. Following the Summit, a lengthy report released by the National Academy of Sciences (2017) stated that it would be ethically acceptable to develop a “responsible pathway” toward germline genome interventions to treat serious, rare inherited conditions if the technology is developed to safely avoid hazards, including off-target effects and genetic mosaicism. As the report states,

Heritable germline genome-editing trials must be approached with caution, but caution does not mean they must be prohibited. If the technical challenges are overcome and potential benefits are reasonable in light of the risks, clinical trials could be initiated

(The National Academies Press, 2017, p. 189).

The scientific community thus warded off the threat of heavy-handed legislative regulation by assuring the public that they were proceeding with caution.

All in all, this influential report was broadly conceived as having given a “yellow light” to scientists working on gene editing research. This yellow light quickly created confusion. In 2018, at the Second International Summit on Human Genome Editing, Chinese biophysicist He Jiankui announced the birth of nonidentical twin girls whom he had edited as embryos, using CRISPR–Cas9, in an attempt to make them resistant to human immunodeficiency virus infection by disabling a gene called CCR5. Reeling from the shock of his announcement that the world’s first CRISPR babies had entered the world, prominent scientists from around the globe immediately condemned He Jiankui’s experiment as immoral, medically unnecessary, and scientifically premature (Greely, Reference Greely2021). In 2019, a Chinese court found that He Jiankui and two collaborators had forged ethical review documents and misled doctors into unknowingly implanting gene-edited embryos. He was fined three million yuan and sentenced to three years in prison after being convicted of illegal medical practice.

Despite the global outcry, scientists have largely portrayed He Jiankui as a rogue actor guilty of botching his experiment and doing “bad science.” Scientists do not all think alike or speak in one voice, but most high-profile scientists have doubled down on their commitment to carefully and cautiously march ahead. The general sentiment within the scientific community has remained fairly consistent: basic research should continue to be conducted to work out the kinks (including off-target effects and genetic mosaicism) and pave a translational pathway forward toward clinical research. The debate within the scientific community continues to be framed as one over how, not whether, to move forward. Indeed, now that He Jiankui has opened Pandora’s box, a great deal of rhetoric within the scientific community has emphasized the inevitability of human germline gene editing, suggesting the question of whether we should move forward with germline editing is no longer on the table. The world has stepped over the threshold of germline manipulation, so scientists are basically telling the public that we had better get with the program and trust them to responsibly guide us forward.

To return to Pamuk’s point, the He Jiankui scandal illustrates why decision-making about the research design, funding, and regulation of germline genetic technologies in human subjects should not be the exclusive purview of a select few scientists. Scientists themselves are increasingly recognizing the point, as evidenced by a 2020 report issued by the International Commission on the Clinical Use of Human Germline Genome Editing. This Commission, which was established in the wake of He Jiankui’s experiment, has called for “widespread societal engagement and approval” for the sake of “transparency and accountability” before clinical use of embryonic gene editing proceeds further (The National Academies Press, 2020, p. 150). Suggesting that it was beyond their mandate to conduct this societal discourse themselves, or even provide guidelines for how to best do so, the commission deferred to social scientists and bioethicists to recommend “how to undertake social engagement, how to engage diverse views, and how to support and sustain such efforts at national and international levels” (p. 150).

Social scientists and bioethicists have risen to the challenge. Some scholars have supported initiatives to recruit a more diverse array of stakeholders to attend international summits and conferences on CRISPR technology (Baylis, Reference Baylis2019). Others have proposed convening an international Citizens’ Assembly—comprising randomly selected lay participants—to write a report on global principles for the regulation of genome editing technologies (Dryzek et al., Reference Dryzek, Nicol, Niemeyer, Pemberton, Curato, Bachtiger, Batterham, Bedsted, Burall, Burgess, Burgio, Castelfranchi, Chneiweiss, Church, Rossley, De Vries, Farooque, Hammond, He, Mendonca, Merchant, Middleton, Rasko, Van Hoyweghen and Vergne2020). Additionally, in the spirit of Pamuk’s book, some scholars have even advocated that the International Commission’s next report use a Citizens’ Jury—consisting of members of the general public chosen by random selection—to inform agenda-setting, resource allocation, and governance decisions on human germline gene editing (Sheinerman, Reference Sheinerman2023). Many social scientists and bioethicists concerned with the social and ethical implications of germline gene editing technologies share Pamuk’s commitment to increasing global public engagement in the research and development process and with respect to matters involving regulation and governance.

All in all, efforts to broaden the debate and strengthen public engagement are commendable. The human genome belongs to us all, so decisions about CRISPR research and its applications should be made with the full and direct input of all affected by the decisions. The issue is a matter of broad public concern, since the long-term ramifications for safety, social justice, and the evolution of the species are profound and irreversible. Closed decision-making is undemocratic, and public distrust in science grows when people feel cut out of the conversation.

Yet, Pamuk nevertheless offers an incomplete explanation of the tensions between science and democracy. Recall Pamuk’s contention that “the main tensions animating the relationship between democracy and science are due to the existence and social authority of an independent, autonomous scientific community” (2021, p. 23). While the insular nature of science should concern democratic theorists, the threats wrought by the commercialization of academic research also militate against the prospect of using scientific expertise to create policies that promote the public interest. Put differently, Pamuk’s account of the undemocratic nature of science is inadequate without thoughtful consideration of the economic and commercial forces that structure interactions between science and democracy. For example, Pamuk focuses on how funding decisions for science should be made without devoting attention to the major role that corporations currently play in funding science programs on college campuses. This omission is puzzling given that corporations have considerable power to set academic research agendas.

The problem, therefore, is not just that democratic citizens are beholden to the scientific community’s independent decisions about research priorities and regulations. Rather, the problem is that the academic scientific community is now beholden to corporate agendas. The vast majority of leading biologists in US universities are affiliated with biotechnology companies through university-corporate contracts, seats on corporate advisory boards, consulting jobs, or personal equity stakes in one or more biotech firms. The sway of commercialization on academic science cannot be overstated. As the following section shows, the drive for research funding and personal profit has blurred boundaries between universities and corporations, turned scientists into entrepreneurs, and contaminated the integrity of science advising in democratic decision-making. The broader implication is that it is not enough to promote broader public engagement in science. Policy reform of the academic-industrial complex is sorely needed as well.

CRISPR and the academic-industrial complex

Scientific research is heavily influenced by commercial interests, even when it is conducted by university scientists on college campuses. For instance, agribusiness and food corporations often provide large sums of funding for agricultural research at land-grant universities (Rudy et al., 2007). It is also common practice for universities to accept funds from fossil fuel companies for energy research (Thacker, 2022). As has been well-documented, biomedical research has been particularly prone to corporate influence (Cobb, 2022; Kenney, 1986; Krimsky, 1991; Stevens & Newman, 2019; Yi, 2015). Biochemist and Nobel laureate Paul Berg describes the current state of affairs, stating,

When we spliced the profit gene into academic culture, we created a new organism – the recombinant university. We reprogrammed the incentives that guide science. The rule in academia used to be “publish or perish.” Now bioscientists have an alternative – “patent and profit.” (quoted in Abate, 2001)

Who spliced the “profit gene” into academic culture? Decades ago, science and business operated in relatively separate spheres. Basic research was conducted by universities, and for-profit companies commercialized its applications downstream in the development process, with academic knowledge transferred to industry indirectly through publications, conferences, and consulting. By and large, businesses did not engage in much basic science, and university scientists did not do much business. Moreover, university scientists generally avoided patenting inventions or were even hostile toward the idea, worrying that it was an ignoble pursuit that conflicted with higher education’s mission to freely disseminate knowledge as stewards of truth. In a famous example, Jonas Salk, who pioneered a polio vaccine at the University of Pittsburgh in the 1950s, found it unfathomable to patent and license his invention. When a news anchor asked Salk who owned the patent, he replied, “Well, the people, I would say. There is no patent. Could you patent the sun?” (quoted in Oshinsky, 2005, p. 211). Salk strongly believed that academia should remain an ivory tower secluded from the corrupting influence of corporate money.

Academic research in the biosciences was transformed by several political developments in the 1980s. The first was Diamond v. Chakrabarty (1980), a US Supreme Court decision that approved the patenting of a genetically modified bacterium capable of breaking down crude oil. As a postdoctoral researcher at the University of Illinois, Ananda Chakrabarty observed how some strains of bacteria could degrade hydrocarbons. Later, when he was an employee at General Electric, Chakrabarty discovered a method for genetic cross-linking that enabled him to produce two microbial strains that could digest crude oil after spills. Chakrabarty filed patents for the process and products. The Patent Office rejected his claims. After several rounds of appeal, the case made its way to the Supreme Court. Before this ruling, living things were not considered patentable. After the ruling, a “live, human-made micro-organism” was deemed patentable subject matter (see footnote 3). The decision laid the foundation for the modern biotechnology industry.

In addition, the Bayh-Dole Act (1980) enabled universities, nonprofit research institutions, and small businesses to own, patent, and commercialize inventions developed under federally funded research programs within their organizations. This legislation incentivized universities to become actively involved in the transfer of technology from lab to market. A university holding the patent to a campus discovery could keep a portion of the royalties and share a portion with the individual faculty inventors. Typically, university innovations are then licensed to private companies, which are often (although not exclusively) startups formed by faculty members. In effect, the Bayh-Dole Act enabled universities to generate revenue from government-sponsored research. Universities instituted technology transfer offices to handle the patenting and licensing process on their respective campuses. The results are striking. Patents awarded to US academic institutions quadrupled between 1981 and 2020, with a large number granted to inventions in the biosciences (pharmaceuticals) and defense (weapons manufacturing). Bayh-Dole also spurred significant growth of startup companies, with more than 11,000 startups emerging out of university research between 1995 and 2017 (Abdulla & Corrigan, 2022).

At the same time that court decisions and federal policies incentivized academic patenting and university-industry partnerships, state and federal government expenditures on research and development began to flatline or even shrink. These trends are not coincidental or independent of one another. The share of state budgets going to higher education nationally has shrunk by more than one-third since 1980 (Pilaar, 2020). The federal portion of funding for university research has also gradually waned over time. In fact, the federal government no longer funds a majority of the basic research carried out in the United States. The federal share of funding for basic research fell below 50% for the first time in 2013 and has continued to decline. In the 1960s, the federal government outspent industry on basic research by a two-to-one margin. The balance tipped in the 1980s, and the situation has since reversed: roughly two-thirds of basic research is now funded by private money, with the bulk coming from pharmaceutical companies (Mervis, 2017). Faced with decreases in government funding, cash-strapped schools have eagerly welcomed revenue generated through royalties as well as corporate research contracts.

Not only has scientific commercialization created revenue for universities but some individual faculty members have struck gold as well. Most biomedical scientists at research institutions supplement their faculty salaries by working through consulting contracts, honoraria, royalties, and equity holdings in startups. As Sheila Jasanoff (2019) explains, by the late 1990s, “it was the stock-holding biologist, at ease with business dealings, who was in the driver’s seat, and it was the scientist detached from the lure of the market who had become the endangered species” (p. 31). Close relationships with biotechnology companies can create serious financial conflicts of interest. Such conflicts occur when there is a possibility that an individual’s private financial interests may influence their professional actions, decisions, or judgment in pursuing, conducting, or reporting research (Contreras & Rinehart, 2019).

In the case of CRISPR gene editing technologies, things have played out in the way that Bayh-Dole intended by ushering in a so-called “genetic gold rush.” CRISPR has transformed from a research tool into an object of private investment and commercial returns. As of 2022, the US Patent Office was dealing with roughly 11,000 families of patents on CRISPR-related technology, with universities holding the majority of patents issued to date (Contreras & Sherkow, 2017; Ledford, 2022). A high-profile and publicly contentious patent dispute over the original CRISPR-Cas9 process—between the University of California, Berkeley (UC Berkeley) and the Broad Institute of MIT and Harvard—has dominated the patent landscape. In 2012, Jennifer Doudna, Emmanuelle Charpentier, and their colleagues outlined how CRISPR-Cas9 could be used to precisely isolate and cut DNA. In 2013, Feng Zhang and his colleagues showed how it could be adapted to edit DNA in eukaryotic cells in plants and humans. Doudna and Charpentier first filed a patent application for the technology in May 2012, but Zhang filed an expedited patent application for CRISPR 7 months later, and he was the first to receive the patent in 2014, leading to an ongoing legal battle over who should own the rights to the technology. The principal issue concerns whether Doudna and Charpentier were the first to invent CRISPR-Cas9 or whether Zhang’s team had made significant enough technological modifications to warrant a separate patent. As of 2022, the Broad Institute appears victorious based on a ruling from an arm of the US Patent Office, but the University of California has appealed the decision. Scientists and the elite academic institutions that employ them will undoubtedly continue to pay armies of lawyers to represent their intellectual property rights until they have exhausted all of their options because the stakes are high.
Whoever owns the commercial intellectual property rights on CRISPR-Cas9 stands to make a ton of money—somewhere in the ballpark of hundreds of millions (if not billions) in royalties (Paradise, 2023). In addition, the patent holder will exert significant control over who gets to use CRISPR and how through its licensing agreements, which effectively puts patent holders in a position to govern and regulate the technology via constraints attached to licenses (Feeney, Cockbain, & Sterckx, 2021).

Despite the ongoing legal battles over patents and licensing, biotechnology startups using CRISPR-Cas9 have launched, and university scientists have been at the center of the action (Brinegar et al., 2017). Jennifer Doudna, at UC Berkeley, co-founded Caribou Biosciences. Emmanuelle Charpentier, then at the University of Vienna, co-founded CRISPR Therapeutics. Feng Zhang, at MIT, co-founded Editas Medicine. Venture capitalists quickly jumped on board. In 2016, CRISPR Therapeutics and Editas Medicine went public. By 2021, CRISPR Therapeutics alone was valued at $13.8 billion. Companies working with CRISPR-Cas9 have taken out licenses with either the University of California or the Broad Institute, and this decision could ultimately be “a ‘monster loss’ for companies that bet on the wrong horse” (Cohen, 2022). Regardless of the end result, the takeaway point is that many university scientists working with CRISPR have a lot of money riding on the outcome of their research.

The commercialization of CRISPR technologies also demonstrates the increasing role of academic institutions in business ventures. Not only do many individual faculty members now have personal financial stakes in their research, but some universities are even using their endowments to invest in their professors’ startup ventures. For example, the University of California created UC Ventures, a fund to pursue investments in home-grown startups that developed out of research-fueled enterprises across the system’s campuses. The NYU Innovation Venture Fund, the University of Texas System’s Horizon Fund, the University of Cincinnati’s Lab2Market Program, and the UChicago Startup Investment Program are additional examples of initiatives specifically designed to invest endowment capital in startups led by university faculty. These types of arrangements can create institutional financial conflicts of interest. Such conflicts occur when an institution has a financial interest that may impact or appear to impact research activities on campus (Contreras & Rinehart, 2019).

When improperly managed, these types of institutional conflicts can have dire consequences. The tragic death of Jesse Gelsinger, a participant in a gene therapy trial at the University of Pennsylvania School of Medicine, is a cautionary tale in this regard. Both James Wilson, the individual faculty member who served as principal investigator for the gene transfer trial, and the University of Pennsylvania had financial conflicts of interest as they held patents on processes used in the clinical trial. Wilson was a founder of the biotechnology company Genovo (whose product was being tested), and he held stock with an estimated worth of around $30 million in the company. The university had also entered into a sponsored research agreement with Genovo, which provided Penn with $21 million in research funding over 5 years in exchange for the exclusive right to license any technologies developed out of that research. Jesse Gelsinger agreed to participate in a drug trial to treat his genetic liver disease even though he had been managing his condition with a controlled diet. After his liver was injected with an adenoviral vector carrying a corrected gene for his genetic disease, Jesse suffered a massive immune response, leading to organ failure and death. Afterward, the Gelsinger family sued Wilson and the university, alleging that Jesse was not adequately informed of the financial interests of the principal investigator and the institution when he volunteered (Kim, 2017).

In sum, financial conflicts of interest raise difficult questions for democratic theorists concerned with bioethics, science policy, and governance of new technologies. What happens to scientists’ credibility as advisors when they have a financial stake in the outcome of their research? Can leading scientists researching CRISPR technologies offer disinterested advice in the public interest—on matters as important as whether or not we should permit germline gene editing to alter the human gene pool—when university research is so intertwined with profit-driven companies, which are accountable to their shareholders rather than society at large? What happens when the universities themselves are the shareholders?

Decades ago, science and technology scholar Sheldon Krimsky (2003) anticipated that the commercialization of academic science would have a corrosive effect on scientists’ capacities to offer sound policy advice as reliable, independent, disinterested parties. As Krimsky cautioned,

The rapid growth of entrepreneurship in universities has resulted in an unprecedented rise in conflicts of interest, specifically in areas sensitive to public concern. Conflicts of interest among scientists has been linked to research bias as well as the loss of a socially valuable ethical norm—disinterestedness—among academic researchers.

(2003, p. 7)

The scientific community may assume that the state of mind of the scientist is not prone to the same influences that are known to corrupt the behavior of elected officials or journalists, but even people with the highest degree of integrity can be influenced by money, and scientists are not immune. Research on the “funding effect” consistently finds that the results of commercially funded research are more likely to be biased toward the interests of the corporate sponsor (Ahn et al., 2017; Fabbri et al., 2018; Lundh et al., 2017). Bias often manifests in subconscious ways, and it does not necessarily involve intentional misconduct, such as blatantly falsifying data. Money can directly and indirectly influence the selection of research problems, experimental design, and data analysis and interpretation in small ways that tip the results in favor of the sponsor.

Krimsky concedes that the ethical norm of disinterestedness is always aspirational, since perfect neutrality and objectivity are always out of reach. We should expect university scientists to be passionate about their work, because many see their choice of profession as a calling. In addition, the desire for professional advancement, the drive for public recognition and awards, and the motivation to win the race to scientific discovery are inextricably part of the enterprise of science. Krimsky is untroubled by these intrinsic interests since they cannot be distilled out of the practice of science. However, financial interests, which are external to the practice of science, are a different matter altogether, making it possible for the norm of disinterestedness to be supplanted by the reality of having multi-vested interests, wherein a single faculty member may also be a corporate consultant, a patent holder, and the founder of a startup company (Krimsky, 2006).

How might Krimsky’s insights contribute to ongoing debates about the meaning and value of disinterestedness in science? While some scholars see disinterestedness as an unrealistic and outdated norm that should be discarded altogether, others argue that we ought to modify its meaning rather than dismiss its value outright (Anderson et al., 2010; Djørup & Kappel, 2013; Macfarlane, 2023). The distinction between intrinsic and extrinsic interests encourages us to adopt the latter approach. By acknowledging the reality that all knowledge production is inevitably and unavoidably affected by intrinsic interests, the norm of disinterestedness does not require individual scientists to be perfectly neutral, objective, indifferent, or even detached from their work. Scientists can invest in what they are doing. Also, even though they ought to try to reflect on their own personal biases to keep them in check, the reality is that nobody (not even an enlightened academic) can attain a universal standpoint that transcends the particularities of their socially located perspective. Pure objectivity is an illusion. Nevertheless, the threats wrought by external interests, which affect the practice of science on a structural level, suggest that the norm of disinterestedness still has political purchase when applied to scientific environments and communities. Understood this way, disinterestedness serves as a useful aspirational norm to guide science as an institution. It places expectations on academia to develop policies and practices that encourage its members to remain primarily motivated to conduct research for the sake of knowledge and discovery, rather than financial gain and riches. In this way, the norm of disinterestedness helps to safeguard the reliability and integrity of science as an institution.

The need to protect and promote the reliability and integrity of science is more important than ever. As recent polling data indicates, public trust in science is rapidly declining. In 2022, only 29% of Americans reported having a great deal of confidence in medical scientists to act in the interests of the public (Kennedy, Tyson, & Funk, 2022). Although many factors contribute to this trend, high-profile corruption cases damage credibility. For instance, in 2018, the New York Times revealed that the chief medical officer of Memorial Sloan Kettering Cancer Center, José Baselga, failed to disclose millions of dollars of industry funding in research published in prestigious academic journals, including one at which he served as editor-in-chief (Ornstein & Thomas, 2018). It was later discovered that Baselga framed the results of a Roche-sponsored clinical trial as positive when presenting research at a conference (even though the other scientists on the panel found the results disappointing) without disclosing that Roche had paid him $3 million in consulting fees (Johnson & Brumbaugh, 2022). Beyond examples of blatant misconduct, the intermingling of academia with industry often gives the appearance of impropriety, and this impression alone corrodes public trust. Why trust the scientific community when it looks like science is for sale and scientific experts are hired hands?

Recommendations for reform

In this moment of political uncertainty about how to govern CRISPR research, it is crucial that people are able to trust and value scientists’ direct input into the debate. However, the corporate grip on academic research has compromised the integrity of science advising and fueled public distrust. What should we do about it? Before offering recommendations for reform, I concede that the trend toward academia-industry collaboration seems irreversible in the neoliberal political climate. Academic science and industry have become so deeply interdependent that it is no longer a realistic option to disentangle them altogether, especially in biomedical research. Besides, collaboration often has many benefits (Boccanfuso, 2010; Ollif & Patel, 2025; Savage, 2017). Cooperation can be mutually beneficial insofar as industry sponsorships provide funding for graduate students, research assistants, and lab equipment, while corporations benefit from access to experts and laboratory facilities in turn. Joint research and development projects can also shorten the interval between discovery and commercialization, thereby accelerating the arrival of new drugs and medical advances. Moreover, workforce pipelines often develop from close collaboration, allowing firms to recruit qualified students for jobs while providing students with a clear path toward employment after graduation. The successes of Silicon Valley and the Research Triangle exemplify the upsides of corporate-university collaboration. Even though divorce may be unrealistic and imprudent, a healthier marriage with clearer boundaries is within reach. The relevant question is: how can democracies effectively protect academic science from the dangers of egregious commercialization and corporate meddling without eliminating private money from academia altogether?

Current mechanisms to keep financial conflicts of interest at bay are a good start, but they are not getting the job done. By and large, the federal government has set some basic parameters for conflict of interest (COI) policies for principal investigators, but it has left universities to work out the specifics of management, compliance, and enforcement themselves. Typically, individual research faculty are supposed to disclose financial COIs (such as gifts, grant awards, consulting fees, stock holdings, and so on) to their employers. Next, a review committee decides whether to prohibit or permit such conflicts. In some cases, the committee will prescribe additional measures following approval, such as creating a management plan using independent monitors on a project, mandating public disclosure at speaking engagements, and so on. In addition, individual researchers are usually required to disclose financial COIs to peer-reviewed journals when they submit research for publication and to professional associations when they are involved in the development of standards and guidelines.

In the absence of standardized rules, policies and procedures vary widely from campus to campus and journal to journal, and many rules are weakly and inconsistently enforced. Competitive pressures have also created a dangerous race to the bottom in standards. Research universities benefit from the influx of corporate money, so they have little incentive to tighten regulations or conduct thorough investigations of misconduct allegations for fear of jeopardizing millions in revenue. Leaving universities in charge of managing their professors’ COIs essentially amounts to “leaving the fox to guard the henhouse” (Washburn, 2005, p. 234).

In addition, not all universities have developed policies for dealing with their institutional COIs. Why? No federal policies require them to do so. Universities must develop policies to manage individual researchers’ conflicts to be eligible for government grants, but no comparable requirement exists for institutions. The result is that only about a quarter of the top 100 US academic research institutions (ranked by total research funding) actually have institutional COI policies in place (Resnik et al., 2016). University leadership is reluctant to scrutinize various corporate gifts, stock investments, or corporate contracts that bring considerable money and prestige to their institution. This means that an outside entity—such as a government agency or perhaps even accreditation organizations—may need to force their hand.

Even if COI policies were standardized and better enforced, disclosure does little to resolve or eliminate the underlying conflict requiring transparency in the first place. What else can be done? As Daniel S. Greenberg (2001) contends, “No single disinfectant can cope with the contamination of academic scientific integrity,” so a multifaceted regulatory framework is needed to minimize COIs, protect academic disinterestedness, and restore public trust in the integrity of scientific information (p. 473). The following proposals are by no means exhaustive, but they gesture at the development of a multilevel research integrity nexus, which moves beyond a focus on individual misconduct to address broader structural problems in the funding and organization of academic science.

First, investing more state and federal taxpayer dollars into biomedical research could curb excessive reliance on corporate funding in the first place. All money, regardless of the source, has the potential to corrupt the integrity of science. Researchers may be tempted to “torture the data until it confesses,” so to speak, whether one seeks funding from the government, a private company, or a philanthropic foundation. Nevertheless, important distinctions can be drawn between the kind of influence government and corporations tend to wield. For one thing, state and federal research funding is determined through a democratic decision-making process, while corporate funding is dictated by what is best for the company’s bottom line. This means that government financing is more likely to encourage basic research and fund development projects that better serve the public good. By contrast, corporations—which are motivated by profit—are more likely to fund research and product development for common chronic conditions (with a large consumer base) for people in the developed world who are wealthy enough to pay for their products (Yegros-Yegros et al., 2020). Hence, the federal government is uniquely suited to fund basic research and/or research projects that market incentives tend to ignore (Mandt, Seetharam, & Cheng, 2020). The public benefit of basic research is unpredictable, indirect, and long term—there is never any guarantee that blue sky research will lead to impactful breakthroughs with “real-world” applications. However, no scientific advancement ever happened without it. After all, basic research on bacterial immunity to viruses eventually paved the way to using CRISPR as a gene editing tool.

Second, legal requirements to make funding sources and endowment investments public should be implemented and enforced. Corporations can disguise their research funding as tax-deductible donations to university foundations. Although some states require the donations to be a matter of public record, many states do not. A lack of transparency makes it nearly impossible to know exactly how much corporate money is flowing into university research in the United States, even at public institutions (McClusky, 2017; Morgan, 2023). University endowment investment portfolios are also not a matter of public record in most states, so it is hard to know whether a university conducting human subjects research has an institutional COI with a biotechnology company. To track funding sources, the University of Michigan maintains a public database of how its research is funded, and other universities could be required to follow suit. Similarly, the University of Texas system is required under state law to publish its holdings every year. A federal law could require all universities to do the same.

Third, another way to curb financial COIs is to place limits on an academic institution’s and/or an individual professor’s equity stakes. A few universities have voluntarily adopted this approach by capping equity holdings in companies sponsoring their research. For example, Washington University in St. Louis has an investment policy stipulating that the university endowment’s equity ownership cannot exceed 5% of shares in a faculty startup. Most universities, however, do not have policies imposing equity caps on individual faculty members. More commonly, individual faculty members are required to disclose and manage their financial COIs by putting their stock in a blind escrow, for example. While equity caps may not be uniformly appropriate in all cases, they may be a desirable tool to use in research involving human subjects and clinical trials since the stakes of corruption are so high.

Fourth, legal contracts between universities and corporations should be structured to protect academic research from corporate meddling. In too many agreements, corporate sponsors are authorized to directly influence research design and have a final say over research questions, materials, and methodologies to be used in exchange for their funding. In addition, some contracts contain confidentiality agreements with restrictive clauses on publication, even allowing corporate sponsors to delay publications or bury results that are unfavorable to their products (McClusky, 2017). These types of arrangements may be appropriate for industry scientists on corporate payrolls, but they are unacceptable for academic scientists. Restrictions on publication repress academic freedom and violate sacrosanct norms of openness and information sharing. If academics cannot freely pursue and publish science, wherever it might lead, the soul of higher education is compromised.

Finally, individual faculty members should disclose financial COIs in detail. One way to do this more efficiently would be to have the World Health Organization fund and maintain a publicly available registry of COIs for individual researchers (Christian, 2022). Better disclosure practices are needed in the political arena as well. When faculty members testify before Congress, witness disclosure forms require that public funding be disclosed, but corporate funding is exempt. Similar exemptions often exist for academic scientists serving on scientific advisory committees. Under the Federal Advisory Committee Act (1972), Congress established the categories of “special government employee” and “representatives” and made the COI rules for such employees less restrictive than for regular federal government employees in order to overcome obstacles in hiring outside experts. These loopholes must be closed. An independent and impartial advisory committee system is essential to ensuring that elected officials have access to diverse viewpoints from disinterested experts who can inform evidence-based policymaking.

Conclusion

The genetic revolution is moving at breakneck speed. Over the last decade, new CRISPR–Cas9 gene editing tools have further accelerated the pace of discovery. In the current moment, democracies like the United States face tough choices about whether scientists should be allowed to modify the human germline in basic research and/or clinical applications (and, if so, with what limits on research and/or medical practice). Collective anxiety about these technologies and their potential uses is high. Enthusiasts are hopeful that gene editing can be a means to bring about any number of positive changes, including the eradication of genetic diseases and the enhancement of human intellectual and physical capacities. Critics fear that various negative repercussions could follow if scientists do not shut Pandora’s box on germline gene editing as quickly as possible. Among these repercussions is the possibility that social inequality will worsen following the creation of a genetic overclass of enhanced “haves” and an underclass of unenhanced “have-nots.” Hence, while some see CRISPR as a tool of salvation, others see it as a tool of destruction. In any case, although scientists should not monopolize the debate, they nevertheless have a vital role to play in it. As I hope to have shown, this role has been compromised by the growing commercialization of university-based CRISPR research, which has created powerful incentives for academic scientists to act as entrepreneurs rather than disinterested advisors. If we expect university scientists to put their best foot forward and advise us on the potential risks and rewards of these technologies as disinterested parties concerned with the public interest, structural political changes in the funding and organization of academic science need to be made.

Footnotes

1 CRISPR–Cas9 was developed from a bacterial immune defense against viruses. It is a two-component system in which guide RNA molecules (derived from the CRISPR part, which stands for Clustered Regularly Interspaced Short Palindromic Repeats) direct a DNA-cutting enzyme (the Cas9 part) to specific DNA sequences. Jennifer Doudna, Emmanuelle Charpentier, and their colleagues published the first article showing how CRISPR could be used as a tool for gene editing in Science in 2012. See Jinek et al. (2012).

2 From Plato’s perspective, democracy was a dangerous system in that it permitted the unenlightened masses, rather than the wisest and most knowledgeable citizens, to make decisions. Speaking to Glaucon in Book V, Socrates declared, “Unless the philosophers rule as kings or those now called kings and chiefs adequately philosophize, and political power and philosophy coincide in the same place… there is no rest from ills for the cities” (1991 [375 BC], 473c–d). Socrates thus tasked moral experts, the so-called “philosopher-kings,” with managing Kallipolis.

3 The Supreme Court’s distinction between human-made and naturally occurring living organisms was clarified in Association for Molecular Pathology v. Myriad Genetics, Inc. (2013). Myriad argued that once a gene (like BRCA1) had been isolated, it could be patented; by patenting the BRCA genes, Myriad held exclusive control over diagnostic testing. The Supreme Court ruled that naturally occurring gene sequences are not patent eligible under the Patent Act; only modified genes created as new products in a lab are patentable.

References

Abate, T. (2001). Scientists’ ‘publish or perish’ credo now ‘patent and profit’. San Francisco Chronicle. https://www.sfgate.com/business/article/Scientists-publish-or-perish-credo-now-patent-2891077.php
Abdulla, S., & Corrigan, J. (2022). Bayh-Dole patent trends: Charting developments in government-funded intellectual property through time. Center for Security and Emerging Technology. https://cset.georgetown.edu/publication/bayh-dole-patent-trends/
Ahn, R., Woodbridge, A., Abraham, A., Saba, S., Kornstein, D., Madden, E., Boscardin, J., & Keyhani, S. (2017). Financial ties of principal investigators and randomized controlled trial outcomes: Cross sectional study. BMJ, 356, 1–9.
Anderson, M., Ronning, E. A., DeVries, R., & Martinson, B. C. (2010). Extending the Mertonian norms: Scientists’ subscription to research norms. The Journal of Higher Education, 81(3), 366–393. https://doi.org/10.1080/00221546.2010.11779057
Baltimore, D., Berg, P., Botchan, M., Carroll, D., Charo, R. A., Church, G., Corn, J. E., Daley, G. Q., Doudna, J. A., Fenner, M., Greely, H. T., Jinek, M., Martin, G. S., Penhoet, E., Puck, J., Sternberg, S. H., Weissman, J. S., & Yamamoto, K. R. (2015). A prudent path forward for germline engineering and germline gene modification. Science, 348(6230), 36–38. https://doi.org/10.1126/science.aab1028
Baylis, F. (2019). Altered inheritance: CRISPR and the ethics of human genome editing. Harvard University Press. https://doi.org/10.4159/9780674241954
Boccanfuso, A. (2010). Why university-industry partnerships matter. Science Translational Medicine, 2(51), 1–4. https://doi.org/10.1126/scitranslmed.3001066
Brennan, J. (2016). Against democracy. Princeton University Press.
Brinegar, K., Yetisen, A. K., Choi, S., Vallillo, E., Ruiz-Esparza, G. U., Prabhakar, A. M., Khademhosseini, A., & Yun, S.-H. (2017). The commercialization of genome-editing technologies. Critical Reviews in Biotechnology, 37(7), 924–932. https://doi.org/10.1080/07388551.2016.1271768
Brown, M. (2009). Science in democracy: Expertise, institutions, and representation. MIT Press. https://doi.org/10.7551/mitpress/9780262013246.001.0001
Bush, V. (1945). Science: The endless frontier. National Science Foundation. https://nsf-gov-resources.nsf.gov/2023-04/EndlessFrontier75th_w.pdf
Christian, A. (2022). Addressing conflicts of interest and conflicts of commitment in public advocacy and policy making on CRISPR/Cas-based human genome editing. Frontiers in Research Metrics and Analytics, 7, 1–16. https://doi.org/10.3389/frma.2022.775336
Cobb, M. (2022). As gods: A moral history of the genetic age. Basic Books.
Cohen, J. (2022). New CRISPR patent hearing continues high stakes legal battle. Science Insider. https://www.science.org/content/article/new-crispr-patent-hearing-continues-high-stakes-legal-battle
Contreras, J., & Rinehart, M. (2019). Conflicts of interest and academic research. In Rooksby, J. (Ed.), Research handbook on intellectual property and technology transfer (pp. 143–165). Edward Elgar Publishing.
Contreras, J., & Sherkow, J. (2017). CRISPR, surrogate licensing, and scientific discovery. Science, 355(6326), 698–700. https://doi.org/10.1126/science.aal4222
Dewey, J. (1910). Science as subject-matter and as method. Science, 31(787), 121–127. https://doi.org/10.1126/science.31.787.121
Dewey, J. (1927). The public and its problems. Swallow Press.
Djørup, S., & Kappel, K. (2013). The norm of disinterestedness in science; a restorative analysis. De Gruyter, 14(2), 153–175.
Douglas, H. (2021). The role of scientific expertise in democracy. In Hannon, M. & de Ridder, J. (Eds.), The Routledge handbook of political epistemology (pp. 435–445). Routledge. https://doi.org/10.4324/9780429326769-52
Dryzek, J. S., Nicol, D., Niemeyer, S., Pemberton, S., Curato, N., Bächtiger, A., Batterham, P., Bedsted, B., Burall, S., Burgess, M., Burgio, G., Castelfranchi, Y., Chneiweiss, H., Church, G., Rossley, M. C., De Vries, J., Farooque, M., Hammond, M., He, B., Mendonça, R., Merchant, J., Middleton, A., Rasko, J. E. J., Van Hoyweghen, I., & Vergne, A. (2020). Global citizen deliberation on genome editing. Science, 369(6510), 1435–1437. https://doi.org/10.1126/science.abb5931
Fabbri, A., Lai, A., Grundy, Q., & Bero, L. A. (2018). The influence of industry sponsorship on the research agenda: A scoping review. American Journal of Public Health, 108(11), e9–e16. https://doi.org/10.2105/AJPH.2018.304677
Feeney, O., Cockbain, J., & Sterckx, S. (2021). Ethics, patents, and genome editing: A critical assessment of the three options of technology governance. Frontiers in Political Science, 3, 1–10. https://doi.org/10.3389/fpos.2021.731505
Greely, H. (2021). CRISPR people: The science and ethics of editing humans. MIT Press. https://doi.org/10.7551/mitpress/13492.001.0001
Greenberg, D. (2001). Science, money, and politics: Political triumph and ethical erosion. The University of Chicago Press.
Habermas, J. (1971). Toward a rational society. Heinemann.
Hurlbut, B. J. (2025). Taking responsibility: Asilomar and its legacy. Science, 387(6733), 468–472. https://doi.org/10.1126/science.adv3132
Isaacson, W. (2020). This year’s Nobel prize in chemistry honors a revolution. New York Times. https://www.nytimes.com/2020/10/07/opinion/nobel-prize-chemistry-2020-doudna-charpentier.html
Jasanoff, S. (1990). The fifth branch: Science advisors as policymakers. Harvard University Press.
Jasanoff, S. (2019). Can science make sense of life? Polity Press.
Jinek, M., et al. (2012). A programmable dual-RNA-guided DNA endonuclease in adaptive bacterial immunity. Science, 337(6096), 816–821. https://doi.org/10.1126/science.1225829
Johnson, J., & Brumbaugh, B. (2022). Conflict of interest in biomedical research and clinical practice. The Hastings Center. https://www.thehastingscenter.org/briefingbook/conflict-of-interest-in-biomedical-research/
Kennedy, B., Tyson, A., & Funk, C. (2022). Americans’ trust in science, other groups declines. Pew Research Center. https://www.pewresearch.org/wp-content/uploads/sites/20/2022/02/PS_2022.02.15_trust-declines_REPORT.pdf
Kenney, M. (1986). Biotechnology: The university-industrial complex. Yale University Press.
Kim, J. (2017). Legislative issues in disclosing financial conflicts of interest to participants in biomedical research: Effectiveness and methodology. Journal of Korean Medical Science, 32(12), 1910–1916. https://doi.org/10.3346/jkms.2017.32.12.1910
Krimsky, S. (1991). Biotechnics and society: The rise of industrial genetics. Rowman and Littlefield.
Krimsky, S. (2003). Science in the private interest: Has the lure of profits corrupted biomedical research? Rowman and Littlefield.
Krimsky, S. (2006). Autonomy, disinterest, and entrepreneurial science. Society, 43(4), 22–29. https://doi.org/10.1007/BF02687531
Ledford, H. (2022). Major CRISPR patent decision won’t end tangled dispute. Nature, 603, 373–374. https://doi.org/10.1038/d41586-022-00629-y
Lundh, A., Lexchin, J., Mintzes, B., Schroll, J. B., & Bero, L. (2017). Industry sponsorship and research outcome. Cochrane Database of Systematic Reviews, 2(8), 1–140.
Macfarlane, B. (2023). The DECAY of Merton’s scientific norms and the new academic ethos. Oxford Review of Education, 50(4), 1–16.
Mandt, R., Seetharam, K., & Cheng, C. (2020). Federal R&D funding: The bedrock of national innovation. MIT Science Policy Review, 1, 44–54. https://doi.org/10.38105/spr.n463z4t1u8
Matthews, K., & Morali, D. (2022). Can we do that here? An analysis of US federal and state policies guiding human embryo and embryoid research. Journal of Law and the Biosciences, 9(1), 1–24. https://doi.org/10.1093/jlb/lsac014
McClusky, M. (2017). Public universities get an education in private industry. The Atlantic. https://www.theatlantic.com/education/archive/2017/04/public-universities-get-an-education-in-private-industry/521379/
Mervis, J. (2017). U.S. government share of basic research funding falls below 50%. Science Insider. https://www.science.org/content/article/data-check-us-government-share-basic-research-funding-falls-below-50
Moore, A. (2016). Deliberative elitism: Distributed deliberation and the organization of epistemic inequality. Critical Policy Studies, 10(2), 191–208. https://doi.org/10.1080/19460171.2016.1165126
Moore, A. (2017). Critical elitism: Deliberation, democracy, and the problem of expertise. Cambridge University Press. https://doi.org/10.1017/9781108159906
Morgan, H. (2023). Reducing corporate influence on university research in America. Policy Futures in Education, 21(8), 817–831. https://doi.org/10.1177/14782103221102265
Ollif, K., & Patel, J. (2025). Current U.S. university-industry partnerships: Realignment and opportunity. The National Academies. https://www.nationalacademies.org/documents/embed/link/LF2255DA3DD1C41C0A42D3BEF0989ACAECE3053A6A9B/file/D6C84643C00154ECD928B6B61B491896F6D965314ABA?noSaveAs=1
Ornstein, C., & Thomas, K. (2018). Top cancer researcher fails to disclose corporate financial ties to major research journals. New York Times. https://www.nytimes.com/2018/09/08/health/jose-baselga-cancer-memorial-sloan-kettering.html
Oshinsky, D. (2005). Polio: An American story. Oxford University Press.
Pamuk, Z. (2021). Politics and expertise: How to use science in a democratic society. Princeton University Press.
Paradise, J. (2023). The CRISPR patent ruling and implications for medicine. The Journal of the American Medical Association, 329(6), 461–462. https://doi.org/10.1001/jama.2022.24986
Parthasarathy, S. (2015). Governance lessons for CRISPR/Cas9 from the missed opportunities of Asilomar. Ethics in Biology, Engineering & Medicine, 6(3–4), 305–312. https://doi.org/10.1615/EthicsBiologyEngMed.2016016470
Pielke, R. A., Jr. (2007). The honest broker: Making sense of science in policy and politics. Cambridge University Press. https://doi.org/10.1017/CBO9780511818110
Pilaar, J. (2020). The laws of public higher education retrenchment. New York University Journal of Legislation and Public Policy, 23(1), 159–226.
Plato. (1991). The republic of Plato (2nd ed.). A. Bloom (Trans.). Basic Books.
Polanyi, M. (1962). The republic of science: Its political and economic theory. Minerva, 1, 54–73. https://doi.org/10.1007/BF01101453
Resnik, D., Ariansen, J. L., Jamal, J., & Kissling, G. (2016). Institutional conflict of interest policies at U.S. academic research institutions. Academic Medicine, 91(2), 242–246. https://doi.org/10.1097/ACM.0000000000000980
Rudy, A., Coppin, D., Konefal, J., Shaw, B. T., Eyck, T. T., Harris, C., & Busch, L. (2007). Universities in the age of corporate science: The UC Berkeley-Novartis controversy. Temple University Press.
Savage, N. (2017). Industry links boost research output. Nature, 552, S11–S13. https://doi.org/10.1038/d41586-017-07422-2
Sheinerman, N. (2023). Public engagement through inclusive deliberation: The human genome international commission and citizens’ juries. The American Journal of Bioethics, 23(12), 66–76. https://doi.org/10.1080/15265161.2022.2146786
Stevens, T., & Newman, S. (2019). The biotech juggernaut: Hope, hype, and hidden agendas of entrepreneurial bioscience. Routledge. https://doi.org/10.4324/9781315173269
Thacker, P. (2022). Stealing from the tobacco playbook, fossil fuel companies pour money into elite American universities. The British Medical Journal, 378(2095), 1–4.
The National Academies Press. (2017). Human genome editing: Science, ethics, and governance. https://nap.nationalacademies.org/catalog/24623/human-genome-editing-science-ethics-and-governance
The National Academies Press. (2020). Heritable human genome editing. https://nap.nationalacademies.org/catalog/25665/heritable-human-genome-editing
Thiel, D. (2021). A CRISPR view of human genome editing in the 21st century. University of Michigan. https://deepblue.lib.umich.edu/handle/2027.42/169935
Washburn, J. (2005). University, Inc. Basic Books.
Yegros-Yegros, A., et al. (2020). Exploring why global health needs are unmet by research efforts: The potential influences of geography, industry and publication incentives. Health Research Policy and Systems, 8(47), 1–14.
Yi, D. (2015). The recombinant university: Genetic engineering and the emergence of Stanford biotechnology. The University of Chicago Press. https://doi.org/10.7208/chicago/9780226216119.001.0001