
Part I - Questions of Data Governance for Data from Digital Home Health Products

Published online by Cambridge University Press: 25 April 2024

I. Glenn Cohen, Harvard Law School, Massachusetts
Daniel B. Kramer, Harvard Medical School, Massachusetts
Julia Adler-Milstein, University of California, San Francisco
Carmel Shachar, Harvard Law School, Massachusetts

Information
Digital Health Care outside of Traditional Clinical Settings: Ethical, Legal, and Regulatory Challenges and Opportunities, pp. 11–60
Publisher: Cambridge University Press
Print publication year: 2024
This content is Open Access and distributed under the terms of the Creative Commons Attribution licence CC BY-NC-ND 4.0 (https://creativecommons.org/cclicenses/).

Introduction

As Barbara J. Evans quotes from Daniel Solove in her chapter in this part, “[n]ot all privacy problems are the same.” Digital home health products are exciting because they use the massive amounts of data that can be generated within the home to monitor, address, and improve our health. But this powerful leveraging of data means that digital home health products raise unique privacy problems, unlike those raised by most other medical devices. Not only are these products harnessing an ocean of data about their users, but they are also uniquely drawing that data from the most sacrosanct of settings, the home. This only heightens the importance of intentional, thoughtful, comprehensive, and well-designed data governance.

The contributions in this part wrestle with questions of data governance, informed by the heightened sensitivity of recording from the home. Each chapter focuses on different questions regarding data governance. In that sense, each contribution touches on “part of the elephant.” By reading these chapters, the reader may be able to see the full elephant – in this case, the challenges and opportunities inherent in data governance for digital home health products. The answers to these questions can help articulate an overall vision of data governance for data coming out of digital home health products.

Barbara J. Evans opens this part with her chapter, “In the Medical Privacy of One’s Own Home: Four Faces of Privacy in Digital Home Health Care.” Evans’s contribution challenges the reader to deeply engage with the concept of privacy, especially as it is applied to digital home health products. She argues that digital home health products are truly different in kind from other medical devices, in part because data handlers may not have fiduciary duties limiting their use of data gleaned from these products. State legislation, particularly designed to provide individual control over information, she argues, is the answer to how we should govern data from digital home health products.

Charles Duan and Christopher J. Morten focus on the patients/users of digital home health products in their chapter, “Patient Access to Health Device Data: Toward a Legal Framework.” The driving question of their work is, “Should patients have access to health device data and, if so, how should we facilitate that access?” Duan and Morten argue that intellectual property laws and policies often form a barrier to patients’ access to their own health data, and that a robust, administrable patients’ “bill of rights” is necessary. This legal framework should include incentives and requirements for device manufacturers to share data with users, technical standards governing how this data is shared and accessed, and guidelines for how data from multiple users may be aggregated.

Whereas Evans seeks to answer, “What should data governance of digital home health products look like in the coming years?” Danaja Fabcic Povse focuses on articulating what data governance structures already exist for this product category in her contribution “Challenges of Remote Patient Care Technologies under the General Data Protection Regulation: Preliminary Results of the TeNDER Project.” Povse answers the question, “What challenges does the GDPR [General Data Protection Regulation] pose for designers of remote patient care technologies (RCTs), and how can those challenges be addressed in practice?” through the concrete experience of the TeNDER project, a Horizon 2020-funded project to empower patients with Alzheimer’s, Parkinson’s, and cardiovascular diseases by helping them to monitor their health and manage their treatments. Povse highlights the legal challenges that developers of digital home health technologies face, including how to structure consent for persons experiencing cognitive decline and the ideal terms of use for service providers working with external processors.

Jodyn Platt and Sharon Kardia focus on the question, “What lessons can be learned from newborn screening for the data governance of digital home health devices?” in their chapter, “Renegotiating the Social Contract for Use of Health Information: Lessons Learned from Newborn Screening and Implications for At-Home Digital Care.” Platt and Kardia argue that many of the data governance questions raised by digital home health products were also raised by the expansion of newborn screening bloodspot programs and the organizations tasked with stewarding these databases of health information. Social norms and expectations around consent, commercialization, and governance informed the evolution of the Michigan BioTrust, which holds all the newborn bloodspots for children born in Michigan. The lessons learned by BioTrust can help to answer the questions raised by digital home health developers, users, and regulators.

How should we use the data coming from digital home health products? This is, at first glance, a straightforward, simple question. But, as the authors of the chapters in this section demonstrate, this question is not so easily answered. By focusing on smaller questions – What access should patients have to their own data? Does the GDPR provide enough data governance guidance to developers? – we can begin to build a comprehensive vision of a data-governance structure that can protect users while also facilitating innovation.

1 In the Medical Privacy of One’s Own Home: Four Faces of Privacy in Digital Home Health Care Footnote *

Barbara J. Evans
I Introduction

Digital tools to diagnose and treat patients in the home: The phrase hits several tripwires, each sounding its own privacy alarm. Invading the “castle” of a person’s home is one privacy tripwire.Footnote 1 Sustained digital surveillance of the individual is another. Anything to do with personal health information is still another. Each alarm calls attention to a different strand of privacy law, each with its own account of why privacy matters and how to protect it. No overarching conception of privacy leaps out, which calls to mind Daniel Solove’s remark that “the law has attempted to adhere to overarching conceptions of privacy that do not work for all problems. Not all privacy problems are the same.”Footnote 2

This chapter explores four faces of privacy: (1) Privacy of the home, which links privacy to the location where information is created or captured; (2) privacy as individual control over personal information, without regard to location, in an age of pervasive digital surveillance; (3) contextual privacy frameworks, such as medical privacy laws addressing the use and sharing of data in a specific context: clinical health care; and (4) content-based privacy, unmoored from location or context and, instead, tied to inherent data characteristics (e.g., sensitive data about health, sexual behavior, or paternity, versus nonsensitive data about food preferences). The hope here is to find a workable way to express what is special (or not) about digital tools for diagnosis and treatment in the home.

II The Privacy of the Home

An “interest in spatial privacy” feels violated as the home – the “quintessential place of privacy” – becomes a site of digital medical observation and surveillance.Footnote 3 Yet electronic home health monitoring originated decades ago, which invites the question of what has sparked sudden concern about digital home health privacy now.

Past experience with home diagnostics clarifies the privacy challenge today. In 1957, Dr. Norman J. Holter and his team developed an ambulatory electrocardiograph system, building on the 1890s string galvanometer for which Willem Einthoven won the 1924 Nobel Prize.Footnote 4 The resulting wearable device, known as a Holter monitor, records electrocardiographic signals as heart patients go about their routine activities at home and, since 1961, has been the backbone of cardiac rhythm detection and analysis outside the hospital.Footnote 5 Six decades of at-home use of this and similar devices have passed without notable privacy incidents.

There is a distinction that explains why traditional home diagnostics like Holter monitors were not controversial from a privacy standpoint, while today’s digital home health tools potentially are. Jack Balkin stresses that “certain kinds of information constitute matters of private concern” not because of details like the content or location, “but because of the social relationships that produce them.”Footnote 6 For example, an injured driver receiving care from an ambulance crew at the side of a road should not be filmed and displayed on the evening news – not because the person is in a private location (which a public highway is not), but because the person is in a medical treatment relationship at the time.Footnote 7 It is “relationships – relationships of trust and confidence – that governments may regulate in the interests of privacy.”Footnote 8

Traditional devices like Holter monitors are prescribed in a treatment relationship by a physician who refers the patient to a laboratory that fits the device and instructs the patient how to use it. After a set period of observation, the patient returns the device to the laboratory, which downloads and analyzes the data stored on the device and conveys the results to the ordering physician. Everyone touching the data is in a health care relationship, bound by a web of general health care laws and norms that place those who handle people’s health information under duties of confidentiality.Footnote 9

These duties flow less from privacy law than from general health care laws and norms predating modern concerns about information privacy. For example, state licensing statutes for health care professionals focus mainly on their competence but also set norms of confidentiality, enforceable through disciplinary sanctions and the potential loss of licensure.Footnote 10 Professional ethics standards, such as those of the American Medical Association, amplify the legally enforceable duties of confidentiality.Footnote 11 State medical records laws govern the collection, use, and retention of data from medical treatment encounters and specify procedures for sharing the records and disposing of or transferring them when a care relationship ends.Footnote 12 State courts enforce common law duties for health care providers to protect the confidential information they hold.Footnote 13

Jack Balkin’s first law of fair governance in an algorithmic society is that those who deploy data-dependent algorithms should be “information fiduciaries” with respect to their clients, customers, and end-users.Footnote 14 Traditional health care providers meet this requirement. The same is not always (or perhaps ever) true of the new generation of digital tools used to diagnose and treat patients at home. The purveyors of these devices include many new players – such as medical device manufacturers, software developers and vendors, and app developers – not subject to the confidentiality duties that the law imposes on health care professionals, clinics, and hospitals.

The relationships consumers will forge with providers of digital home health tools are still evolving but seem unlikely to resemble the relationships of trust seen in traditional health care settings. Responsibility for protecting the data generated and collected by digital home health devices defaults, in many instances, to vendor-drafted privacy policies and terms of service. Scott Peppet’s survey of twenty popular consumer sensor devices found these privacy protections to be weak, inconsistent, and ambiguous.Footnote 15

Nor is the privacy of the home a helpful legal concept here. As conceived in American jurisprudence, the privacy of the home is a Fourth Amendment protection against governmental intrusion to gather evidence for criminal proceedings.Footnote 16 This has little relevance to a private-sector medical device manufacturer or software vendor offering home diagnostic tools that gather personal health data that could be repurposed for research or a variety of other commercial uses that threaten users’ privacy. The Fourth Amendment occasionally might be helpful – for example, if the government seeks data from a home diagnostic device to refute a user’s alibi that she was at home at the time she stands accused of a crime at a different location. Unfortunately, this misses the vast majority of privacy concerns with at-home medical monitoring: Could identifiable health data leak to employers, creditors, and friends in ways that might stigmatize or embarrass the individual? Might data be diverted to unauthorized commercial uses that exploit, offend, or outrage the person the data describe? The Fourth Amendment leaves us on our own to solve such problems.

The privacy of the home enters this discussion more as a cultural expectation than as a legal reality. The home as a site of retreat and unobserved, selfhood-enhancing pursuits is a fairly recent development, reflecting architectural innovations such as hallways, which became common in the eighteenth century and eliminated the need for household members to pass through one another’s bedrooms to reach their own.Footnote 17 The displacement of servants by nongossiping electrical appliances bolstered domestic privacy, as did the great relocation of work from the home to offices and factories late in the nineteenth century.Footnote 18 The privacy of the home is historically contingent. It may be evolving in response to COVID-19-inspired work-from-home practices but, at least for now, the cultural expectation of privacy at home remains strong.

This strong expectation does not translate into a strong framework of legal protections. Private parties admitted to one’s home are generally unbound by informational fiduciary duties and are free to divulge whatever they learn while there. As if modeled on a Fourth Amendment “consent search,” the host consents at the point when observers enter the home but, once there, they are free to use and share information they collect without further consent. The privacy of the home, in practice, is protected mainly by choosing one’s friends carefully and disinviting the indiscreet. The question is whether this same “let-the-host-beware” privacy scheme should extend to private actors whose digital home health tools we invite into our homes.

III Privacy as Individual Control over Identifiable Information

Many privacy theorists reject spatial metaphors, such as the privacy of the home, in favor of a view that privacy is a personal right for individuals to control data about themselves.Footnote 19 After the 1970s, this “control-over-information” privacy theory became the “leading paradigm on the Internet and in the real, or off-line world.”Footnote 20 It calls for people – without regard to where they or their information happen to be located – to receive notice of potential data uses and to be granted a right to approve or decline such uses.

This view is so widely held today that it enjoys a status resembling a religious belief or time-honored principle. Few people recall its surprisingly recent origin. In 1977, a Privacy Protection Study Commission formed under the Privacy Act of 1974 found that it was quite common to use people’s health data in biomedical research without consent and recommended that consent should be sought.Footnote 21 That recommendation was widely embraced by bioethicists and by the more recent Information Privacy Law Project on the ethics of data collection and use by retailers, lenders, and other nonmedical actors in modern “surveillance societies.”Footnote 22

Control-over-information theory has its critics. An obvious concern is that consent may be ill-informed as consumers hastily click through the privacy policies and terms of use that stand between them and a desired software tool. In a recent survey, 97 percent of Americans recalled having been asked to agree to a company’s privacy policy, but only 9 percent indicated that they always read the underlying policy to which they are agreeing (and, frankly, 9 percent sounds optimistic).Footnote 23 Will people who consent to bring digital health devices into their homes carefully study the privacy policies to which they are consenting? It seems implausible.

A more damning critique is that consent, even when well-informed, does not actually protect privacy. A person who freely consents to broadcast a surgery or sexual encounter live over the Internet exercises control over their information but is foregoing what most people think of as privacy.Footnote 24 Notice-and-consent privacy schemes can be likened to the “dummy thermostats” in American office skyscrapers – fake thermostats that foster workplace harmony by giving workers the illusion that they can control their office temperature, which, in fact, is set centrally, with as many as 90 percent of the installed thermostats lacking any connection to the heating and air-conditioning system.Footnote 25 Consent norms foster societal harmony by giving people the illusion that they can control their privacy risks, but, in reality, consent rights are disconnected from privacy and, indeed, exercising consent rights relinquishes privacy.

The loss of privacy is systemic in modern information economies: It is built into the way the economy and society work, and there is little an individual can do. Privacy is interdependent, and other people’s autonomous decisions to share information about themselves can reveal facts about you.Footnote 26 Bioethicists recognize this interdependency in a number of specific contexts. For example, genomic data can reveal a disease-risk status that is shared with one’s family members,Footnote 27 and, for Indigenous people, individual consent to research can implicate the tribal community as a whole by enabling statistical inferences affecting all members.Footnote 28

Less well recognized is the fact that, in a world of large-scale, generalizable data analytics, privacy interdependency is not unique to genetically related families and tribal populations. It potentially affects everyone. When results are generalizable, you do not necessarily need to be reflected in the input data in order for a system to discover facts about you.Footnote 29 If people like you consent to be studied, a study can reveal facts about you, even if you opted out.

Biomedical science aims for generalizability and strives to reduce biases that cause scientific results not to be valid for everyone. These are worthy goals, but they carry a side effect: Greater generalizability boosts systemic privacy loss and weakens the power of consent as a shield against unwanted outside access to personal facts. Whether you consent or refuse to share whatever scraps of personal data you still control, others can know things about you because you live in a society that pursues large-scale data analytics and strives to make the results ever-more generalizable, including to you. Just as antibiotics cease to work over time as microbes evolve and grow smarter at eluding them, so consent inexorably loses its ability to protect privacy as algorithms grow smarter, less biased, and more clever at surmising your missing data.

There is another concern with notice-and-consent privacy schemes in biomedical contexts, where the problem of bias has been empirically studied more than in some other sectors. Selection bias occurs when the people included in a study fail to reflect the entire population that, ultimately, will rely on results from that study.Footnote 30 Consent norms can produce selection bias if some demographic groups – for example, older white males – consent more eagerly than other groups do. People’s willingness to consent to secondary data uses of their health data varies among racial, ethnic, and other demographic groups.Footnote 31 If digital home health tools are trained using data acquired with consent, those tools may be biased in ways that cause them to deliver unreliable results and health care recommendations for members of historically underrepresented population subgroups, such as women and the less affluent.Footnote 32 Consent norms can fuel health care disparities. Admittedly, this is only one of many equity concerns with digital home health tools. The more salient concern, obviously, is whether these tools will be available to nonprivileged members of society at all. Many of these tools are commercially sold on a self-pay basis with no safety net to ensure access by those who cannot pay.

In October 2022, the White House published its Blueprint for an AI Bill of Rights, recommending a notice-and-consent privacy scheme in which “designers, developers, and deployers of automated systems” must “seek your permission” to use data in an artificial intelligence (AI) system.Footnote 33 It simultaneously calls for AI tools to be “used and designed in an equitable way” that avoids disparities in how the tools perform for different population subgroups.Footnote 34 In domains where selection bias is well-documented,Footnote 35 as in health care, these two goals may clash.

IV Medical Privacy Law

One possibility for regulating AI/machine learning (ML) home health tools would be to place them under the same medical privacy regulations – for example, the Health Insurance Portability and Accountability Act (HIPAA) Privacy Rule,Footnote 36 a major US medical privacy framework – used for data generated in clinical health care settings. This section argues against doing so.

Medical privacy law rejects control-over-information theory in favor of “privacy’s other path” – confidentiality law,Footnote 37 a duty-based approach that places health care providers under duties to handle data carefully.Footnote 38 The HIPAA Privacy Rule does not itself impose any confidentiality duties. It does not need to do so, because it regulates one specific context – clinical health care – where most of the “covered entities”Footnote 39 it regulates have confidentiality duties under state law.Footnote 40

The Privacy Rule is best modeled as what Helen Nissenbaum refers to as a contextual privacy scheme.Footnote 41 It states a set of “informational norms” – data-sharing practices that have been deemed permissible in and around clinical health care.Footnote 42 The Privacy Rule allows protected health information (PHI) to be disclosed after de-identification or individual authorization (HIPAA’s name for consent).Footnote 43 This leads casual observers to think that it is a notice-and-consent privacy scheme, but it then goes on to state twenty-three additional rules allowing disclosure of PHI, often in identifiable formats, without consent but subject to various alternative privacy protections that, at times, are not as strong as one might wish.Footnote 44

Where medical privacy is concerned, the European Union (EU)’s General Data Protection Regulation (GDPR) is more like the HIPAA Privacy Rule than most Americans realize. It grants leeway for the twenty-seven EU member states, when regulating data privacy in clinical health care settings, to go higher or lower than the GDPR’s baseline consent standard.Footnote 45 A 2021 report for the European Commission summarized member state medical privacy laws, which replicate many of the same unconsented data flows that the HIPAA Privacy Rule allows.Footnote 46

The bottom line is that when you enter the clinical health care setting – whether in the United States or elsewhere – you will have only limited control over your information. A certain amount of data sharing is necessary to support the contextual goals of health care: For example, saving the life of a patient whose symptoms resemble yours by sharing your data with their physician; conducting medical staff peer review to root out bad doctors; tracking epidemics; detecting child abuse; enabling the dignified burial of the deceased; and monitoring the safety of FDA-approved medical products. Your data can be used, with or without your consent, to do these and many other things considered essential for the proper functioning of the health care system and of society.

Notably, the HIPAA Privacy Rule takes no position on individual data ownership, so state medical records laws that vest the ownership of medical records in health care providers are not “less stringent” than HIPAA and, thus, are not preempted.Footnote 47 In many states, providers legally own their medical records, subject to various patient interests (such as confidentiality and patient access rights) in the data contained in those records.Footnote 48 Some states clarify provider ownership in their state medical records acts; others reach this conclusion through case law.Footnote 49 Only New Hampshire deems the medical information in medical records to be the property of the patient,Footnote 50 and a handful of states provide for individuals to own their genetic information.Footnote 51

What could go wrong if purveyors of digital home health devices were added to the list of covered entities governed by the HIPAA Privacy Rule? The Privacy Rule relies on an underlying framework of state laws to place its covered entities under duties of confidentiality.Footnote 52 Many sellers of home health devices are not bound by those laws. Without those laws, the Privacy Rule’s liberal norms of data sharing could allow too much unauthorized data sharing.

Similar problems arose after 2013, when “business associates” were added to the list of HIPAA-covered entities.Footnote 53 Many business associates – such as software service providers offering contract data-processing services to hospitals – fall outside the scope of the state health laws that place health care providers under duties of confidentiality. The amended Privacy Rule did not address this problem adequately, leaving an ongoing privacy gap.Footnote 54

Placing business associates – or, by analogy, digital home health care providers – under strong duties of confidentiality seemingly requires legal reforms at the state level. Federal solutions, such as HIPAA reforms or the proposed AI Bill of Rights, are not, by themselves, sufficient.

V Content-Based Privacy Protection

A uniform scheme of content-based privacy regulations stratifies the level of privacy protection based on inherent data characteristics (e.g., data about health) without regard to where in the overall economy the data are held. The fact that Sally is pregnant receives the same protection whether it came from a home pregnancy test, a clinical diagnostic test, or a Target™ store’s AI marketing algorithm.Footnote 55 This reasoning has strong superficial appeal, but there may be good reasons to distinguish health-related inferences drawn within and outside the clinical care context.

Some factors justify stronger privacy protections for digital home health data than for clinical health data. In clinical settings, most (not all) unconsented HIPAA data disclosures go to information fiduciaries, such as health care professionals, courts, and governmental agencies subject to the federal Privacy Act. In home care settings, the baseline assumption is that the users and recipients of people’s digital health data are not information fiduciaries, which strengthens the case for strong individual control over data disclosures.

There can be important differences in data quality. Data generated in clinical settings is subject to regulatory and professional standards aimed at ensuring data quality and accuracy. Data generated by home health devices does not always meet these same quality standards. Digital home health data might be inaccurate, so that its release is not only stigmatizing but defamatory (false). Again, this counsels in favor of strong consent norms. Other factors might cut the other way.

The EU’s GDPR and the California Consumer Privacy Act are sometimes cited as consistent, content-based privacy schemes.Footnote 56 Such schemes could offer consistency in a home care system where licensed professionals, nonmedical caregivers, and commercial device companies are all differently regulated. Yet these laws are inferior to the HIPAA Privacy Rule in various respects. An important example is the treatment of inferential knowledge. Under the GDPR, people have access to their raw personal input data but can have trouble accessing inferences drawn from those data.Footnote 57 Wachter and Mittelstadt note that “individuals are granted little control or oversight over how their personal data is used to draw inferences about them” and their “rights to know about (Articles 13–15), rectify (Article 16), delete (Article 17), object to (Article 21), or port (Article 20) personal data are significantly curtailed for inferences.”Footnote 58

The GDPR recognizes the legitimacy of competing claims to inferential knowledge. An inference is not merely a product of the input data from which it was derived, such that it automatically “belongs” to the person it describes. Data handlers invest their own effort, skills, and expertise to draw inferences, and they, too, have legitimate claims to control them. In contrast, the HIPAA Privacy Rule grants individuals a right to inspect, to obtain a copy of, and to request correction of not only their raw personal data (e.g., medical images and test results), but also the medical opinions and inferences drawn from those data.Footnote 59 This is the only informational norm in the HIPAA Privacy Rule that is mandatory: Covered entities must provide people with such access if they request it. The point of this example is that fact-specific analysis is needed before jumping to policy conclusions about which framework is better or worse for digital home health care.

VI Conclusion

This chapter ends where it began, with Solove’s insight that “[n]ot all privacy problems are the same.” The modern generation of digital home health devices raises novel privacy concerns. Reaching for solutions devised for other contexts – such as expanding the HIPAA Privacy Rule to cover digital home health providers or cloning the GDPR – may yield suboptimal policies. Consent norms, increasingly, are understood to afford weak data-privacy protections. That is especially true in digital home health care, where consent rights are not reliably backstopped by fiduciary duties limiting what data handlers can do with health data collected in people’s homes. State legislation to set fiduciary duties for digital home health providers may, ultimately, be a better place to focus than on new federal privacy policies. Medical privacy law reminds us that achieving quality health care – in any context – requires an openness to responsible data sharing. Will those needed data flows exist in a world of privately sponsored digital home health tools whose sellers hoard data as a private commercial asset? The goal of a home health privacy framework is not merely to protect individual privacy; it also must enable the data flows needed to ensure high-quality care in the home health setting. At the same time, the “wild west” environment of digital home health might justify a greater degree of individual control over information than has been customary in traditional clinical care settings. Forging a workable consensus will require hard work, and the work has only just begun.

2 Patient Access to Health Device Data: Toward a Legal Framework

Charles Duan and Christopher J. Morten
I Introduction

The connected at-home health care device industry is booming.Footnote 1 Wearable health trackers alone constituted a $21 billion market in 2020, anticipated to grow to $195 billion by 2027.Footnote 2 At-home devices now purportedly make it possible to diagnose and monitor health conditions, such as sleep apnea, diabetes, and fertility, automatically, immediately, and discreetly. By design, these devices produce a wealth of data that can inform patients of their health status and potentially even recommend life-saving actions.Footnote 3

But patients and their health care providers often lack access to this data.Footnote 4 Manufacturers typically design connected at-home devices to store data in cloud services run by the manufacturers themselves, requiring device owners to register accounts and accept the terms of use and limitations that the manufacturers impose. A recent survey of 222 mobile “app families” associated with wellness devices found that 64.4 percent “did not report sharing any data” with other apps or services.Footnote 5 A parent testified before Congress about how a lack of data access impaired his daughter’s ability to manage Type 1 diabetes,Footnote 6 and patients with sleep apnea have had to circumvent technological device locks to extract data on their own sleep.Footnote 7 Many medical and wellness devices that patients use for in-home diagnosis and monitoring – which we simply call “health devices” – lock patients into manufacturers’ ecosystems. This limits patients’, and society’s, ability to tap into the full value of the data, despite the extensive individual and social benefits that access could provide.

The problem here is not solely technical; it is also legal. Existing law in the United States provides patients with no guarantee of access to their data when it is generated and stored outside the traditional health care system. The Health Insurance Portability and Accountability Act (HIPAA) provides patients a legally enforceable right of access to copies of their electronic health records (EHRs), and, in recent years, the Department of Health and Human Services (HHS) has moved to make this right enforceable and meaningful.Footnote 8 But as HHS itself has observed about health devices and other “mHealth” technologies used outside the EHR ecosystem, manufacturers “are not obligated by a statute or regulation to provide individuals with access to data about themselves,” so patients with data on such devices “may not have the ability to later obtain a copy.”Footnote 9

This chapter begins by identifying the individual and societal benefits of patient access to health device data. It then addresses the arguments for restricting such access, especially those based on intellectual property laws and policies. We conclude that such arguments are ultimately doctrinally and normatively unconvincing, such that they should not dissuade legislatures and federal agencies from legislating or regulating rights of access. We then consider what can and should be done to create a robust, administrable right of patients to access health device data that protects all stakeholders’ interests, and we offer a nascent framework that draws from other regimes for patient and consumer access to personal information. We hope the framework will guide legislatures and regulators as they begin to address this important issue.

II Benefits of Patient Access

There are important individual and societal benefits when patients can access their own health data. Foremost for individuals is the fulfillment of patient autonomy and dignity. Health device data informs decisions about treatment, so a patient without access can neither make fully informed decisions about a course of care nor evaluate a provider’s recommendations.Footnote 10 Patients may also need access to health device data to “transport” their data to new health care providers for safekeeping,Footnote 11 or to repair their devices.Footnote 12 From a research perspective, patients can and do exploit health device data to useful ends, since their own health stands to benefit from insights and discoveries drawn from that data.Footnote 13 Many patients use health device data for “quantified self” or “n=1” research to discover how best to manage their own health.Footnote 14

Turning to broader societal benefits, a key starting point is the research that is enabled when patient data is aggregated.Footnote 15 For example, the National Institutes of Health (NIH)-run ClinVar database receives genetic variant data authorized for inclusion by individual patients and now contains over two million records representing 36,000 different genes, which public and private enterprises have used to advance research and create consumer products and services.Footnote 16 The ClinVar model of government-supported collaborative dataset-building is one starting point for the idealistic vision of “medical information commons” – the collective, shared governance of medical knowledge (rather than proprietary or authoritarian governance of the same)Footnote 17 – that researchers and regulators alike believe would be a tremendous boon to science.Footnote 18

Research on aggregated health data also allows patient groups and civil society watchdogs to verify manufacturers’ claims and ensure that health devices function as advertised – especially important given that those devices are only lightly regulated.Footnote 19 Aggregated health device data also promises to become a variety of the “real-world evidence” increasingly used to conduct public health research and validate the safety and efficacy of other products the same patients are using.Footnote 20 But these potential benefits depend on patient data aggregated at a sufficient scale.Footnote 21

Societal spillover effects explain, at least in part, why market forces do not prompt manufacturers to satisfy patient demand for data access. Patient self-researchers tend to be consumer-innovators who share their insights and discoveries altruistically, at low or no cost, which may undercut the manufacturers.Footnote 22 And the value of aggregated patient data cannot easily be captured by a single entity. As a result, there is no straightforward way for patients and health device manufacturers to transact for data access.

Another economic disconnect arises from competition among device manufacturers. When patients can easily extract their data from one device and port it to a competing device, they avoid “lock-in,” which promotes patient choice and fosters competition.Footnote 23 In an effort to avoid such competition, however, device manufacturers have incentives to limit patient data access. Indeed, some have implemented technical measures to keep even savvy patients from extracting data and asserted laws against the circumvention of those technological measures to further keep patients from their data.Footnote 24

III Legality of Patient Access

To be sure, there are real concerns with giving patients access to health device data.Footnote 25 Device manufacturers have pointed to these as reasons to limit such access. The main concerns fall into three categories.

First, there are costs associated with authenticating users, formatting data, and otherwise providing access to records. This problem can be solved by permitting reasonable, small charges for data access.Footnote 26

Second, device manufacturers may be better stewards of sensitive health data than patients, in terms of privacy and cybersecurity.Footnote 27 In theory, manufacturers enjoy economies of scale that enable them to protect health records from data breaches and other compromising disclosures, while individual patients may fail to secure their data or fall victim to privacy-invading scams. Yet, there are countervailing considerations: Manufacturers’ vast databases are themselves an attractive and recurring target for data malfeasance,Footnote 28 and some manufacturers’ shady deals with privacy-intrusive data brokers suggest that companies holding volumes of lightly regulated personal data may not be better positioned than patients to protect data security and privacy.Footnote 29

The third concern often raised as a reason to limit patient access is that the data is somehow proprietary to the device manufacturers. This intellectual property concern requires a bit of conceptual unpacking, as it operates on two different levels. First, it is a legal or doctrinal argument, in which the manufacturers assert specific intellectual property rights over the data. Second, it is a normative, policy-oriented argument that exclusive control over patient data is desirable to protect incentives to develop health devices and data ecosystems.

Evaluating these arguments requires distinguishing the types of health device data. First, there is the software code that the device manufacturer writes. Second, the device takes the raw measurements of the patient and stores them. Third, the device (or external software) may perform computations on the raw data to produce values intended to approximate a natural phenomenon, such as a pulse. Fourth, the device may compute data outputs of the manufacturer’s own invention. For example, a device might use pulse measurements across a night to produce a “sleep score,” indicating how well, in the manufacturer’s opinion, the patient slept, and offer recommendations on how to sleep better.Footnote 30

Our focus is the second and third types of information – raw measurements and computed estimates of physiological properties – because they are likely to be of the most interest to patients. We therefore refer hereinafter to these two types of data together simply as “patient data.” With access to this patient data, patients likely will not need to view source code on the device to put the data to use. Manufacturer-specific computations and scores are likely not useful for cross-device interoperability, and the black-box nature of the algorithms often used to compute such scores limits their usefulness for care and research alike.Footnote 31
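
To make this taxonomy concrete, the sketch below models the four types of device data as simple Python structures. It is purely illustrative: the field names, the example quantities, and the “sleep score” fields are assumptions made for exposition, not the data model of any actual device. “Patient data,” as used in this chapter, corresponds to the first two structures.

    from dataclasses import dataclass, field
    from typing import List

    @dataclass
    class RawMeasurement:
        # Second type: what the sensor actually recorded (e.g., optical pulse samples)
        timestamp: str
        sensor_channel: str
        value: float

    @dataclass
    class PhysiologicalEstimate:
        # Third type: a computed approximation of a natural phenomenon
        timestamp: str
        quantity: str          # e.g., "heart_rate"
        value: float
        unit: str              # e.g., "beats/min"

    @dataclass
    class ProprietaryScore:
        # Fourth type: a manufacturer-invented output (e.g., a "sleep score")
        night: str
        score: int
        recommendation: str

    @dataclass
    class DeviceRecord:
        # The first type, the device's software code, is not data about the patient
        # and is omitted here. "Patient data," as the chapter uses the term, is the
        # union of raw_measurements and estimates.
        raw_measurements: List[RawMeasurement] = field(default_factory=list)
        estimates: List[PhysiologicalEstimate] = field(default_factory=list)
        proprietary_scores: List[ProprietaryScore] = field(default_factory=list)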

Two intellectual property regimes are most frequently raised to justify withholding patient data from patients: Copyright law and trade secret protection.Footnote 32 Yet neither provides a genuine doctrinal basis for “ownership” of patient data or barriers to patient access.

Copyright law, which protects creative works of authorship from unauthorized copying, almost certainly cannot justify withholding patient data. Raw physiological measurements and estimates of natural phenomena are facts, ineligible for protection under copyright.Footnote 33 Furthermore, given the immense health benefits that patients can enjoy from their own data, data access likely qualifies as fair use, exempt from copyright infringement.Footnote 34 Indeed, the US Copyright Office has consistently agreed since 2015 that patient access to medical device data is not copyright infringement, thus, permitting patients to circumvent the technological locks that interfere with their access to data on medical devices.Footnote 35

Nor is patient data a trade secret. First, every legal definition of a trade secret requires the information in question be secret to qualify for protection.Footnote 36 Patient data of all sorts is shared with patients, health care providers, and others and, thus, is not actually secret. Second, even if subsets of patient data are kept secret, they are not the sort of information that trade secrecy law protects. To qualify as a trade secret, information must derive “independent economic value” from its secrecy.Footnote 37 As Hrdy has explained, “secret information whose value does not stem from secrecy cannot be a trade secret.”Footnote 38 Unlike traditionally protectable information – manufacturing processes, precise recipes, and so on – patient data derives economic value from aggregation and sharing, not secrecy.Footnote 39

To be sure, some (nonpatient data) aspects of devices’ software and mechanical designs may be deemed trade secrets.Footnote 40 The European Medicines Agency (EMA) offers helpful guidance here, in its official view of the limits of trade secrecy protection of clinical trial data.Footnote 41 (Like the patient data that is the focus of this chapter, clinical trial data describes patients’ health and is enormously valuable to researchers and patients themselves.) EMA announced that a large majority of clinical trial data “should not be considered” proprietary.Footnote 42 In EMA’s view, only “innovative features” of the methods through which data is collected can constitute trade secrets.Footnote 43 EMA expressly defines narrow categories of information it deems innovative and protectable.Footnote 44 These focus on methods for gathering data more quickly or cheaply, such as immunogenicity assays.Footnote 45 Notably, EMA’s categories do not permit proprietary claims to the outcome data that describes patients’ health (analogous to health devices’ patient data); EMA instead mandates that all outcome data be publicized.Footnote 46

What remains of health device manufacturers’ intellectual property claims is a normative argument that data inaccessibility gives manufacturers incentives to innovate.Footnote 47 Yet, there are serious defects to this normative argument. First, patients themselves have a countervailing incentive to innovate – their own health depends on it. Second, the “innovation” manufacturers wish to protect may not be beneficial at all: Secrecy can conceal safety problems, false claims of efficacy, racially biased outcomes, and other defects. Normatively and doctrinally, trade secrecy should not and does not protect this kind of secrecy.Footnote 48 As the Supreme Court has stated, if the disclosure of secret information reveals “harmful side effects of the [trade secret holder’s] product and causes the [holder] to suffer a decline in the potential profits from sales of the product, that decline in profits stems from a decrease in the value of the [product] to consumers, rather than from the destruction of an edge the [holder] had over its competitors, and cannot constitute the taking of a trade secret.”Footnote 49

IV Toward a Regulatory Framework

Although we have argued patients should have access to health device data as a legal and policy matter, the practical fact remains that manufacturers are currently free to build devices that deny such access at a technological level. There is, thus, a need for a legal framework to secure such access. No such framework currently exists: The existing regulations are generally limited to narrow classes of medical records or apply only to traditional health care providers and some of their business associates.

To develop an effective framework, it is useful to survey existing consumer data-access regimes both within the health care system and otherwise. We arrange them into three categories, roughly ranked by the strength of their mandates.

The most powerful regimes mandate patients’ right to data access. The HIPAA Privacy Rule provides patients with “a right of access to inspect and obtain a copy of protected health information” from health care providers.Footnote 50 Similarly, European law and the laws of some states provide consumers with rights to retrieve data about themselves.Footnote 51 These laws employ a range of enforcement mechanisms, including civil actions by consumers, state attorney general investigations, and administrative monetary penalties. For example, the HHS’s Office for Civil Rights recently began penalizing HIPAA-covered health care providers that fail to supply patients’ protected health information upon request or charge excessive fees for them,Footnote 52 prompting improvement after years of subpar compliance.Footnote 53

A second approach is softer financial incentives and disincentives – “carrots” and “sticks” – to encourage data holders to offer access. This was the primary approach used for the adoption of EHRs: The HITECH Act of 2009 both offered providers incentive payments for adopting certified EHR systems in their practices and imposed a modest penalty on Medicare reimbursements for providers who did not.Footnote 54 Today, after billions of dollars of investment by HHS, the vast majority of providers have adopted EHRs,Footnote 55 and those systems largely comply with HHS’s voluntary certification standards because the financial benefits created sufficient demand.Footnote 56 HHS’s ongoing ability to set certification standards has enabled the agency to require EHR systems to export data in standardized interoperability formats, to expose application programming interfaces for data access, and to stop companies’ “information blocking” practices that hamper patients’ ability to access their own health records.Footnote 57

A third possibility is to build public infrastructure or subsidize private infrastructure that coordinates patient data access. With ClinVar, for example, genetic testing laboratories voluntarily submit annotated reports of genetic variants to an NIH-run database, with patient consent. They make these voluntary submissions because, among other reasons, foundations and publishers often require them as a condition of grants or publication.Footnote 58 The presence of established, stable, government-supported infrastructure for data sharing makes such data submission requirements more common and more effective. In this way, legislatures and regulators can incentivize data sharing even without direct regulation.

We integrate aspects from these regimes into a nascent framework for patient access to at-home health care device data. Our framework-in-progress has three elements: A legal hook to induce device manufacturers to make patient data accessible to patients, a technical standard for data storage and access, and infrastructure for patients to deposit and use their data.

As to the first element, legislation or regulation to compel access, akin to HIPAA, would be most forceful and effective. For example, in 2019, Senators Klobuchar and Murkowski proposed creating a HIPAA-like statutory right of patients “to access, amend, and delete a copy of the personal health data that companies collect or use,”Footnote 59 including data from all “cloud-based or mobile technologies that are designed to collect individuals’ personal health data.”Footnote 60

US states also have substantial authority to legislate around HIPAA and could themselves create statutory patient-data access rights. Texas, for example, subjects some HIPAA-exempt entities, such as schools and public health researchers, to some of the obligations that HIPAA imposes.Footnote 61 The California Consumer Privacy Act (CCPA) arguably creates a right of access to health device data not covered by HIPAA, though this theory is so far untested.Footnote 62

Federal regulators could also explore their existing legal authority to require device manufacturers to share data. For example, the Federal Trade Commission could apply its authority to police unfair and deceptive practices to health device makers that market patient access to data as a feature of their products and require that these companies meet their claims.Footnote 63

Alternatively, following the example of the HITECH Act, Congress could provide financial incentives for health devices that meet data access standards, for example, making such devices reimbursable under Flexible Spending Account (FSA) plans or Medicare. A different, intriguing possibility could leverage the status quo of minimal regulation to create new financial incentives and disincentives. Current Food and Drug Administration (FDA) guidance exempts health devices from clearance and approval requirements only if they “present a low risk to the safety of users and other persons.”Footnote 64 As noted above, patients’ data access can enable researchers to study the safety risks of devices, so it could be reasonable for the FDA to change its policies and extend a presumption of safety (and thus of exemption from regulation) only to those devices that make data accessible to patients – and perhaps to qualified researchers, too. Manufacturers that choose to withhold data would not be, per se, prohibited from marketing their products, but would be subject to stricter FDA oversight, which would come with new costs.

The second element of the framework is a technical standard to govern how data is to be stored and accessed. Since health devices typically store data in manufacturers’ cloud servers, there is little sense in requiring less than electronic access via a network-connected application programming interface, akin to the requirements for EHR systems. Furthermore, both research and interoperability would benefit from greater standardization of data formats, in light of the profusion of health devices and manufacturers.Footnote 65 HHS and its Office of the National Coordinator for Health Information Technology could play an important role here, as it did in the standardization of EHRs.
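
To illustrate what a standardized, API-retrievable record might look like, the sketch below builds a single heart-rate reading shaped like an HL7 FHIR Observation, an interoperability format already used by certified EHR patient-access APIs. Applying FHIR to consumer health devices, and every specific value shown, is an assumption for illustration only, not an existing regulatory requirement.

    import json

    # A hypothetical export of one heart-rate reading from a wearable, shaped like
    # an HL7 FHIR "Observation" resource (the format certified EHR APIs already use).
    observation = {
        "resourceType": "Observation",
        "status": "final",
        "code": {
            "coding": [
                {"system": "http://loinc.org", "code": "8867-4", "display": "Heart rate"}
            ]
        },
        "subject": {"reference": "Patient/example"},          # placeholder identifier
        "effectiveDateTime": "2024-01-15T07:30:00Z",
        "valueQuantity": {
            "value": 72,
            "unit": "beats/minute",
            "system": "http://unitsofmeasure.org",
            "code": "/min",
        },
        "device": {"display": "Example wrist-worn tracker"},  # assumed device label
    }

    print(json.dumps(observation, indent=2))

A shared schema of this kind is what would let a patient move readings between devices, deposit them in a repository, or hand them to a researcher without format-by-format translation.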

The third element is an institutional infrastructure for aggregating and sharing data. We propose a public, ClinVar-like repository of patient-authorized submissions of appropriately anonymized device data. Without such a repository, patient access and data interoperability will likely still enable new research and other benefits for patients, but they also could augment the power of firms that amass data and broker access. A government-run repository of patient data arguably has several benefits. As a focal point for data aggregation, it empowers all researchers, not just the largest firms. Also, firms that contribute to this central repository share a relationship with the government that could be leveraged to ensure data privacy and security. And a public repository enables the government and outside experts to think through and develop privacy practices that best protect patients, rather than leaving these questions, in the first instance, to profit-driven firms.

V Conclusion

In this chapter, we have argued for a legal right of patients to access their own health device data. We have begun to trace a legal framework for access, one that includes three key elements: A legal “hook” to coax or compel device manufacturers to share data with patients, a technical standard to govern how data is stored and accessed, and an institutional infrastructure for aggregating and sharing data. We intend to expand on this framework in future work.

3 Challenges of Remote Patient Care Technologies under the General Data Protection Regulation: Preliminary Results of the TeNDER Project

Danaja Fabcic Povse
I Introduction

Patients with complex diseases like Alzheimer’s or Parkinson’s often require round-the-clock care. Since caregivers may not always be able to be present, remote care technologies (RCTs) can supplement human caregiver intervention and provide the patient with better care. In the TeNDER project,Footnote 1 we are building technology that will create an alert system for caregivers: For example, if the person falls, their relative or nurse receives a phone alert and can go and check up on them. Such technology relies on remote patient monitoring to detect anomalies in the person’s environment and combines data sources, including electronic health records (EHRs) and data from connected devices (e.g., wearables). The use of these technologies raises questions of data protection since especially sensitive data are involved.Footnote 2
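
A minimal sketch of the alert pattern just described follows; it is not the TeNDER implementation. The event types, the confidence threshold, and the notification function are invented solely to show how a monitoring event might be escalated to a caregiver.

    from dataclasses import dataclass

    @dataclass
    class SensorEvent:
        patient_id: str
        event_type: str      # e.g., "fall_detected", "no_motion", "abnormal_heart_rate"
        confidence: float    # detector's confidence in the event, from 0.0 to 1.0

    def notify_caregiver(patient_id: str, message: str) -> None:
        # Stand-in for a real notification channel (push message, SMS, phone call).
        print(f"[alert to caregiver of {patient_id}] {message}")

    def handle_event(event: SensorEvent, threshold: float = 0.8) -> None:
        # Escalate only events the detector is reasonably confident about,
        # so caregivers are not flooded with false alarms.
        if event.event_type == "fall_detected" and event.confidence >= threshold:
            notify_caregiver(event.patient_id, "Possible fall detected - please check on the patient.")

    handle_event(SensorEvent(patient_id="patient-042", event_type="fall_detected", confidence=0.93))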

Legal frameworks that govern the use of RCTs are, by their nature, abstract and high-level, meaning that their application might not take into account the specific type of technology or its use in a particular care situation, leaving developers and users in an unclear legal situation.Footnote 3

This chapter aims to bridge the gap between the high-level data protection framework and practical, micro-level application of RCTs by providing an overview of the challenges under European Union (EU) law when developing and using RCTs, exploring how initial results from the TeNDER project on resolving those challenges can help with the practical implementation of similar solutions, as well as examining gaps in the regulation itself. Using these technologies as a starting point, the chapter analyzes the obligations the General Data Protection Regulation (GDPR) imposes on developers in order to address the following research question: “What challenges does the GDPR pose for designers of remote patient care technologies (RCTs), and how can those challenges be addressed in practice?”

To answer the research question, the chapter first introduces key legal concerns that data protection poses regarding the use of RCTs, focusing on their field of application and the key principles and obligations relevant to developers. At the same time, the work draws upon the preliminary results of the TeNDER project (2019–2023) to discuss any potential shortcomings in the regulation.

The RCTs discussed in this chapter are in-home, as they are specifically developed for remote use, and digital, encompassing technologies such as wearables, smart devices, and microphones. However, TeNDER is not designed to be a medical device and, thus, performs no diagnostics.

II Remote Care Technologies and the GDPR

RCTs are a type of technology that can help patients manage their illnesses better, as well as help elderly people live more independently. They can be used institutionally (e.g., in a care home or hospital) or in the home, where they can contribute to a better quality of life for the user. A variety of different technologies can be used – monitoring devices, smartphones, apps, social media, videoconferencing tools, etc.Footnote 4 RCT is distinct from telehealth or eHealth, which refer to the phenomenon of digital health care in general, while remote monitoring or remote care describes the technology (or technologies) being used. RCT is, thus, a specific technology that is used by health care providers, either in a telehealth or a classical health care setting.Footnote 5

The advent of 5G and the Internet of things, combined with the two years of pandemic, has led to a heightened uptake of telehealth solutions, including remote monitoring applications and wearables that help people age better.Footnote 6 The use of RCTs is especially beneficial for older adults with chronic conditions, for whom monitoring devices, communication tools, and follow-up phone calls enable the 24-hour availability of health management tools.Footnote 7

RCTs, like many other eHealth technologies, rely on advanced data processing techniques and different devices, both medical and general-purpose ones, to provide functionalities. The devices and technologies must, at the same time, meet the goals they were designed for and ensure patients’ privacy and safety.Footnote 8 In terms of data privacy, patients risk losing control over their health data – especially when it comes to their EHRsFootnote 9 – when remote monitoring devices, such as wearables, are used.Footnote 10 Elderly users may not have consented to the processing of their health data; they may consider monitoring devices as a form of spying upon their private lives.Footnote 11

The GDPR,Footnote 12 adopted in 2016, binds controllers and processors involved in the processing of health data to put in place appropriate technical and organizational mechanisms to ensure patients’ data protection and the confidentiality of medical information.

The first issue is determining the GDPR’s scope of application to RCTs. The regulation applies when personal data, defined as “any information relating to an identified or identifiable natural person (‘data subject’)” (art. 4(1) of the GDPR), are being processed, meaning “any operation or set of operations which is performed on personal data or on sets of personal data, whether or not by automated means, such as collection, recording, organization, structuring,” and so on (art. 4(2) of the GDPR). Data concerning health (also referred to as health data) are defined as “personal data related to the physical or mental health of a natural person, including the provision of health care services, which reveal information about his or her health status” (art. 4(15) of the GDPR).

How can we determine what constitutes personal data in a remote care scenario? As per the definition of art. 4(1), as long as information can be linked to a data subject, it is considered personal data. Since the scenario deals with a health care setting, health data are very likely going to be processed. More specifically, the 2007 opinion of the Article 29 Working Party states that “all data contained in medical documentation, in electronic health records and in EHR systems should be considered to be ‘sensitive personal data.’”Footnote 13 However, data that cannot be linked to a data subject are not considered personal data, for example, because they have been irreversibly anonymized.Footnote 14
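To make this distinction concrete, the minimal Python sketch below (a hypothetical illustration, not TeNDER project code; the field names and secret key are assumptions) contrasts pseudonymization, which merely replaces a direct identifier with a re-linkable token and therefore still yields personal data, with a coarse aggregate that no longer relates to any identifiable person.

```python
import hashlib
import hmac
from statistics import mean

# Hypothetical raw device readings; field names are illustrative only.
readings = [
    {"patient_name": "Alice Example", "heart_rate": 72},
    {"patient_name": "Bob Example", "heart_rate": 65},
]

SECRET_KEY = b"held-separately-by-the-controller"  # assumption: key stored apart from the data


def pseudonymize(record: dict) -> dict:
    """Replace the direct identifier with a keyed hash.

    Because anyone holding SECRET_KEY can re-link the token to the person,
    the output is pseudonymized data and remains personal data within the
    meaning of art. 4(1) of the GDPR and Recital 26.
    """
    token = hmac.new(SECRET_KEY, record["patient_name"].encode(), hashlib.sha256).hexdigest()
    return {"pseudonym": token, "heart_rate": record["heart_rate"]}


def aggregate(records: list) -> dict:
    """Return a statistic that no longer relates to an identifiable person.

    Sufficiently coarse aggregates are one route toward anonymization,
    although real anonymization requires a case-by-case re-identification
    risk assessment rather than a single transformation.
    """
    return {"mean_heart_rate": mean(r["heart_rate"] for r in records)}


if __name__ == "__main__":
    print([pseudonymize(r) for r in readings])  # still personal data
    print(aggregate(readings))                  # arguably outside the GDPR's scope
```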

The regime under the GDPR is centered on a data controller, a central entity in charge of the processing activity, which determines the purposes and means of the processing (art. 4(7) of the GDPR). In order to process data, a controller must comply with data quality principles, such as data minimization and accuracy (art. 5(1)(c) and 5(1)(d) of the GDPR, respectively), and ensure the existence of valid legal grounds, as per art. 6 of the GDPR. Controllers can engage processors to help them carry out the processing operation – art. 4(8) of the GDPR defines a processor as a natural or legal person, public authority, agency, or other body which processes personal data on behalf of the controller.

Since RCT relies on different technologies and different service providers, defining the controller and the processor may be difficult. Recent decisions of the Court of Justice of the EU, such as WirtschaftsakademieFootnote 15 and Fashion ID,Footnote 16 as well as advisory opinions,Footnote 17 point to an “essential means” test. Essential means are key elements which are closely linked to the purpose and the scope of the data processing, such as whose data will be processed, which data types, for how long, and who will have access to them. The entity that determines the essential means of processing is, therefore, the data controller.

Determining the controller is important for ensuring that the right party can demonstrate compliance with the applicable principles and obligations (“accountability” – art. 5(2) of the GDPR). Among them are the data quality principles of art. 5(1): Lawfulness, fairness, and transparency; purpose limitation; data minimization; accuracy; storage limitation; and integrity and confidentiality. The controller is further responsible for implementing appropriate technical and organizational measures ensuring compliant processing (art. 24(1) of the GDPR) and for building privacy into the system by design and by default (art. 25(1)–(2) of the GDPR). Moreover, proactively implementing data protection during the development process helps eventual adopters in ensuring compliance, especially with the data protection by design approach.Footnote 18

III The TeNDER Approach

The TeNDER project, funded by the Horizon 2020 mechanism, seeks to empower patients with Alzheimer’s, Parkinson’s, and cardiovascular diseases, by helping them to monitor their health and manage their social environments, prescribed treatments, and medical appointments. It follows an integrated care model, linking both medical and social aspects, such as (mis)communication and the fragmentation of care. The development process combines existing technologies, such as smartphones, wearables, and sensors, in order to monitor vital signals or alert a caregiver in case of an accident or fall, always consulting with patients to account for their preferences.Footnote 19

As a research project, TeNDER crosses a number of different legal frameworks. Concerning the development process, we have focused on the requirements found in the GDPR, such as the legal basis for processing health data, privacy by design, and pseudonymization measures, and addressed the potential applicability of the Medical Devices Regulation. Once the results are finalized and marketed to health care organizations and caregivers, the preliminary legal findings, contained in several reports conducted through the lifecycle of the project, can serve as guidance to adopters.

In the project, we have adopted a three-step methodology to address the gaps in the regulation of eHealth technologies and to establish good practices for lawful and ethical implementation. First, a benchmark report identified applicable laws and ethical principles in abstracto and analyzed the initial concerns of the nexus between technology and applicable frameworks.Footnote 20 Building upon its findings, the three follow-up impact assessments take into consideration privacy, data protection, ethical-societal aspects, and the regulation of medical devices.Footnote 21 The final legal report, released in April 2023, provided an evaluation from legal and ethical perspectives of the technologies developed during the project, as well as recommendations for future adopters.Footnote 22

Since the development of eHealth products necessarily takes place in a controlled environment, with a limited number of participants and the roles of different providers known in advance, the legal requirements in a post-project, real-life setting may vary slightly. For example, if the pilots in the project are based on small patient groups, a data protection impact assessment (DPIA) is not always necessary as per art. 35 of the GDPR, while in a larger organizational context it may well be obligatory.Footnote 23

IV Addressing Data Protection Challenges: Lessons Learned in TeNDER
A Roles and Obligations

In a remote care scenario, the controller will be processing patients’ health data, which are considered particularly sensitive due to the data’s intimate character. Therefore, a stricter regime applies: Under art. 9, the processing of health data (and other special categories of data) is not permitted, unless one of the criteria in art. 9(2) is met. In this kind of scenario, that could be the explicit consent of the data subject unless prohibited under EU or national law (art. 9(2)(a)). Alternatively, the processing of health data is permitted if the processing is necessary for protecting the vital interests of the data subject, or another person when the data subject is incapable of giving consent (art. 9(2)(c)), such as when the patient is unconscious following an accident. Finally, processing is also permitted if the personal data have been made manifestly public by the data subject (art. 9(2)(e)), which happens when the data are already available to the caregiver or have been published on a social media platform.

In the TeNDER project, we identified consent under art. 6(1)(a) as the legal ground for processing, with explicit consent under art. 9(2)(a) serving as an exemption from the art. 9(1) prohibition on processing special categories of data. However, as many patients with Alzheimer’s and Parkinson’s diseases experience a decrease in cognitive function, ensuring that their consent is genuinely informed can be a challenge. While the GDPR contains special rules for children’s consent (art. 8 of the GDPR), there is no similar rule for obtaining informed consent from incapable adults, nor is this gap addressed in the relevant guidelines of the European Data Protection Board (EDPB).Footnote 24

To address this legal gap and ensure that patients were fully briefed, we provided them with both full-length and simplified information sheets, following bioethical recommendations contained in several (nonbinding) international documents, such as the Declaration of Helsinki and the Council of Europe Recommendation No. R(99)4 on Principles Concerning the Legal Protection of Incapable Adults.Footnote 25 While these are not requirements for consent under binding law, they contribute to better involvement of patients with Alzheimer’s in research projects.Footnote 26

In order to address data protection requirements, we must first identify the controllers and processors involved. In the TeNDER project, we employed fitness wearables in combination with RGB skeleton cameras and microphones, which were placed in different care settings – a retirement home, rehabilitation room in the hospital, day care center, etc. This meant that the user partners, such as health care organizations, were acting as data controllers, since they had determined which tools they would use (the means) and what kind of care or therapeutic outcomes (the purposes) would be achieved using those means. Technology providers, both external providers and consortium partners, acted as data processors, carrying out the instructions given by the controllers. The patients enrolled in the evaluation pilots were recruited by the health care providers and represent the data subjects in this scenario.

To ensure an appropriate techno-legal conversation, the user partners and technology providers (i.e., the controllers and processors) were asked to provide feedback by means of impact assessment questionnaires. Their feedback has informed our approach to solving the specific challenges described below.

B Specific Challenges of the TeNDER Remote Care Technology
i Data Sharing with a Third-Party Service Provider

The responsibility of the controller for ensuring compliance with the data protection requirements is complicated by the fact that many RCTs are provided by external providers. To a certain extent, the privacy risks can be mitigated by measures taken by developers and users, including patients, caregivers, and organizations. These counter-measures can help minimize the amount of data processed by external parties when opting out of data sharing is not possible. Normally, the controller and the processor will adopt relevant agreements, such as the controller-processor agreement (art. 28(3) of the GDPR); however, with external service providers that is sometimes not feasible, and the terms of use/terms of service apply instead.

Data protection in the wearables market calls for special attention as the functionalities of wearables become even more sophisticated and provide for wide-ranging data collection. Personal data of the most intimate nature – activity, moods, emotions, and bodily functions – can be combined with other sources of data, raising such potential harms as discriminatory profiling, manipulative marketing, and data breaches.Footnote 27 The lack of data privacy protections could be addressed by a greater adoption of the data protection by design principle and more transparency, especially regarding privacy policies.Footnote 28

At TeNDER pilot sites, we used fitness wearables, such as the Fitbit, to follow up on patients’ rehabilitation and daily routines by tracking metrics such as energy expenditure, sleep, and activity. The wearables were connected to smartphones and tablets, and the data from the wearables were extracted to paint a comprehensive picture of a patient’s movement.Footnote 29

Fitbit’s potential access, as the service provider, to the data on the wearable and the paired device has been identified as a challenge. The Fitbit blog provides some tips on enhancing privacy and data protection while using their services, including going incognito, editing the profile and display name, making personal stats (such as birthday, height, and weight) private, hiding badges, and adjusting for different location settings.Footnote 30 However, generally opting out of data sharing with the service provider is not possible. Considering that the TeNDER project involves very vulnerable populations, additional safeguards were adopted in the process: Setting up dedicated accounts and email addresses, using devices specifically for the project purposes, and avoiding real names or specific dates of birth as much as possible. These safeguards contribute to the implementation of the principle of data minimization, set in art. 5(1)(c) of the GDPR, which is one of the keystones of privacy and data protection by design.Footnote 31
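As an illustration of how such safeguards might be operationalized, the short Python sketch below is a hypothetical example rather than TeNDER project code; the account fields, domain, and export fields are assumptions. It provisions a study-specific wearable account that avoids real names and exact birth dates, and keeps only the data fields needed for the pilot.

```python
import secrets
from dataclasses import dataclass


@dataclass
class DeviceAccount:
    """Minimal profile for a dedicated, project-only wearable account."""
    display_name: str   # pseudonym, never the patient's real name
    contact_email: str  # dedicated project mailbox, not a personal address
    birth_year: int     # approximate year only, no exact date of birth


def provision_account(participant_index: int, approx_birth_year: int,
                      project_domain: str = "example.org") -> DeviceAccount:
    """Create a pseudonymous vendor account; the mapping from pseudonym to
    patient is held separately by the controller, off the vendor's platform."""
    pseudonym = f"pilot-{participant_index:03d}-{secrets.token_hex(3)}"
    return DeviceAccount(
        display_name=pseudonym,
        contact_email=f"{pseudonym}@{project_domain}",
        birth_year=approx_birth_year,
    )


# Purpose-bound fields only (data minimization, art. 5(1)(c) of the GDPR).
ALLOWED_FIELDS = {"steps", "sleep_minutes", "energy_expenditure"}


def minimize(raw_export: dict) -> dict:
    """Keep only the fields needed for rehabilitation follow-up, dropping
    anything else the vendor export may contain."""
    return {k: v for k, v in raw_export.items() if k in ALLOWED_FIELDS}


if __name__ == "__main__":
    print(provision_account(participant_index=7, approx_birth_year=1948))
    print(minimize({"steps": 4210, "sleep_minutes": 380, "gps_trace": "...", "friends": []}))
```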

ii Infrared Cameras and Accidental Capture

In the pilots, we plan to use infrared cameras to keep track of patients’ rehabilitation processes and to alert the caregiver should the patient fall. However, cameras can accidentally capture other people aside from the patient.

Our approach was based on the GDPR and the opinion of the EDPB.Footnote 32 A video system used to process special categories of data must be based on valid legal grounds as well as a derogation under art. 9. Since TeNDER is a research project, informed explicit consent was collected from the patients prior to the data processing. Adopters in a research setting could rely on the derogation for “scientific research purposes” under art. 9(2)(j) where obtaining explicit consent is not feasible. In this regard, it is noteworthy that the GDPR provides that the processing of personal data for scientific research purposes “should be interpreted in a broad manner, including for example technological development and demonstration” (Recital 159 of the GDPR). However, since accidental capture can affect an undefined audience, relying on their consent is not realistic. In the EDPB’s opinion,Footnote 33 the legitimate interests of the controller are suggested as an alternative legal basis. However, this basis cannot be relied on if the data subject’s rights and interests outweigh the legitimate interest. Considering that RCTs involve health data, it is difficult to see how that would meet the legitimate interests balance test.Footnote 34

To avoid accidental capture in the pilot, the infrared cameras, which process skeleton outlines without biometric data or identifying facial characteristics, will only be used in physiotherapy sessions as part of the rehabilitation room pilot.
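One way such an organizational restriction might be mirrored in software is sketched below. This is a hypothetical Python illustration, not a TeNDER interface; the schedule structure and room identifiers are assumptions. Camera frames are processed only during scheduled physiotherapy sessions in the rehabilitation room and are discarded otherwise, limiting the accidental capture of bystanders.

```python
from datetime import datetime, time

# Hypothetical schedule of physiotherapy sessions; in practice this would come
# from the care organization's planning system rather than being hard-coded.
SESSION_WINDOWS = [
    {"room": "rehab-1", "start": time(10, 0), "end": time(10, 45)},
    {"room": "rehab-1", "start": time(14, 0), "end": time(14, 45)},
]


def should_process_frame(room: str, timestamp: datetime) -> bool:
    """Return True only if the frame was captured in the rehabilitation room
    during a scheduled session; all other frames are dropped immediately."""
    t = timestamp.time()
    return any(
        w["room"] == room and w["start"] <= t <= w["end"]
        for w in SESSION_WINDOWS
    )


if __name__ == "__main__":
    print(should_process_frame("rehab-1", datetime(2022, 5, 4, 10, 15)))  # True: in session
    print(should_process_frame("rehab-1", datetime(2022, 5, 4, 12, 0)))   # False: frame discarded
```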

iii Integration with EHRs

In order to ensure a more comprehensive overview of a patient’s medical history, the development phase includes integrating EHRs into the system. Clinical history will, later in the project, be matched with data from other devices to ensure an integrated care service. In data protection terms, this contributes to the data accuracy principle, which requires that personal data be accurate and, where necessary, kept up to date, and that inaccurate personal data be erased or rectified without delay (art. 5(1)(d) of the GDPR). Where patient data are concerned, this principle is very important to ensure the appropriate treatment of the patient, especially if data are going to be fed into artificial intelligence (AI) systems.Footnote 35

One of the challenges in the EU is the diversity of EHR data formats in different member states. To address this, the Commission has adopted a “Recommendation on a European Electronic Health Record” (REHR) exchange format.Footnote 36 According to its Recital 10, the goal of the REHR is the interoperability of different EHRs, allowing information to be processed in a consistent manner between those health information systems, so that the provision of cross-border health care services (including remote care) becomes easier for the patient. REHR is a voluntary interoperability system – member states that sign up should ensure that at least the following data points are interoperable: Patient summaries, e-prescriptions and e-dispensations, laboratory results, medical imaging and records, and hospital discharge reports (point 11 of the REHR).

Since EHRs involve patient data, the link to the GDPR is clear. To set up the system in accordance with the data protection framework, the development follows the Article 29 Working Party’s guidelines on EHR.Footnote 37 Even though this document was released on the basis of the Directive 95/46, many of its principles are still relevant under the new regime. Among the recommendations of the document are strong access controls and authentication measures for the patient and the health care professional; further use of information contained in the EHR only for legitimate purposes, such as providing better treatment; and data security and data minimization measures, such as separate storage of especially sensitive data.Footnote 38
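A minimal sketch of what the “strong access controls” recommendation might look like in code is given below. It is a hypothetical Python illustration; the roles, purposes, and record fields are assumptions rather than TeNDER interfaces. Access to an EHR entry is granted only when the requester’s role and stated purpose match an explicit policy, especially sensitive data are kept out of the default response, and every decision is logged.

```python
from datetime import datetime, timezone
from typing import Optional

# Hypothetical policy: which roles may read EHR data, and for which purposes.
ACCESS_POLICY = {
    "treating_physician": {"treatment"},
    "physiotherapist": {"treatment"},
    "patient": {"own_access"},
}

audit_log = []  # simple in-memory audit trail for the sketch


def can_access(role: str, purpose: str) -> bool:
    """Return True only if the role is known and the purpose is allowed for it."""
    return purpose in ACCESS_POLICY.get(role, set())


def read_ehr(record: dict, user_id: str, role: str, purpose: str) -> Optional[dict]:
    """Purpose-bound read with an audit trail; sensitive fields stay separate."""
    allowed = can_access(role, purpose)
    audit_log.append({
        "time": datetime.now(timezone.utc).isoformat(),
        "user": user_id, "role": role, "purpose": purpose, "granted": allowed,
    })
    if not allowed:
        return None
    # Return only the non-sensitive summary; especially sensitive data
    # (stored separately, per the Working Party guidance) would require a further check.
    return {k: v for k, v in record.items() if k != "sensitive_notes"}


if __name__ == "__main__":
    record = {"patient_id": "P-017", "diagnosis_summary": "post-stroke rehabilitation",
              "sensitive_notes": "stored and governed separately"}
    print(read_ehr(record, user_id="dr-lopez", role="treating_physician", purpose="treatment"))
    print(read_ehr(record, user_id="unknown", role="marketing", purpose="analytics"))
    print(audit_log)
```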

The integration of electronic health care records is still in progress, and its legal aspects will be evaluated at the end of the project. The techno-legal collaboration on EHR integration has, so far, focused on two aspects: The mapping of applicable legal frameworks, as described in the above paragraphs, and their take-up by developers in order to build the products.Footnote 39

iv Preliminary Results: Essential Data Protection Requirements for Developing Remote Care Technologies

The main takeaway from our work in the TeNDER project so far can be summarized as a set of essential requirements for potential future developers and users of similar technologies. This is by no means an exhaustive list – as explained above, unlike real-life health care settings, research projects are a controlled environment with highly formalized procedures aimed at developing and testing technologies. In contrast, organizations that adopt RCTs for their own patients may be required to comply with additional obligations, including carrying out a data protection impact assessment as required by art. 35 of the GDPR or adopting processing agreements under art. 28(3), enabling data subject rights requests (especially the right to access) and the portability of health care records, and so on. While the system is being developed in line with the GDPR, future end-users will play a major role in complying with data protection and other sectoral or national laws. An expanded list of the requirements summarized below in Table 3.1 is available in the final legal report of the project, published in April 2023.Footnote 40

Table 3.1 Essential data protection requirements for RCTs: Preliminary results of TeNDER

Role in RCT: Developers and technology providers
Potential data protection role: Potential processors
Essential requirements:
- Design RCTs according to the principles of data protection by design and by default (art. 25 of the GDPR), especially when different devices and tools are being used, such as in the case of EHR integration. This will also operationalize the principle of data minimization: no other personal data than that which is adequate and relevant to the specific purpose will be processed.
- If EHR are fed into the system, ensure the data contained in the records are accurate and kept up to date, as per art. 5(1)(d) of the GDPR.
- Assess whether they are a processor under art. 4(8) of the GDPR (the entity that carries out the processing on behalf of the controller) and take the required measures, such as notifying the controller (the health care organization) about the involvement of other processors (third parties such as external providers of RCTs or other technologies).

Role in RCT: Users (health organizations)
Potential data protection role: Potential controllers
Essential requirements:
- Apply technical and organizational measures to ensure general compliance with data protection rules (art. 5(2) and 24 of the GDPR).
- Ensure valid consent is given. Since many of the patients enrolled in the pilots are experiencing cognitive decline, the information given must be appropriate to the patients’ level of understanding. Preferably, a trusted person should be involved in the process of obtaining consent (e.g., a family member or other caregiver).
- If using cameras or other especially intrusive technologies, consult the patients on their placement within the room, and inform them of the option to turn the device off.
- Keep data in the EHR accurate and up to date; respond to patient requests for rectification of their medical information.

Role in RCT: Users (patients)
Potential data protection role: Data subjects
Essential requirements:
- The onus to maintain data protection and security measures is on the developers and health care organizations, not on the user (the principle of data protection by default).
- When using third-party devices and opting out of data sharing is desired but not possible (e.g., in the case of wearables), use mitigation measures, such as using pseudonyms instead of names, inputting approximate date of birth, not connecting the device to social media presence, etc.
V Conclusion

What do the findings of this chapter mean for the development of RCTs? I have taken a two-pronged approach, discussing the application of selected legal provisions to RCTs in general and comparing it with the application of the same provisions to the specific technology developed as part of the TeNDER project. While it may not be possible to fully resolve the tension between particular technologies and abstract legal frameworks, knowing how to interpret the law can, in general, bring us closer to bridging the gap.

Responding to the data protection challenges of developing RCTs involves technological and organizational measures, such as using different tools in appropriate contexts (e.g., cameras in the rehabilitation room rather than in patients’ homes), as well as legal solutions (e.g., applying additional safeguards to ensure that patients’ consent is informed). What is acceptable to patients who are receiving remote care in the privacy of their own home, rather than in health care organizations, as well as what kind of technological development is feasible, should be further explored by interdisciplinary, socio-technological-legal research. Nor are all the legal questions resolved, such as the absence of legal provisions under the GDPR that safeguard the consent of persons with cognitive decline, or the uncertain role of service providers’ terms of use in ensuring that external processors will comply with the data protection rules.

The scope of this chapter is likewise limited by the scope of the project itself. Since the latter is largely concerned with development, this chapter explores the development process as well, rather than the eventual use of the products in health care organizations after the end of the project. Further, the project will be running for another year, and the results reported in this chapter are preliminary as of the spring of 2022. Legal findings will mature together with the technology, and some of the legal aspects concerning the future use of the TeNDER technologies will be clearer at the end of the development and testing phases.

4 Renegotiating the Social Contract for Use of Health Information Lessons Learned from Newborn Screening and Implications for At-Home Digital Care

Jodyn Platt and Sharon Kardia
I Introduction

At-home digital and diagnostic care has expanded in the wake of the COVID-19 pandemic. This change has set off a cascade of secondary effects including new pathways for information flows with an array of direct-to-consumer companies and products, alternative uses of information for health, and a renegotiation of space by shifting when, where, and how we interact with the health care system. This new landscape requires a reexamination of the implicit and explicit social contract between patients, clinicians, and the health delivery system. At-home digital care involves monitoring patients outside of the clinic walls and increased data sharing between traditional care providers and the private companies that build devices. For example, Cue Health offers testing for COVID-19, with the results sent to an app on a personal smartphone and to providers who can provide follow-up treatment.Footnote 1 The expansion of at-home digital care raises a number of ethical and policy questions: How is health information shared and with whom? What is the appropriate role of commercial companies? Are people who continue to receive care in clinical settings subject to the new norms of at-home care with respect to remote patient monitoring or data sharing?

Many of these questions have been raised before. Technology and circumstance have often driven change in health care, with policy playing a formative role. The electronic medical record, for example, was rapidly adopted as the American Recovery and Reinvestment Act of 2009 and the Health Information Technology for Economic and Clinical Health Act (HITECH) were passed in response to the 2008 financial crisis in the USA. These acts of legislation led to the investment of billions of dollars in health information infrastructure, and the widespread adoption of the electronic medical record meant that data could be collected, stored, and (ideally) readily shared to support learning health care systems,Footnote 2 precision health,Footnote 3 and comparative effectiveness research.Footnote 4 Subsequent policies in the 21st Century Cures Act have continued this investment and commitment to incentivizing interoperability and data sharing.

In clinical research, the Human Genome Project similarly sparked innovation in research information infrastructure that enabled shared data and biospecimens, often in the context of biobanks. The number of large population biobanks housing millions of biological samples linked to individuals’ health data has increased over the past decades in response to demand for the scientific and economic efficiencies that multi-use biobanks offer.Footnote 5 Technological advances have made it simpler, safer, and less expensive to measure vast arrays of molecular data (e.g., genome-wide chips for DNA, RNA, and methylation), as well as to catalogue and store sensitive health information (e.g., barcoding, robotic retrieval, encryption, and firewalls). In the United States, biobank repositories have emerged primarily from large health systems (e.g., Kaiser Permanente, Marshfield Clinic, Veterans Administration) and research institutions (e.g., Vanderbilt University) as natural extensions of the data collection and research already underway therein.Footnote 6

The rapid adoption of new technologies impacts health care culture, care delivery pathways, payment, patient engagement, and, ultimately, the social contract between patients and the systems that care for them. In this chapter, we examine the emergence of the Michigan BioTrust for Health in 2009 as an instance of renegotiation of the social contract between stakeholders in response to new technologies and evolutionary changes in the scientific and health enterprises. Based on prior research on the ethical and policy implications for patients that were part of the legacy system (i.e., those being asked to make the change from old to new systems of care), we review the key findings on attitudes about informed consent, notification, and partnerships with commercial companies, and consider the implications for the governance of at-home digital health care.

II From Newborn Screening to the Michigan BioTrust for Health

With a century-long history of collecting, storing, and analyzing information for surveillance and monitoring community health, public health departments are potentially major contributors to the growing number of large population biobanks. For example, the residual newborn screening bloodspots that health departments collect and store are almost fully representative of a population, as they contain blood samples from ~99.9 percent of children born in a particular state. From an epidemiological perspective, this resource is the gold standard for population health assessment and research, given its completeness and lack of ascertainment bias. If made available or even marketed as public health biobanks, these repositories could contribute to robust population health studies when linked to a wide range of public health surveillance databases. And yet, the repurposing of newborn screening bloodspots to include research use challenges the expectations under which they were collected.

In 2009, the state of Michigan endeavored to pursue expanded uses of newborn screening bloodspots by opening the Michigan BioTrust for Health as a steward organization, tasked with navigating the data governance challenges inherent to the large-scale aggregation of medical information. Michigan’s BioTrust for Health holds bloodspot cards for over four million children born in the state of Michigan and is one of the largest biobanks in the USA. The BioTrust is run through a nonprofit organization, the Michigan Neonatal Biobank, providing health researchers with access to de-identified samples and information, contingent on scientific review, institutional review board (IRB) approval, and payment. The biobank comprises a retrospective (“legacy”) collection of approximately four million bloodspot cards stored from babies born in Michigan between July 1984 and April 2010 – before consent mechanisms were put in place – along with a prospective collection of dried bloodspots added to the biobank since its formal inception in Fall 2010, and included in the research pool only with a written consent.Footnote 7

III Consumer Preferences for the Use of Newborn Screening Bloodspots and Health Information: Implications for Digital Health at Home

Over the course of approximately five years (2009–2015), we conducted several empirical studies assessing consumer perspectives on the uses of newborn screening bloodspots, including preferences for consent and notification. This work focused on the so-called “legacy collection” of bloodspots held by the Michigan Department of Community Health (MDCH) and collected prior to policies being put in place for obtaining consent for research uses. There were approximately four million people with bloodspots in the BioTrust who fell into this group. We held ten community meetings across the state of Michigan (n = 393),Footnote 8 met with college students at 20 campuses (n = 2,010),Footnote 9 and conducted an online deliberative jury (n = 67).Footnote 10 We also conducted surveys, including three cohorts of the State of the State Survey (n = 2,618) and a simulated dynamic consent process (n = 187).Footnote 11 To try to reach a greater proportion of people in Michigan, we conducted a Facebook campaign that reached over 1.8 million people.Footnote 12 In this section of the chapter, we draw on the published work in this area, as well as our own reflections on it nearly ten years later, to describe what we learned about three key issues that are likely to shape ethical and policy assessments for at-home digital care: (1) Preferences for consent and notification, (2) relationships with commercial companies, and (3) trust and governance.

A Consent and Notification

Our findings with respect to expectations for consent and notification were consistent throughout our work on the BioTrust.Footnote 13 We found that a clear majority of people would like some form of notification. With respect to consent, preferences were divided. When offered a choice between providing a one-time “broad consent” that allows for unspecified future uses versus providing consent for each use of bloodspots, we found that about half of the people we interviewed or surveyed prefer a one-time broad consent and about half want to provide informed consent for specific uses of their information. These findings were consistent with other research on preferences for consent in similar activities, such as large-scale, longitudinal cohort studies.Footnote 14 We also found that feelings of respect and trust predicted preferences for broad versus specific consent. Specifically, those who see specific informed consent as important also see consent as an important sign of respect and may have less trust in the health system, while those who do not feel the need to provide consent every time are more trusting of the health system.

Expectations for informed consent for the collection of data for research are well-established, while there are none for data used in the context of public health or quality improvement. Notification of data sharing is addressed in the Health Insurance Portability and Accountability Act (HIPAA) regulations, but, in practice, it is a black box for consumers. Developing, implementing, and maintaining consent for research is one of the greatest practical barriers in creating public health biobanks or repurposing the use of public health data and biological samples. Operationalizing consent depends on whether proposed research uses already-existing samples and databases, or whether the research requires samples and data to be collected prospectively. For newborn screening, it would be impracticable for many states to obtain individual consent given the age of the data or the number of samples. In Michigan, the federal Office of Human Research Protections advised the MDCH that its storage and use of newborn screening bloodspots constituted human subjects research necessitating IRB review. The MDCH IRB stated that new samples would need documentation of consent. The existing four million samples could be issued a waiver of consent based on the impracticability of contacting subjects individually, contingent upon a good-faith effort to inform the public that the repository exists and that there are clear processes for those who choose to withdraw.

Digital health at home faces a similar quagmire of ethical and pragmatic challenges to implementing consent or notification. There are complex contingencies to the social license that purveyors of digital health face; trust in their services depends on the service being provided, their consumer base, the quality of the product, and the risk associated with faulty products.Footnote 15 At present, informed consent in digital applications is reduced to the notification of privacy policies. Cue Health, for example, which rapidly specialized in at-home COVID-19 testing and services, addresses, in its privacy policy, the collection, use, sharing, and privacy of data gathered from patients through its website, app, and testing services.Footnote 16 Updates are posted on the website, meaning consumers need to check for updates rather than being notified directly. Consent is further complicated by the complex set of relationships required to deliver care and the limited responsibilities of any one actor. The Cue Health privacy policy (typical of this type of service and application) notes that they may link to outside websites and services for which they are not responsible. This, in essence, leaves consumers themselves responsible for tracking notifications from one use and user to the next. Our experience with the BioTrust suggests this is not sufficient and that the future of digital health at home would benefit from greater levels of specificity and higher standards for quality of informed consent and notification that account for the full spectrum and scope of data sharing.

B Comfort with Commercial Companies

One factor that drove the expanded use of newborn screening bloodspots for research is the potential use of the resource by commercial companies. The use of newborn screening bloodspots for research was hailed as a goldmine.Footnote 17 Our research has revealed the desire for greater transparency about partnerships with commercial companies, calling for policies of “disclosure plus” that take extra measures to communicate about the commercial aspects of research.Footnote 18 In our qualitative work, we have found that many people are acutely aware of commercial partnerships as a reality of health systems in the United States. Beyond this common recognition, there were two attitudes about this aspect of the biomedical enterprise that often lay in tension with one another. First, there were those who already had a mistrust of the system and considered profit-seeking as evidence that the government and/or the medical community could not be trusted. Second, there were those who saw commercial partnerships as a benefit to society that should be an object of investment. For both groups, demonstrating the benefits of sharing health information, and to whom they accrue, is a way for the public health system to be accountable for the trust placed in it as a good steward of information. Our experience was consistent with the findings in contemporary literature on the issue of the commercialization of biobanks.Footnote 19

For biobanks and, more recently, health care systems, mingling the business aspects of information with expectations of responsible stewardship has proved volatile. In managing public health information as a marketable biobank, the relationship of a health department to the public becomes a critical consideration. Accusations of the Texas Department of Health bartering with newborn screening bloodspots still resonate today.Footnote 20 The University of Chicago faced litigation after it partnered with Google to analyze health records to develop digital diagnostics.Footnote 21 Memorial Sloan Kettering entered a deal giving Paige.AI an exclusive license to its archive of twenty-five million tissue slides and the associated pathology reports, causing an “uproar”: Concerns over the commercialization of patient data – even if it is anonymized – renewed interest in the scope and significance of conflicts of interest.Footnote 22 Rational people could argue for both sides of each of these cases. The case against the University of Chicago, for example, was eventually dismissed, and Sloan Kettering issued a statement clarifying the relationship between the institution and Paige.AI.Footnote 23

Each of these cases suggests that navigating in the “gray zone” risks, at a minimum, a betrayal of trust, and serves as a harbinger of what may come for the companies and health systems moving out of the clinic and laboratory and into the home. Commercial companies are an integral part of the expansion of at-home care that is digital and diagnostic, but a policy of “disclosure plus” for at-home digital health is complicated given the nature of the digital health ecosystem and the lack of clear chains of accountability. Regulatory modernization will need to be a priority as partnerships become more ubiquitous. Novel strategies for licensing data, for example, might be pursued to give consumers greater control over how their health information is used and how profits are shared to promote the use of data as a public good. Novel policy regimes such as this can address the lack of transparency about commercial data use. They can also promote autonomy and respect for persons – the goal of informed consent – in an environment in which informed consent is not feasible or practicable.

C Trust and Governance

The use of newborn screening bloodspots for research demanded a shift in the terms of use. Such renegotiations have happened before – and will continue. Experience suggests that such shifts are motivated by a promise to improve public health and health care delivery systems, but they also raise questions of equity and challenge the public’s trust in the biomedical enterprise. The seminal case settled by Arizona State University and the Havasupai Indian Tribe underscores the importance of communicating the scope and nature of the use of samples and data to research participants.Footnote 24 At issue was the secondary use of data and samples without the permission or knowledge of the participants, a fact that deeply offended tribal leaders, leading not only to a lawsuit, but also to an effective moratorium on medical research in that community and a rift in a partnership that had taken decades to build.Footnote 25 A distrust of research and public health continues for many in African American communities, where past public health programs, such as sickle cell screening in the 1970s, were implemented unjustly. A failure to invest in appropriate education about sickle cell anemia resulted in genetic discrimination in the form of discriminatory and stigmatizing marriage laws.Footnote 26 In our work with communities in Michigan, we often heard skepticism that key stakeholders would be included: For example, “Can I truly trust you? African American people are always last to know. I want involvement and information.” We also heard a concern about a slippery slope of hidden data collection and use: “What other lab specimens are being taken without the knowledge of the person being tested? This will end as a trust issue….”Footnote 27

Public health biobanks that use newborn screening information and biospecimens are unique in their inclusivity, and yet the policies and practices that stem from the use of health information may be discriminatory and inequitable. At the same time, the collection of data when it is used for health often faces fewer barriers and is treated as exceptional when compared to other types of information. Public health data are often collected without consent, but their collection, as an activity of a public institution, is accountable as such; expanding the use of data to include research and research institutions demands a new layer of accountability and a demonstration of the trustworthiness of both the stewards (i.e., public health bodies) and the users of health information.

The risk associated with collecting information without ongoing governance to ensure fair use of that information over time is exemplified by the 2009 Beleno v. Texas Department of State Health Services case, in which the department settled by agreeing to destroy its repository of five million bloodspots collected as part of its newborn screening program.Footnote 28 Reporters reviewing nine years’ worth of emails at the health department found evidence that the department suffered from a lack of guidance or policies to handle novel requests for biobanked data.Footnote 29

Digital health operates as a market that lacks clear governance and ethical guidelines. Trustworthiness of the enterprise as a whole is a goal, but it is unclear who should be involved in oversight. The limits on any one actor’s accountability leave consumers with the responsibility of tracking privacy policies from one user to the next. Innovation of traditional governance mechanisms is needed to temper special interests and meaningfully manage conflicts of interest. Obtaining meaningful community awareness would require an investment in outreach and education for large, diverse populations through novel governance structures that engage the range of stakeholders and actors in the digital health ecosystem. This provides an opportunity to apply principles that emphasize equity and inclusion such as “centering at the margins,”Footnote 30 that is, including minoritized people and interests.

IV Conclusion

The experience of biobanking residual newborn screening bloodspots matters not only because these repositories are vast, valuable, and politically volatile, but also because they are harbingers of the ethical and policy issues that will continue to arise in this new era of integrated health information technology and digital health at home. Learning from the public about data and biospecimen use in the context of the BioTrust suggests that the future of digital health at home would benefit from clear expectations and mechanisms for consent and notification. Those who prefer greater involvement in informed consent also see consent as an important sign of respect and may have less trust in the health system. Furthermore, demonstrating the benefits of sharing health information, and to whom they accrue, is a way for information systems – be they public or private – to be accountable for the trust placed in them as good stewards of information. Novel strategies for licensing data, for example, might be pursued to give consumers greater control over how their health information is used and how profits are shared to promote the use of data as a public good.

Both newborn screening and at-home digital health care are examples of data-generating activities that create information that is of potential value beyond its original intended use. For newborn screening, public health interests justified the original data collection, while research benefits justified the expanded use of those bloodspots. In the case of at-home digital health care, launching digital modalities involves a wider range of entities, including commercial consumer technology companies, and a broader scope for data sharing. Public health biobanking has raised issues for consumers with respect to consent and notification, the role of commercial companies, and sustainable governance. Underlying these issues are questions of how to sufficiently notify consumers about the use of their data, how to negotiate the commercial interests in their data, and how to engage and empower the public as a key stakeholder. The issues raised around newborn screening biobanks presented in this chapter suggest that governance should include policies for access, conflicts of interest, and equity, while investing in outreach and education so that patients are informed and transparency is both meaningful and maintained. As a rapidly expanding area of health care, digital health at home has an opportunity to create new avenues for access and equity – an opportunity that may be honored first by assessing its guiding principles, and then by creating systems of governance and engagement that improve upon the current system of care.

Footnotes

1 In the Medical Privacy of One’s Own Home Four Faces of Privacy in Digital Home Health CareFootnote *

* The author thanks the Health Policy and Bioethics Consortium of Harvard Medical School and the Harvard Law School Petrie-Flom Center for the opportunity to receive feedback on an early draft of this chapter at the February 11, 2022 virtual meeting entitled, “Diagnosing Alzheimer’s with Alexa?” The author has no conflicts to disclose.

1 See Eric R. Claeys, Kelo, the Castle, and Natural Property Rights, in Private Property, Community Development, and Eminent Domain 35, 35–36 (Robin Paul Malloy ed., 2008) (discussing the metaphor of the home as one’s castle).

2 Daniel J. Solove, Conceptualizing Privacy, 90 Calif. L. Rev. 1087, 1147 (2002).

3 Julie Cohen, Privacy, Visibility, Transparency, and Exposure, 75 U. Chi. L. Rev. 181, 190–91 (2008); Solove, supra note 2, at 1137.

4 Ateeq Mubarik & Arshad Muhammad Iqbal, Holter Monitor, StatPearls (2022), www.ncbi.nlm.nih.gov/books/NBK538203/. See also Moises Rivera-Ruiz et al., Einthoven’s String Galvanometer: The First Electrocardiograph, 35 Tex. Heart Inst. J. 174 (2008).

5 Mubarik & Iqbal, supra note 4.

6 Jack M. Balkin, Information Fiduciaries and the First Amendment, 49 UC Davis L. Rev. 1183, 1205 (2016).

7 Shulman v. Group W Prods., Inc., 955 P.2d 469 (Cal. 1998).

8 Balkin, supra note 6, at 1187.

9 Barry R. Furrow et al., Health Law: Cases, Materials and Problems (8th edn.) 117 (2018).

10 Id.

11 See, for example, Am. Med. Ass’n, Code of Medical Ethics Opinion 3.2.1: Confidentiality, https://code-medical-ethics.ama-assn.org/ethics-opinions/confidentiality.

12 See P. Jon White & Jodi Daniel, Privacy and Security Solutions for Interoperable Health Information Exchange: Report on State Medical Record Access Laws (2009), www.healthit.gov/sites/default/files/290-05-0015-state-law-access-report-1.pdf (providing a multistate survey of various aspects of state medical records laws).

13 Furrow et al., supra note 9, at 161.

14 Jack M. Balkin, The Three Laws of Robotics in the Age of Big Data, 78 Ohio State L. J. 1217, 1221 (2017).

15 Scott R. Peppet, Regulating the Internet of Things: First Steps Toward Managing Discrimination, Privacy, Security, and Consent, 93 Tex. L. Rev. 85, 145 (2014).

16 Laura K. Donahue, The Fourth Amendment in a Digital World, 71 NYU Ann. Surv. Am. L. 553, 562–68 (2017).

17 Solove, supra note 2, at 1140.

18 Id.

19 Id. at 1109–12. See also Ferdinand David Schoeman, Privacy: Philosophical Dimensions of the Literature, in Philosophical Dimensions of Privacy: An Anthology 1, 3 (Ferdinand David Schoeman ed., 1984).

20 Paul M. Schwartz, Internet Privacy and the State, 32 Conn. L. Rev. 815, 820 (2000).

21 5 USC § 552a(a) and (d); Priv. Prot. Study Comm’n, Personal Privacy in an Information Society 280 (1977), https://archive.epic.org/privacy/ppsc1977report/.

22 See, for example, Federal Policy for the Protection of Human Subjects of Biomedical Research (“Common Rule”), 45 CFR §§ 46.101–124 (2018); see, for example, Neil Richards, The Information Privacy Law Project, 94 Geo. L.J. 1087 (2006) and David Lyon, Surveillance Society: Monitoring Everyday Life, 33–35, 114–18 (2001).

23 Brooke Auxier et al., Americans’ Attitudes and Experiences with Privacy Policies and Laws, Pew Rsch. Ctr. (2019), www.pewresearch.org/internet/2019/11/15/americans-attitudes-and-experiences-with-privacy-policies-and-laws/.

24 Anita L. Allen, Privacy-as-Data Control: Conceptual, Practical, and Moral Limits of the Paradigm, 32 Conn. L. Rev., 861, 867 (2000).

25 See Barbara J. Evans, The HIPAA Privacy Rule at Age 25: Privacy for Equitable AI, 50 Fla. State U. L. Rev. 781–82 (2023) (citing investigative reports on dummy thermostats).

26 Gergely Biczók & Pern Hui Chia, Interdependent Privacy: Let Me Share Your Data, in Financial Cryptography and Data Security 338 (Ahmad-Reza Sadeghi ed., 2013).

27 Marwan K. Tayeh et al., The Designated Record Set for Clinical Genetic and Genomic Testing: A Points to Consider Statement of the American College of Medical Genetics and Genomics (ACMG), 25 Genet. Med. (2022).

28 Krystal S. Tsosie et al., Overvaluing Individual Consent Ignores Risks to Tribal Participants, 20 Nat. Revs. Genetics 497 (2019).

29 Cynthia Dwork et al., Calibrating Noise to Sensitivity in Private Data Analysis, in Theory of Cryptography. TCC 2006. Lecture Notes in Computer Science vol. 3876, 265 (S. Halevi & T. Rabin eds., 2006).

30 James J. Heckman, Selection Bias, in Encyclopedia of Social Measurement (2005).

31 Kayte Spector-Bagdady, Governing Secondary Research Use of Health Data and Specimens: The Inequitable Distribution of Regulatory Burden Between Federally Funded and Industry Research, 8 J. L. & Biosciences 1, 2–3 (2021); Reshma Jagsi et al., Perspectives of Patients with Cancer on the Ethics of Rapid-Learning Health Systems, 35 J. Clinical Oncology 2315, 2321 (2017); Christine L. M. Joseph et al., Demographic Differences in Willingness to Share Electronic Health Records in the All of Us Research Program, 29 J. Am. Med. Informatics Ass’n 1271 (2022).

32 US Gov’t Accountability Off., GAO-21-7SP, Artificial Intelligence in Health Care: Benefits and Challenges of Technologies to Augment Patient Care 24 (2020).

33 The White House Off. of Sci. & Tech. Pol’y, Blueprint for an AI Bill of Rights: Making Automated Systems Work for the American People 5, 26–27 (2022), www.whitehouse.gov/ostp/ai-bill-of-rights/.

34 Id.

35 See Brian Buckley et al., Selection Bias Resulting from the Requirement for Prior Consent in Observational Research: A Community Cohort of People with Ischaemic Heart Disease, 93 Heart 1116 (2007); Sharyl J. Nass et al. (eds.), Comm. on Health Rsch. & the Priv. of Health Info.: The HIPAA Priv. Rule, Beyond the HIPAA Privacy Rule: Enhancing Privacy, Improving Health Through Research 209–14 (2009), www.nap.edu/catalog/12458.html (surveying studies of consent and selection bias).

36 45 CFR pts. 160 and 164.

37 Neil M. Richards & Daniel J. Solove, Privacy’s Other Path: Recovering the Law of Confidentiality, 96 Geo. L.J. 123 (2007).

38 See supra notes 9–13 and accompanying text.

39 See 45 CFR § 160.102 (2018) (providing that the HIPAA regulations, including the Privacy Rule, apply to health care providers, such as physicians, clinics, hospitals, laboratories, and various other entities, such as insurers, that transmit “any health information in electronic form in connection with a transaction covered by this subchapter [the Administrative Simplification provisions of HIPAA]” and to their business associates); see also id. § 160.103 (defining the terms “covered entity” and “business associate”).

40 See Furrow et al., supra note 9.

41 See Helen Nissenbaum, Privacy in Context: Technology, Policy, and the Integrity of Social Life (2010); Helen Nissenbaum, Privacy as Conceptual Integrity, 79 Wash. L. Rev. 119 (2004); Adam Barth et al., Privacy and Contextual Integrity: Framework and Applications, in Proceedings of the 2006 IEEE Symposium on Security and Privacy 184 (2006).

42 See Evans, supra note 25, at 749–50, tbl. 1 (elaborating these norms). See also Letter from William W. Stead, Chair, Nat’l Comm. on Vital & Health Stat., to Hon. Sylvia M. Burwell, Secretary, U.S. Dep’t of Health & Hum. Servs. app. A at 15–19 (November 9, 2016), www.ncvhs.hhs.gov/wp-content/uploads/2013/12/2016-Ltr-Privacy-Minimum-Necessary-formatted-on-ltrhead-Nov-9-FINAL-w-sig.pdf (https://perma.cc/J7DF-X9VP).

43 45 CFR § 164.502(d) (2013); see 45 CFR § 160.103 (defining “protected health information” (PHI, the information that the HIPAA Privacy Rule protects) as “individually identifiable health information” and defining the term “health information” for the purposes of the HIPAA Privacy Rule). See 45 CFR § 164.502(a)(1)(iv) (allowing PHI to be released with individual authorization). See also id. at § 164.508 (describing the requirements for a valid individual authorization, which is HIPAA’s term for a consent).

44 Evans, supra note 25, at 749–50, tbl. 1.

45 See Regulation 2016/679 of the European Parliament and of the Council of April 27, 2016 on the Protection of Natural Persons with Regard to the Processing of Personal Data and on the Free Movement of Such Data and Repealing Directive 95/46/EC, OJ 2016 No. L 119, 1. See GDPR art. 6 (requiring consent for the processing of personal data, id. § 1(a), but allowing unconsented processing for various purposes such as legal compliance, “to protect the vital interests of the data subject or another natural person,” for tasks “carried out in the public interest,” see id. §§ 1(b)–(f), and allowing member states to specify provisions “to adapt the applications of the rules” in some of these circumstances). See GDPR art. 9 (addressing the processing of “special categories of personal data,” which include health data and requiring consent, id. § 2(a), but allowing member states to establish different conditions and safeguards for data used in “preventive or occupational medicine, for the assessment of the working capacity of the employee, medical diagnosis, the provision of health or social care or treatment, or the management of health or social care systems and services,” id. § 2(h), and for public health, id. § 2(i), and for public interest purposes including scientific research, id. § 2(j)). See also GDPR art. 89 (allowing member state law to derogate from the various rights provided by the GDPR when those “rights are likely to render impossible or seriously impair the achievement” of various public-interest goals including scientific research).

46 Johan Hansen et al., Assessment of the EU Member States’ Rules on Health Data in the Light of GDPR, Eur. Comm’n, Specific Contract No. SC 2019 70 02 (in the context of the Single Framework Contract Chafea/2018/Health/03) (2021), https://health.ec.europa.eu/system/files/2021-02/ms_rules_health-data_en_0.pdf.

47 See 45 CFR §§ 160.202–.203 (Privacy Rule preemption provisions).

48 See Am. Health Laws. Ass’n, Health Law Practice Guide § 4:11 (2022).

49 See, for example, Pyramid Life Ins. Co. v. Masonic Hosp. Ass’n of Payne Cty., 191 F. Supp. 51 (W.D. Okla. 1961).

50 Am. Health Laws. Ass’n, supra note 48.

51 See Jessica L. Roberts, Progressive Data Ownership, 93 Notre Dame L. Rev. 1105, 1128 (2018) (citing five states’ genetic data ownership statutes).

52 See supra notes 9–13 and accompanying text.

53 See US Dep’t of Health & Hum. Servs., Direct Liability of Business Associates (July 16, 2021), www.hhs.gov/hipaa/for-professionals/privacy/guidance/business-associates/factsheet/index.html (discussing 2013 revisions to the HIPAA Privacy Rule).

54 See Jim Hawkins et al., Non-Transparency in Electronic Health Record Systems, in Transparency in Health and Health Care in the United States 273, 281 (Holly Fernandez Lynch et al. eds., 2019).

55 Charles Duhigg, How Companies Learn Your Secrets, The New York Times Magazine (February 16, 2012).

56 See supra note 45 (GDPR); California Consumer Privacy Act of 2018, Cal. Civ. Code §§ 1798.100–.199.

57 See generally Sandra Wachter & Brent Mittelstadt, A Right to Reasonable Inferences: Re-thinking Data Protection Law in the Age of Big Data and AI, 2019 Colum. Bus. L. Rev. 494.

58 Id. at 494–95.

59 See 45 CFR §§ 164.524 and .526.

2 Patient Access to Health Device Data: Toward a Legal Framework

1 See, for example, Erin Brodwin, Remote Monitoring Is Rapidly Growing – and a New Class of Patient-Consumer Is Driving the Shift, STAT (September 16, 2020), www.statnews.com/2020/09/16/remote-patient-monitoring-stat-report/; Sarah Krouse, Covid-19 Pandemic Drives Patients – and Deal Makers – to Telemedicine, The Wall Street Journal (August 25, 2020), www.wsj.com/articles/covid-19-pandemic-drives-patients-to-telemedicine-deal-makers-too-11598358823.

2 Fortune Business Insights, Wearable Medical Devices Market Size Worth USD 195.57 Bn by 2027, GlobeNewswire (February 3, 2022), www.globenewswire.com/news-release/2022/02/03/2378221/0/en/Wearable-Medical-Devices-Market-Size-worth-USD-195-57-Bn-by-2027-With-stunning-26-4-CAGR.html.

3 I. Glenn Cohen, Sara Gerke, & Daniel B. Kramer, Ethical and Legal Implications of Remote Monitoring of Medical Devices, 98 Milbank Q. 1257, 1259 (2020).

4 See, for example, id. at 1266–67; John T. Wilbanks & Eric J. Topol, Stop the Privatization of Health Data, 535 Nature 345, 347 (2016); Elizabeth A. Rowe, Sharing Data, 104 Iowa L. Rev. 287 (2018).

5 Quinn Grundy et al., Tracing the Potential Flow of Consumer Data: A Network Analysis of Prominent Health and Fitness Apps, 19 J. Med. Internet Res. e233, at 4 (2017).

6 Smart Health: Empowering the Future of Mobile Applications, Hearing Before the Subcomm. on Rsch. & Tech. of the H. Comm. on Sci., Space and Tech., 114th Cong. 43–44 (2016) (testimony of Howard Look).

7 Jason Koebler, Why Sleep Apnea Patients Rely on a CPAP Machine Hacker, Vice News (November 15, 2018), www.vice.com/en/article/xwjd4w/im-possibly-alive-because-it-exists-why-sleep-apnea-patients-rely-on-a-cpap-machine-hacker.

8 See, for example, press release, US Dep’t of Health and Human Svcs. (HHS), Five Enforcement Actions Hold Healthcare Providers Accountable for HIPAA Right of Access (November 30, 2021), www.healthit.gov/sites/default/files/non-covered_entities_report_june_17_2016.pdf (on the HHS Office for Civil Rights’ HIPAA Right of Access Initiative).

9 HHS, Examining Oversight of the Privacy & Security of Health Data Collected by Entities Not Regulated by HIPAA (2020), https://perma.cc/2JZU-DQJF.

10 See generally Charlotte Blease, I. Glenn Cohen, & Sharona Hoffman, Sharing Clinical Notes: Potential Medical-Legal Benefits and Risks, 327 JAMA 717 (2022). For example, the US Copyright Office has observed that people with sleep apnea use “CPAP machine data to adjust their machines and enhance their treatment and health.” US Copyright Office, Section 1201 Rulemaking: Eighth Triennial Proceeding to Determine Exemptions to the Prohibition on Circumvention 143 (October 2021) [hereinafter Eighth Triennial], https://cdn.loc.gov/copyright/1201/2021/2021_Section_1201_Registers_Recommendation.pdf. Patients cannot always “rely on the data directly provided on the machines’ displays because the algorithms in CPAP machines could provide inaccurate readings.” Id.

11 See, for example, Sharona Hoffman, Access to Health Records: New Rules Another Step in the Right Direction, JURIST (February 20, 2019), www.jurist.org/commentary/2019/02/sharona-hoffman-health-records-proposal/.

12 See Fed. Trade Comm’n, Nixing the Fix: An FTC Report to Congress on Repair Restrictions 41–42 (2021), www.ftc.gov/reports/nixing-fix-ftc-report-congress-repair-restrictions.

13 Mary A. Majumder & Amy L. McGuire, Data Sharing in the Context of Health-Related Citizen Science, 48 J.L. Med. & Ethics 167 (2020); Sharona Hoffman, Citizen Science: The Law and Ethics of Public Access to Medical Big Data, 30 Berkeley Tech. L.J. 1741, 1755 (2015).

14 See Melanie Swan, The Quantified Self: Fundamental Disruption in Big Data Science and Biological Discovery, 1 Big Data 85, 91–92 (2013).

15 See Wilbanks & Topol, supra note 4.

16 See Melissa J. Landrum & Brandi L. Kattman, ClinVar at Five Years: Delivering on the Promise, 39 Hum. Mutation 1623, 1625 (2018); ClinVar Submissions, Nat’l Lib. Med. (last visited April 19, 2022), www.ncbi.nlm.nih.gov/clinvar/submitters/.

17 Katherine J. Strandburg, Brett M. Frischmann, & Michael J. Madison, The Knowledge Commons Framework, in Governing Medical Knowledge Commons 9 (Katherine J. Strandburg, Brett M. Frischmann, & Michael J. Madison eds., 2017).

18 See, for example, Jorge L. Contreras, Leviathan in the Commons: Biomedical Data and the State, in Governing Medical Knowledge Commons 19 (Katherine J. Strandburg, Brett M. Frischmann, & Michael J. Madison eds., 2017) (on government’s role in fostering public medical databases); Critical Path Inst., Rare Disease Cures Accelerator-Data and Analytics Platform, https://c-path.org/programs/rdca-dap/ (exemplary FDA-funded effort).

19 See Rowe, supra note 4, at 313.

20 Sanket S. Dhruva et al., Real-World Evidence: Promise and Peril for Medical Product Evaluation, 43 PT 464, 469 (2018).

21 See, for example, Barbara J. Evans, Genomic Data Commons, in Governing Medical Knowledge Commons 74, 81 (Katherine J. Strandburg, Brett M. Frischmann, & Michael J. Madison eds., 2017) (on the “data access challenge”).

22 See Eric von Hippel, Democratizing Innovation 77–91 (2005).

23 David Blumenthal, A Big Step Toward Giving Patients Control over Their Health Care Data, Harvard Business Review (March 15, 2019), https://hbr.org/2019/03/a-big-step-toward-giving-patients-control-over-their-health-care-data.

24 See Wilbanks & Topol, supra note 4.

25 By “access” to their own data, we mean not just patients’ ability to view their own data, but also their ability to download it, to archive it, and to share it.

26 See 45 CFR § 164.524(c)(4) (providing for a “reasonable, cost-based fee” for patient data access under HIPAA).

27 See Cohen et al., supra note 3, at 1282–83.

28 FDA Issues New Alert on Medtronic Insulin Pump Security, Healthcare IT News (July 1, 2019), www.healthcareitnews.com/news/fda-issues-new-alert-medtronic-insulin-pump-security; Joe Carlson, FDA Says Pacemakers, Glucose Monitors and Other Devices Could Be Vulnerable to Hackers, Star Tribune (March 3, 2020), www.startribune.com/fda-says-pacemakers-glucose-monitors-and-other-devices-could-be-vulnerable-to-hackers/568452772/.

29 Joseph Cox, How the US Military Buys Location Data from Ordinary Apps, Vice News (November 16, 2020), www.vice.com/en/article/jgqm5x/us-military-location-data-xmode-locate-x; Alfred Ng & Jon Keegan, Who Is Policing the Location Data Industry?, The Markup (February 24, 2022), https://themarkup.org/ask-the-markup/2022/02/24/who-is-policing-the-location-data-industry.

30 See, for example, Larry Magid, Devices Measure Quantity, Quality of Sleep, Mercury News (December 21, 2018), www.mercurynews.com/2018/12/20/magid-devices-measure-quantity-quality-of-sleep/.

31 To be sure, patient access to these types of information would be useful in some situations, such as testing the reliability of manufacturers’ invented health “scores.” The nature of proprietary rights over device source code and manufacturer-specific computed data is an important area for further research.

32 See, for example, Timo Minssen & Justin Pierce, Big Data and Intellectual Property Rights in the Health and Life Sciences, in Big Data, Health Law, and Bioethics 307 (I. Glenn Cohen et al. eds., 2018); Rowe, supra note 4, at 299–301; Comments of AdvaMed and Medical Imaging and Technology Alliance opposing the 1201 exemption at 5 (2015), https://cdn.loc.gov/copyright/1201/2015/comments-032715/class%2025/AdvaMed_Class25_1201_2014.pdf [hereinafter AdvaMed-MITA 2015]. Cf. Med. Imaging & Tech. All. v. Libr. of Cong., No. 1:22-cv-00499 (D.D.C. filed February 25, 2022) (ongoing litigation alleging, inter alia, that the US Copyright Office violates copyright law by authorizing repair personnel to circumvent technical “locks” on health devices) [hereinafter MITA litigation].

33 See Feist Publ’ns, Inc. v. Rural Tel. Serv. Co., 499 US 340, 345 (1991); US Copyright Office, Section 1201 Rulemaking: Sixth Triennial Proceeding to Determine Exemptions to the Prohibition on Circumvention 393 (October 2015). See also, for example, Midler v. Ford, 849 F.2d 460 (9th Cir. 1988) (holding voices uncopyrightable); US Copyright Office, In re Second Request for Reconsideration for Refusal To Register Equilibrium (2020), www.copyright.gov/rulings-filings/review-board/docs/equilibrium.pdf, at 5 (concluding fingerprints are uncopyrightable).

34 See Eighth Triennial, supra note 10.

35 Id. But see MITA litigation, supra note 32 (alleging that the US Copyright Office erred in permitting repair personnel to do so).

36 See, for example, 18 USC § 1839(3)(B) (federal definition); UTSA § 1(4) (definition common in state law).

37 Id.

38 Camilla Alexandra Hrdy, The Value in Secrecy, 91 Fordham L. Rev. 557, 596 (2022).

39 Id. See also, for example, Yield Dynamics, Inc. v. TEA Sys. Corp., 154 Cal. App. 4th 547, 561 n.13, 564–65, 566–67 (2007) (holding a company’s software not a trade secret, despite secrecy and economic value, because the software was built on a combination of open-source and secret code and the company had not proven that economic value derived from continued secrecy).

40 See, for example, AdvaMed-MITA 2015, supra note 32, at 5–6 (asserting trade secret rights in the source code in medical devices).

41 Eur. Med. Agency, External Guidance on the Implementation of the European Medicines Agency Policy on the Publication of Clinical Data for Medicinal Products for Human Use (2018) [hereinafter EMA], https://perma.cc/28UL-6ZQK.

42 Id. at 52.

43 Id. at 54.

44 Eur. Med. Agency, Policy on Publication of Clinical Data for Medicinal Products for Human Use Annex 3 (2019) [hereinafter EMA 2019], www.ema.europa.eu/en/documents/other/european-medicines-agency-policy-publication-clinical-data-medicinal-products-human-use_en.pdf; Regulation 536/2014, of the European Parliament and of the Council of April 16, 2014 on Clinical Trials on Medicinal Products for Human Use and Repealing Council Directive 2001/20/EC Text with EEA relevance, O.J. (L 158) 1, 1–76.

45 EMA 2019, supra note 44, at Annex 3.

46 EMA, supra note 41, at 58. The NIH apparently shares the EMA’s view. See 81 Fed. Reg. 64,982, 64,996–97 (stating that “trial results in summary form” “can be provided without disclosing trade secret or confidential commercial information”).

47 Manufacturers tend to emphasize the policy argument that innovation could suffer without strengthened intellectual property protection of some sort – perhaps acknowledging that existing doctrine does not prohibit patients from accessing patient data. See, for example, 2015 comments of AdvaMed opposing the 1201 exemption, https://cdn.loc.gov/copyright/1201/2015/comments-032715/class%2027/AdvaMed_Class27_1201_2014.pdf, at 7 (asserting vaguely that patient access “poses trade secrecy concerns” while insisting “trade secrets may be the only viable form of protection for companies conducting research and development in this area”).

48 See Hrdy, supra note 38, at 7–8 (discussing “type failures”); Sharon Sandeen, Out of Thin Air: Trade Secrets, Cybersecurity, and the Wrongful Acquisition Tort, 19 Minn. J.L. Sci. & Tech. 373 (2018); Amy Kapczynski, The Public History of Trade Secrets, 55 U.C. Davis L. Rev. 1367, 1429–36 (2022).

49 Ruckelshaus v. Monsanto Co., 467 U.S. 986, 1011 n.15 (1984). See also Pub. Citizen Health Rsch. Grp. v. FDA, 704 F.2d 1280, 1291 n.30 (D.C. Cir. 1983).

50 45 CFR § 164.524.

51 See, for example, Cal. Civ. Code § 1798.100(a); GDPR art. 15.

52 Jennifer J. Hennessy et al., HIPAA Right of Access Initiative: 2020 Year in Review, The National Law Review (December 11, 2020), www.natlawreview.com/article/hipaa-right-access-initiative-2020-year-review.

53 Carolyn T. Lye et al., Assessment of US Hospital Compliance with Regulations for Patients’ Requests for Medical Records, 1 JAMA Netw. Open e183014 (2018).

54 Centers for Medicare and Medicaid Services, Medicare and Medicaid Programs, Electronic Health Record Incentive Program Final Rule, 75 Fed. Reg. 44,314 (July 28, 2010).

55 HHS Office of the Nat’l Coordinator for Health Info. Tech. (ONC), HealthIT Quick Stat #61: National Trends in Hospital and Physician Adoption of Electronic Health Records, www.healthit.gov/data/quickstats/national-trends-hospital-and-physician-adoption-electronic-health-records. (“As of 2019, about three-quarters of office-based physicians (72%) and nearly all non-federal acute care hospitals (96%) had adopted a certified EHR.”)

56 Id.

57 21st Century Cures Act Final Rule, 85 Fed. Reg. 25,642 (May 1, 2020) (codified at 45 CFR pts. 170, 171).

58 See Karen E. Wain et al., The Value of Genomic Variant ClinVar Submissions from Clinical Providers: Beyond the Addition of Novel Variants, 39 Hum. Mutation 1660, 1661 (2018).

59 Protecting Personal Health Data Act, S. 24, 117th Cong. (2021); press release, Klobuchar, Murkowski Introduce Legislation to Protect Consumers’ Private Health Data (February 2, 2021), www.klobuchar.senate.gov/public/index.cfm/2021/2/klobuchar-murkowski-introduce-legislation-to-protect-consumers-private-health-data.

60 S. 24, supra note 59.

61 See Tex. Health & Safety Code Ann. § 181.001(b)(2)(A) (defining a “covered entity” under Texas law).

62 Jonathan Deitch, Protecting Unprotected Data in mHealth, 18 Nw. J. Tech. & Intell. Prop. 107 (2020); see also Cohen et al., supra note 3, at 1276.

63 HHS ONC, Conceptualizing a Data Infrastructure for the Capture, Use, and Sharing of Patient-Generated Health Data in Care Delivery and Research Through 2024, at 23 (January 2018), www.healthit.gov/sites/default/files/onc_pghd_final_white_paper.pdf.

64 US Food and Drug Admin., General Wellness: Policy for Low-Risk Devices 2 (September 26, 2019), www.fda.gov/media/90652/download.

65 See Dov Greenbaum, Avoiding Overregulation in the Medical Internet of Things, in Big Data, Health Law, and Bioethics 129, 138 (I. Glenn Cohen et al. eds., 2018).

3 Challenges of Remote Patient Care Technologies under the General Data Protection Regulation: Preliminary Results of the TeNDER Project

1 See generally TeNDER Health – TeNDER Project, www.tender-health.eu/. Disclaimer: This research has been funded by the European Commission under the Horizon 2020 mechanism – grant no. 875325 (TeNDER, affecTive basEd iNtegrateD carE for betteR Quality of Life).

2 Eur. Parliamentary Rsch. Serv., The Rise of Digital Health Technologies During the Pandemic (2021), www.europarl.europa.eu/RegData/etudes/BRIE/2021/690548/EPRS_BRI(2021)690548_EN.pdf.

3 Craig E. Kuziemsky et al., Ethics in Telehealth: Comparison between Guidelines and Practice-based Experience – The Case for Learning Health Systems, 29 Y.B. Med. Informatics 44 (2020).

4 Alexandra Queirós et al., Remote Care Technology: A Systematic Review of Reviews and Meta-Analyses, 6 Technologies 22 (2018).

5 Caregility Team, The Difference Between Remote Patient Monitoring and Telehealth, https://caregility.com/blog/the-difference-between-remote-patient-monitoring-and-telehealth/.

6 Eur. Parliamentary Rsch. Serv., supra note 2.

7 Queirós et al., supra note 4.

8 Ana Isabel Martins et al., Ambient Assisted Living: Introduction and Overview, in Usability, Accessibility and Ambient Assisted Living 1 (Alexandra Queirós & Nelson Pacheco da Rocha eds., 2018).

9 Benedict Stanberry, Telemedicine: Barriers and Opportunities in the 21st Century, 247 J. Internal Med. 615 (2000).

10 I. Glenn Cohen et al., Ethical and Legal Implications of Remote Monitoring of Medical Devices, 98 Milbank Q. 1257 (2020).

11 S. Stowe & S. Harding, Telecare, Telehealth and Telemedicine, 1 Eur. Geriatric Med. 193 (2010).

12 Regulation (EU) 2016/679 of the European Parliament and of the Council of April 27, 2016 on the Protection of Natural Persons with regard to the Processing of Personal Data and on the Free Movement of Such Data, and Repealing Directive 95/46/EC (GDPR) (text with EEA relevance), 2016 O.J. (L 119) 1, http://data.europa.eu/eli/reg/2016/679/oj/eng.

13 Article 29 Working Party, Eur. Comm’n, Working Document on the Processing of Personal Data Relating to Health in Electronic Health Records (EHR) (2007), https://ec.europa.eu/justice/article-29/documentation/opinion-recommendation/files/2007/wp131_en.pdf.

14 Article 29 Working Party, Eur. Comm’n, Opinion 05/2014 on Anonymisation Techniques (2014), https://ec.europa.eu/justice/article-29/documentation/opinion-recommendation/index_en.htm.

15 Case C-210/16, Unabhängiges Landeszentrum für Datenschutz Schleswig-Holstein v. Wirtschaftsakademie Schleswig-Holstein GmbH, interveners: Facebook Ireland Ltd, Vertreter des Bundesinteresses beim Bundesverwaltungsgericht, ECLI:EU:C:2018:388 (June 5, 2018).

16 Case C-40/17, Fashion ID GmbH & Co. KG v. Verbraucherzentrale NRW eV, interveners: Facebook Ireland Ltd, Landesbeauftragte für Datenschutz und Informationsfreiheit Nordrhein-Westfalen, ECLI:EU:C:2019:629 (July 29, 2019).

17 Eur. Data Prot. Bd., Guidelines 07/2020 on the Concepts of Controller and Processor in the GDPR (2020), https://edpb.europa.eu/our-work-tools/our-documents/guidelines/guidelines-072020-concepts-controller-and-processor-gdpr_en.

18 Ann Cavoukian, International Council on Global Privacy and Security, By Design, 35 IEEE Potentials 43 (2016).

19 TeNDER Health – How TeNDER Works, www.tender-health.eu/project/how-tender-works/.

20 TeNDER, D1.1, “First Version of Fundamental Rights, Ethical and Legal Implications and Assessment” (2020), www.tender-health.eu/project/here-you-can-find-a-selection-of-the-projects-public-deliverables-as-they-become-available/.

21 TeNDER, D1.4, “First version Legal/Ethical Monitoring and Review” (2021), www.tender-health.eu/project/here-you-can-find-a-selection-of-the-projects-public-deliverables-as-they-become-available/.

22 TeNDER, D1.6, “Final Version of Fundamental Rights, Ethical and Legal Implications and Assessment” (2023), www.tender-health.eu/project/here-you-can-find-a-selection-of-the-projects-public-deliverables-as-they-become-available/.

23 Danaja Fabcic Povse, Fragmented eHealth Regulation in the EU TeNDER (2022), www.tender-health.eu/fragmented-ehealth-regulation-in-the-eu/.

24 Eur. Data Protection Bd., Guidelines 05/2020 on Consent under Regulation 2016/679 version 1.1 (2020), https://edpb.europa.eu/sites/edpb/files/files/file1/edpb_guidelines_202005_consent_en.pdf.

25 World Med. Ass’n, WMA Declaration of Helsinki – Ethical Principles for Medical Research Involving Human Subjects (1964), www.wma.net/policies-post/wma-declaration-of-helsinki-ethical-principles-for-medical-research-involving-human-subjects/; Council of Eur., Recommendation No. R(99)4 of the Committee of Ministers to Member States on Principles Concerning the Legal Protection of Incapable Adults (1999), www.coe.int/t/dg3/healthbioethic/texts_and_documents/Rec(99)4E.pdf.

26 Alzheimer Eur., Understanding Dementia Research, www.alzheimer-europe.org/research/understanding-dementia-research.

27 Kathryn C. Montgomery et al., Ctr. for Digit. Democracy, Health Wearable Devices in the Big Data Era: Ensuring Privacy, Security, and Consumer Protection (2016), www.democraticmedia.org/sites/default/files/field/public/2016/aucdd_wearablesreport_final121516.pdf.

28 Id.; T. Mulder & M. Tudorica, Privacy Policies, Cross-Border Health Data and the GDPR, 28 Info. & Commc’n Tech. L. 261 (2019).

29 TeNDER, supra note 21.

30 Danielle Kosecki, 13 Fitbit Community Features You Can Customize for More (or Less!) Privacy, Fitbit News (2017), https://blog.fitbit.com/fitbit-privacy-settings/; Danielle Kosecki, Ask Fitbit: How Can I Keep My Stats Private?, Fitbit News (2017), https://blog.fitbit.com/go-incognito/.

31 Nor. Consumer Council, Consumer Protection in Fitness Wearables (2016), https://fil.forbrukerradet.no/wp-content/uploads/2016/11/2016-10-26-vedlegg-2-consumer-protection-in-fitness-wearables-forbrukerradet-final-version.pdf; Eur. Data Protection Bd., Guidelines 4/2019 on Article 25: Data Protection by Design and by Default version 2.0 (2020), https://edpb.europa.eu/sites/default/files/files/file1/edpb_guidelines_201904_dataprotection_by_design_and_by_default_v2.0_en.pdf.

32 Eur. Data Protection Bd., Guidelines 3/2019 on Processing of Personal Data Through Video Devices version 2.0 (2020), https://edpb.europa.eu/sites/default/files/files/file1/edpb_guidelines_201903_video_devices_en_0.pdf.

33 Id.

34 Article 29 Working Party, Eur. Comm’n, Opinion 06/2014 on the Notion of Legitimate Interests of the Data Controller Under Article 7 of Directive 95/46/EC (2014), https://ec.europa.eu/justice/article-29/documentation/opinion-recommendation/files/2014/wp217_en.pdf.

35 Studio Legale Stefanelli & Stefanelli, Artificial Intelligence, Medical Devices and GDPR in Healthcare: Everything You Need to Know About the Current Legal Frame, Lexology (2022), www.lexology.com/library/detail.aspx?g=8cba1347-0323-4951-b9b5-69015f6e169f.

36 Eur. Comm’n, Commission Recommendation of 6.2.2019 on a European Electronic Health Record Exchange Format, C(2019) 800 final.

37 Article 29 Working Party, supra note 13.

38 Id.

39 TeNDER, D5.3, First Report on the Health Record and Pathway Gathering (2021), www.tender-health.eu/project/here-you-can-find-a-selection-of-the-projects-public-deliverables-as-they-become-available/.

40 TeNDER, D1.6, “Final Version of Fundamental Rights, Ethical and Legal Implications and Assessment” (2023), www.tender-health.eu/project/here-you-can-find-a-selection-of-the-projects-public-deliverables-as-they-become-available/.

4 Renegotiating the Social Contract for Use of Health Information: Lessons Learned from Newborn Screening and Implications for At-Home Digital Care

1 Cue, What Is the Cue Health Monitoring System? (November 20, 2022), https://cuehealth.com/products/.

2 Lynn M. Etheredge, A Rapid-Learning Health System, 26 Health Affairs w107–18 (2007).

3 Francis S. Collins et al., A Vision for the Future of Genomics Research, 422 Nature 835–47 (2003).

4 Jeremy Sugarman, Ethics and Regulatory Challenges and Opportunities in Patient-Centered Comparative Effectiveness Research, 91 Acad. Med.: J. Ass’n American Med. Colls. 455–57 (2016).

5 David Altshuler, Mark J. Daly, & Eric S. Lander, Genetic Mapping in Human Disease, 322 Science 881–88 (2008).

6 Helen Swede, Carol L. Stone, & Alyssa R. Norwood, National Population-Based Biobanks for Genetic Research, 9 Genetics in Med. 141–49 (2007).

7 Daniel B. Thiel et al., Community Perspectives on Public Health Biobanking: An Analysis of Community Meetings on the Michigan BioTrust for Health, 5 J. Cmty. Genetics 125–38 (2014).

8 Id.

9 J.E. Platt et al., “Born in Michigan? You’re in the Biobank”: Engaging Population Biobank Participants through Facebook Advertisements, 16 Pub. Health Genomics 145–58 (2013).

10 Ann Mongoven et al., Negotiating Deliberative Ideals in Theory and Practice: A Case Study in “Hybrid Design,” 1 J. Deliberative Democracy 12 (2016).

11 Michigan State University Institute for Public Policy and Social Research, State of the State Survey 63 (Fall 2012), http://ippsr.msu.edu/soss/; Michigan State University Institute for Public Policy and Social Research, State of the State Survey 66 (Fall 2013), http://ippsr.msu.edu/soss/; Michigan State University Institute for Public Policy and Social Research, State of the State Survey 67 (Winter 2014), http://ippsr.msu.edu/soss/; Daniel B. Thiel et al., Testing an Online, Dynamic Consent Portal for Large Population Biobank Research, 18 Pub. Health Genomics 26–39 (2015).

12 Platt et al., supra note 9.

13 Id.; Thiel et al., supra note 11; Tevah Platt et al., Engaging a State: Facebook Comments on a Large Population Biobank, 8 J. Cmty. Genetics 183–97 (2017).

14 Jodyn Platt et al., Public Preferences Regarding Informed Consent Models for Participation in Population-Based Genomic Research, 16 Genetics in Med. 11–18 (2014).

15 Camille Nebeker, John Torous, & Rebecca J. Bartlett Ellis, Building the Case for Actionable Ethics in Digital Health Research Supported by Artificial Intelligence, 17 BMC Med. 1, 137 (2019).

16 Cue, Cue® Health Privacy Policy (November 20, 2022), https://cuehealth.com/about/data-and-privacy/us/privacy-policy/.

17 Jennifer Couzin-Frankel, Science Gold Mine, Ethical Minefield, 324 Science 166–68 (2009).

18 Kayte Spector-Bagdady et al., Encouraging Participation and Transparency in Biobank Research, 37 Health Affairs 1313–20 (2018).

19 Timothy Caulfield et al., A Review of the Key Issues Associated with the Commercialization of Biobanks, 1 J. Law Biosciences 94–110 (2014); Christine Critchley, Dianne Nicol, & Margaret Otlowski, The Impact of Commercialisation and Genetic Data Sharing Arrangements on Public Trust and the Intention to Participate in Biobank Research, 18 Pub. Health Genomics 160–72 (2015).

20 Ellen Matloff, Your Baby’s Newborn Screening Blood Sample Could Be Used To Convict You Of A Crime. It Just Happened In New Jersey, Forbes (November 21, 2022), www.forbes.com/sites/ellenmatloff/2022/09/22/your-babys-newborn-screening-blood-sample-could-be-used-to-convict-you-of-a-crime-it-just-happened-in-new-jersey/.

21 Daisuke Wakabayashi, Google and the University of Chicago Are Sued Over Data Sharing, The New York Times (June 26, 2019), www.nytimes.com/2019/06/26/technology/google-university-chicago-data-sharing-lawsuit.html.

22 Charles Ornstein & Katie Thomas, Sloan Kettering’s Cozy Deal with Start-Up Ignites a New Uproar, The New York Times (September 20, 2018), www.nytimes.com/2018/09/20/health/memorial-sloan-kettering-cancer-paige-ai.html.

23 Memorial Sloan Kettering Cancer Center, Memorial Sloan Kettering and Paige.AI (November 20, 2022), www.mskcc.org/news-releases/msk-and-paige-ai.

24 Amy Harmon, Indian Tribe Wins Fight to Limit Research of Its DNA, The New York Times (April 21, 2010), www.nytimes.com/2010/04/22/us/22dna.html.

25 Rex Dalton, When Two Tribes Go to War, 430 Nature 500–02 (2004).

26 Neil A. Holtzman & Michael S. Watson (eds.), Promoting Safe and Effective Genetic Testing in the United States, Task Force on Genetic Testing, National Institutes of Health–Department of Energy (1997), www.genome.gov/10001733/genetic-testing-report.

27 Daniel B. Thiel et al., Community Perspectives on Public Health Biobanking: An Analysis of Community Meetings on the Michigan BioTrust for Health, 5 J. Cmty. Genetics 125–38 (2014).

28 Richard Hughes IV, Spreeha Choudhury, & Alaap Shah, Newborn Screening Blood Spot Retention And Reuse: A Clash Of Public Health And Privacy Interests, Health Affairs Forefront (November 20, 2022), https://doi.org/10.1377/forefront.20221004.177058.

29 Emily Ramshaw, DNA Deception, The Texas Tribune (February 22, 2010), www.texastribune.org/2010/02/22/dshs-turned-over-hundreds-of-dna-samples-to-feds/.

30 Chandra L. Ford & Collins O. Airhihenbuwa, The Public Health Critical Race Methodology: Praxis for Antiracism Research, 71 Social Science & Med. 1390–98 (2010).

Table 3.1 Essential data protection requirements for RCTs: Preliminary results of TeNDER
