Diverse and increasingly comprehensive data about our personal lives are collected. When these personal data are linked to health records or to other data collected in our environment, such as data held by state administrations or financial systems, they have huge potential for public health research and for society in general. Precision medicine, including pharmacogenomics, depends particularly on the potential of data linkage. With new capacities to analyze linked data, researchers today can retrieve and assess valuable and clinically relevant information. One way to develop such linked data sets and make them available for research is through health data cooperatives. An example of such a cooperative is MIDATA – a health data cooperative recently established in Switzerland and the main focus of this chapter. In response to concerns about the present health data economy, MIDATA was founded to provide a governance structure for data storage that supports individuals’ digital self-determination by allowing MIDATA members to control their own personal data flows and to store such data in a secure environment.
The rise of social media has raised questions about the vitality of privacy values and concerns about threats to privacy. The convergence of politics with social media use amplifies the privacy concerns traditionally associated with political organizing, particularly when marginalized groups and minority politics are involved. Despite the importance of these issues, there has been little empirical exploration of how privacy governs political activism and organizing in online environments. This chapter explores how privacy concerns shape political organizing on Facebook through detailed case studies of how groups associated with March for Science, Day Without Immigrants (“DWI”), and Women’s March govern information flows. These cases address distinct issues while operating in similar contexts and on the same timescales, allowing for the exploration of privacy in the governance of personal information flows in political organizing and Facebook sub-communities. Privacy practices and concerns differed between the cases, depending on factors such as the nature of the group, the political issues it confronts, and its relationships to other organizations or movements.
Privacy has traditionally been conceptualized in an individualistic framing, often as a private good that is traded off against other goods. This chapter views the process of privacy enforcement through the lens of governance and situated design of sociotechnical systems. It considers the challenges in formulating and designing privacy as commons (as per the Governing Knowledge Commons framework) when privacy ultimately gets enacted (or not) in complex sociotechnical systems. It identifies six distinct research directions pertinent to the governance and formulation of privacy norms, spanning an examination of how tools of design could be used to develop design strategies and approaches to formulate, design, and sustain a privacy commons, and how specific technical formulations and approaches to privacy can serve the governance of such a privacy commons.
The Internet of Everything takes the notion of IoT a step further by including not only the physical infrastructure of smart devices, but also its impacts on people, business, and society. Our world is getting more connected, if not smarter, but to date governance regimes have struggled to keep pace with this rate of innovation. It remains an open question whether security and privacy protections can or will scale within this dynamic and complex global digital ecosystem, and whether law and policy can keep up with these developments. The natural question, then, is whether our approach to governing the Internet of Everything is, well, smart. This chapter explores what lessons the Institutional Analysis and Development (IAD) and Governing Knowledge Commons (GKC) frameworks hold for promoting security and privacy in an Internet of Everything, with special treatment of the promise and peril of blockchain technology for building trust in such a massively distributed network. Particular attention is paid to governance gaps in this evolving ecosystem, and to the state, federal, and international policies needed to better address security and privacy failings.
The rules and norms that shape how institutional researchers and other data practitioners handle student data privacy within higher education could be studied using descriptive methods, which attempt to illustrate what is actually being done in this space. We argue, however, that it is also important for practitioners to become reflexive about their practice while they are in the midst of using sensitive data, in order to make responsive practical and ethical modulations. To achieve this, we conducted a STIR, or socio-technical integration research. In the data – the STIR of a single institutional researcher – we see evidence of changes in information flow, reactions to those changes, and new ways of thinking and doing that reestablish privacy-protecting rules-in-use.
Personal information is inherently about someone, is often shared unintentionally or involuntarily, flows via commercial communication infrastructure, and can be instrumental and often essential to building trust among members of a community. As a result, privacy commons governance may be ineffective, illegitimate, or both if it does not appropriately account for the interests of information subjects or if infrastructure is owned and designed by actors whose interests may be misaligned or in conflict with the interests of information subjects. Additional newly emerging themes include the importance of trust; the contestability of commons governance legitimacy; and the co-emergence of contributor communities and knowledge resources. The contributions in this volume also confirm and deepen insights into recurring themes identified in previous GKC studies, while the distinctive characteristics of personal information add nuance and uncover limitations. The studies in this volume move us significantly forward in our understanding of knowledge commons, while opening up important new directions for future research and policy development, as discussed in this concluding chapter.
This introduction to Governing Privacy in Knowledge Commons discusses how meta-analysis of past case studies has yielded additional questions to supplement the GKC framework, based on the specific governance challenges around personal information. Building on this renewed understanding, a series of new case studies is organized around the different roles that personal information plays in commons arrangements. The knowledge commons perspective highlights the interdependence between knowledge flows aimed at creative production and personal information flows. Madelyn will discuss how those who systematically study knowledge commons governance with an eye toward knowledge production routinely encounter privacy concerns and values, along with rules-in-use that govern appropriate personal information flow.
Drawing upon the GKC framework, this chapter presents an ethnographic study of Woebot – a therapy chatbot designed to administer a form of cognitive behavioral therapy (“CBT”). Section 3.1 explains the methodology of this case study. Section 3.2 describes the background contexts that relate to anxiety as a public health problem. These include the nature of anxiety and historical approaches to diagnosing and treating it, the ascendency of e-Mental Health therapy provided through apps, and relevant laws and regulations. Section 3.3 describes how Woebot was developed and what goals its designers pursued. Section 3.4 describes the kinds of information that users share with Woebot. Section 3.5 describes how the designers of the system seek to manage this information in a way that benefits users without disrupting their privacy.
This chapter begins where Simmel and many other social and legal scholars left off. In contrast to many traditional theories of privacy, we argue, as one of us has argued before, that privacy rules and norms are essential to social interaction and generativity. Through primary source research, we suggest that the rules and norms governing information privacy in three knowledge creation contexts – Chatham House, Gordon Research Conferences (“GRC”), and the Broadband Internet Technical Advisory Group (“BITAG”) – are necessary to develop the kind of trust that is essential for sharing ideas, secrets, and other information. More specifically, when it is part of institutional structures governing knowledge commons, privacy fosters knowledge through a systematic social process. Privacy rules have expressive effects that embed confidentiality norms in the background of institutional participation, which in turn create a sense of community among participants that can both bring in new members and threaten sanctions for misbehavior. Knowledge production, therefore, depends on privacy.
Conceptualizing privacy as information flow rules-in-use constructed within a commons governance arrangement, we adapt the Governing Knowledge Commons (GKC) framework to study the formal and informal governance of information flows. We incorporate Helen Nissenbaum's “privacy as contextual integrity” approach, defining privacy in terms of contextually appropriate flows of personal information. While Nissenbaum's framework treats contextual norms as largely exogenous and emphasizes their normative valence, the GKC framework provides a systematic method to excavate personal information rules-in-use that actually apply in specific situations and interrogate governance mechanisms that shape rules-in-use. After discussing how the GKC framework can enrich privacy research, we explore empirical evidence for contextual integrity as governance within the GKC framework through meta-analysis of previous knowledge commons case studies, revealing three governance patterns within the observed rules-in-use for personal information flow. Our theoretical analysis provides strong justification for a new research agenda using the GKC framework to explore privacy as governance.
The knowledge commons framework, deployed here in a review of the early network of scientific communication known as the Republic of Letters, combines a historical sensibility regarding the character of scientific research and communications with a modern approach to analyzing institutions for knowledge governance. Distinctions and intersections between public purposes and privacy interests are highlighted. Lessons from revisiting the Republic of Letters as knowledge commons may be useful in advancing contemporary discussions of Open Science.