
Race and Rights in the Digital Age

  • Catherine Powell
Extract

This essay discusses how, despite the liberatory potential of technology, racial bias pervades the digital space. This bias creates tension with both the formal, de jure equality notion of “colorblindness” (in U.S. constitutional law) and the broader, substantive, de facto equality idea (in international human rights law). The essay draws on the work of Osagie Obasogie to show that blind people perceive race in the same way sighted people do, despite not being able to see it. It then uses blindness as a metaphor to explore how race is both seen and not seen online, and analyzes the implications for human rights.

Copyright
This is an Open Access article, distributed under the terms of the Creative Commons Attribution licence (http://creativecommons.org/licenses/by/4.0/), which permits unrestricted re-use, distribution, and reproduction in any medium, provided the original work is properly cited.
Footnotes
* I would like to thank the participants in the International Law and Race Workshop at the University of Colorado Law School, August 3-4, 2018, as well as Danielle Citron, Clare Huntington, Margot Kaminski, Craig Konnoth, Olivier Sylvain, and Ari Ezra Waldman for valuable input. I am also grateful to my superb research assistants, Mary Katherine Cunningham, Christine Gabrellian, Aryian Kohandel-Shirazi, and Mindy Nam.

References

1 While laudable in theory, in practice the U.S. colorblindness doctrine has served as cover for the law to ignore the ongoing effects of America's long legacy of slavery and Jim Crow. Fortunately, some U.S. civil rights statutes go beyond this narrow formal approach and adopt the broader substantive approach found in international human rights law.

3 See Universal Declaration of Human Rights art. 1, G.A. Res. 217A (III) (Dec. 10, 1948) [hereinafter UDHR].

5 This essay focuses on the “narrow” AI that currently exists, which uses machine learning algorithms that harvest personal data to detect patterns and make predictions about individuals based on those patterns. Because past data may reflect underlying discriminatory patterns (such as in past policing practices), these systems import preexisting bias into their predictions about the future. See, e.g., Cathy O'Neil, Weapons of Math Destruction: How Big Data Increases Inequality and Threatens Democracy (2016).
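
To make the feedback-loop mechanism described in note 5 concrete, the following is a minimal, hypothetical Python sketch; the neighborhood names and arrest counts are invented for illustration and are not drawn from the essay or from O'Neil's book.

    # A toy "predictive policing" model, using invented data:
    # over-policed neighborhoods generate more recorded arrests,
    # regardless of underlying offense rates.
    from collections import Counter

    past_arrests = ["Northside"] * 80 + ["Southside"] * 20

    # Naive rule: allocate patrols in proportion to past arrests.
    counts = Counter(past_arrests)
    total = sum(counts.values())
    patrols = {hood: n / total for hood, n in counts.items()}

    print(patrols)  # {'Northside': 0.8, 'Southside': 0.2}
    # Extra patrols in Northside yield more arrests there, which feed
    # the next round of training data: a self-reinforcing loop in which
    # yesterday's enforcement patterns become tomorrow's "predictions."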

6 See, e.g., Tim Jordan, Cyberpower: The Culture and Politics of Cyberspace and the Internet 66 (1999) (discussing the belief “that cyberspace will be liberatory because ‘race’ will be absent there”); Lisa Nakamura, Cybertypes: Race, Ethnicity and Identity on the Internet (2002) (critiquing the notion of cyberspace as a utopian web of fluid identities and unlimited possibilities).

7 On the positive side, new technologies can help build racial (and transracial) solidarity, offer the freedom to construct and perform online identities across various intersections, and foster online communities.

8 The scholarship concerning questions of accountability and fairness in automated decision-making is vast. For a helpful review, see, e.g., Margot Kaminski, Binary Governance: A Two-Part Approach to Accountable Algorithms 2, n.4 (draft on file with author).

9 See, e.g., Safiya Umoja Noble, Algorithms of Oppression: How Search Engines Reinforce Racism (2018); Pasquale, supra note 4; Solon Barocas & Andrew Selbst, Big Data's Disparate Impact, 104 Calif. L. Rev. 671 (2016); Anupam Chander, The Racist Algorithm, 115 Mich. L. Rev. 1023 (2017).

10 Catherine Powell, Gender Indicators as Global Governance: Not Your Father's World Bank, 17 Geo. J. Gender & L. 777 (2016).

11 UDHR, supra note 3, art. 2 (emphasis added). See also id. art. 1 (providing “[a]ll human beings are born free and equal in dignity and rights”).

12 See, e.g., Saranya Vijayakumar, Algorithmic Decision-Making, Harv. Pol. Rev. (June 28, 2017).

13 This phrase began as a caption to a New Yorker cartoon in 1993. Peter Steiner, On the Internet, Nobody Knows You're a Dog, New Yorker (July 5, 1993). See also Glenn Fleishman, Cartoon Captures Spirit of the Internet, N.Y. Times (Dec. 14, 2000) (explaining the creation and the cultural relevance of the cartoon after its publication).

14 See, e.g., Elise Boddie, Racial Territoriality, 58 UCLA L. Rev. 401 (2010).

15 See, e.g., Danielle Keats Citron, Hate Crimes in Cyberspace (2014); Mary Anne Franks, Unwilling Avatars: Idealism and Discrimination in Cyberspace, 20 Colum. J. Gender & L. 224 (2011).

16 Khiara Bridges, The Poverty of Privacy Rights (2018); Danielle Keats Citron, A Poor Mother's Right to Privacy: A Review, 98 B.U. L. Rev. 1, 4 (2018) (noting that “the constitutional right to information privacy should be understood to limit the state's collection of personal data from poor mothers”).

18 Cf. Anita Allen, Unpopular Privacy: What Must We Hide? (2011) (discussing a range of new privacy concerns, including data and racial privacy).

19 Shoshana Zuboff, The Age of Surveillance Capitalism: The Fight for a Human Future at the New Frontier of Power (2018).

20 Nicholas Confessore, The Unlikely Activists Who Took On Silicon Valley — and Won, N.Y. Times (Aug. 14, 2018) (discussing activism before and after the Cambridge Analytica revelations); Bruce Sterling, Shoshanna Zuboff Condemning Google “Surveillance Capitalism”, Wired (Mar. 8, 2016).

21 Shoshana Zuboff, Big Other: Surveillance Capitalism and the Prospects of an Information Civilization, 30 J. Info. Tech. 75, 81 (2015) (describing “new market form that is a radically disembedded and extractive variant of information capitalism, one that can be identified as surveillance capitalism”).

22 Citizen Ex, Algorithmic Citizenship (proposing the notion of algorithmic citizenship, based on how one appears on the internet—as a collection of data, extending across several countries—as “a new form of citizenship, one where your citizenship, and therefore both your allegiances and your rights, are constantly being questioned, calculated and rewritten”).

23 Atossa Araxia Abrahamian, Data Subjects of the World, Unite!, N.Y. Times (May 28, 2018) (noting that the EU General Data Protection Regulation imbues us as data subjects with new rights).

24 A Virtual Private Network can be used “to send your data via another country, from where you can watch … videos or download all the files via another country.” Citizen Ex, supra note 22.

25 Obasogie, supra note 2.

26 Id. at 87. See also Solangel Maldonado, Racial Hierarchy and Desire: How Law’s Influence on Interracial Intimacies Perpetuates Inequality (manuscript on file with author) (discussing segregation in dating markets on and offline).

27 Obasogie, supra note 2, at 81–83.

28 Id. at 83.

29 Id. at 84.

30 Id. at 81 (emphasis in original).

31 See also Harvard Implicit Bias Project. Consider also the spate of stories on driving, napping, visiting Starbucks, and living while Black. See, e.g., Desire Thompson, Oblivious Person Calls 911 on “Suspicious” Black Cop & Other #LivingWhileBlack Stories, Vibe (Apr. 24, 2018).

32 See, e.g., Kevin Anthony Hoff & Masooda Bashir, Trust in Automation: Integrating Empirical Evidence on Factors That Influence Trust, 57 Hum. Factors 407 (2015).

34 Julia Angwin & Terry Parris, Jr., Facebook Lets Advertisers Exclude Users by Race, ProPublica (Oct. 28, 2016). In addition to the data collected on the Facebook platform itself, the tech giant also collects information from other websites (those with Facebook sharing buttons) as well as from Instagram and WhatsApp accounts (both of which Facebook owns). Julia Angwin et al., What Facebook Knows About You, ProPublica (Sept. 18, 2016).

36 Brakkton Booker, HUD Hits Facebook for Allowing Housing Discrimination, NPR (Aug. 19, 2018) (describing how “Facebook permitted advertisers to discriminate based on disability by blocking ads to users the company categorized as having interests in ‘mobility scooter’ or ‘deaf culture’”). See also Olivier Sylvain, Intermediary Design Duties, 50 Conn. L. Rev. 1 (2018) (discussing liabilities for intermediary platforms, such as Facebook, under the Communications Decency Act).

37 Jeremy Quittner, Airbnb and Discrimination: Why It's All So Confusing, Fortune (June 23, 2016).

38 Id.

39 Id.

40 Latanya Sweeney, Racism is Poisoning Online Ad Delivery, Says Harvard Professor, MIT Tech. Rev. (Feb. 4, 2013). See also Nancy Leong & Aaron Belzer, The New Public Accommodations: Race Discrimination in the Platform Economy, 105 Geo. L.J. 1271, 1293–95 (2017) (discussing how the guest-rating systems on platforms such as Airbnb and Uber entrench discrimination).

41 In a separate project, I am exploring linkages between privacy-based and equality-based responses, applying Kenji Yoshino's insights on this to the online context. Kenji Yoshino, The New Equal Protection, 124 Harv. L. Rev. 747, 748-50 (2011) (noting that “constitutional equality and liberty claims are often intertwined” yet liberty themes have found broader acceptance in the Supreme Court).
