Publisher: Cambridge University Press
Online publication date: September 2025
Print publication year: 2025
Online ISBN: 9781009627351
Creative Commons: This content is Open Access and distributed under the terms of the Creative Commons Attribution-NonCommercial licence (CC BY-NC 4.0), https://creativecommons.org/licenses/by-nc/4.0/.

Book description

As managers digitize judgment using AI, their evaluations of persons risk imposing benefits and burdens in opaque and unaccountable ways. A wide range of harms may occur when access to one’s personal data (and meaningful information about its use) is denied. Key data access rights and AI explainability guarantees in US and EU law are designed to ameliorate the harms caused by irresponsible digitization, but their definition and range of application are contested. A robust policy evaluation framework will be needed to inform the proper level and scope of information access, as regulators clarify the contours of such rights and guarantees. By revealing the stakes of data access, this Element offers a useful evaluative framework for those interpreting and applying laws of data protection and AI explainability. This title is also available as Open Access on Cambridge Core.

Accessibility standard: Missing or limited accessibility features

Why this information is here

This section outlines the accessibility features of this content, including support for screen readers, full keyboard navigation, and high-contrast display options. It may not be relevant to every reader.

Accessibility Information

The PDF of this book is known to have missing or limited accessibility features. Its accessibility may be reviewed for future improvement, but full compliance is not yet assured and may be subject to legal exceptions. If you have any questions, please contact accessibility@cambridge.org.

Content Navigation
Table of contents navigation

Allows you to navigate directly to chapters, sections, or non-text items through a linked table of contents, reducing the need for extensive scrolling.

Reading Order and Textual Equivalents
Single logical reading order

You will encounter all content (including footnotes, captions, etc.) in a clear, sequential flow, making it easier to follow with assistive tools like screen readers.

Short alternative textual descriptions

You get concise descriptions (for images, charts, or media clips), ensuring you do not miss crucial information when visual or audio elements are not accessible.

Visual Accessibility
Use of colour is not sole means of conveying information

You will still understand key ideas or prompts without relying solely on colour, which is especially helpful if you have colour vision deficiencies.

Structural and Technical Features
ARIA roles provided

You gain clarity from ARIA (Accessible Rich Internet Applications) roles and attributes, as they help assistive technologies interpret how each part of the content functions.