
Eleven - Should AI Be Regulated? Co-authored with Oren Etzioni

Published online by Cambridge University Press:  21 April 2023

Amitai Etzioni
Affiliation:
George Washington University, Washington DC

Summary

Policy makers and academics are raising more and more questions about the ways the legal and moral order can accommodate a large and growing number of machines, robots, and instruments equipped with artificial intelligence (AI)—hereinafter referred to as “smart instruments.” Many of these questions spring from the fact that smart instruments, such as driverless cars, have a measure of autonomy; they make many decisions on their own, well beyond the guidelines their programmers provided. Moreover, these smart instruments make decisions in very opaque ways, and they are learning instruments with guidance systems that change as they carry out their missions.

For example, a California policeman issued a warning to the passenger of a Google self-driving car because the car impeded traffic by traveling too slowly. But whom should the policeman have cited? The passenger? The owner? The programmer? The car's computer? Similarly, Google faced charges that its search engine discriminated against women by showing ads for well-paying jobs to men more frequently than to women, and that it favored its own shops in search results. The inability of mere mortals to trace how such biases come about illustrates the challenges that smart machines pose to the legal and moral order. The same questions apply to findings that ads for websites providing arrest records were "significantly more likely to show up on searches for distinctively black names or a historically black fraternity." Was there intent? Who or what should be held liable for the resulting harm? How can the government deter repeat offenses by the same instruments? This chapter provides a preliminary response to these and several related questions, both in cases of limited harm (e.g., a program that causes a driverless car to crash into another) and with regard to greater potential harm (e.g., the fear that smart instruments may rebel against their makers and harm mankind).

This chapter focuses on the relationship between AI and the legal order; the relationship between AI and the moral order requires a separate analysis. Although both the legal and moral orders reflect the values of one and the same society, the chapter treats them separately because they choose and enforce values in different ways. In the legal realm, long-established institutions such as the legislature and courts sort out which values to enforce, but there are no such authoritative institutions in the social and moral realm.

Type: Chapter
In: Law and Society in a Populist Age: Balancing Individual Rights and the Common Good, pp. 171-180
Publisher: Bristol University Press
Print publication year: 2018
