Eleven - Should AI be regulated? Co-authored with Oren Etzioni
Published online by Cambridge University Press: 21 April 2023
Summary
Policy makers and academics are raising a growing number of questions about how the legal and moral order can accommodate a large and growing number of machines, robots, and instruments equipped with artificial intelligence (AI)—hereinafter referred to as “smart instruments.” Many of these questions spring from the fact that smart instruments, such as driverless cars, have a measure of autonomy; they make many decisions on their own, well beyond the guidelines their programmers provided. Moreover, smart instruments reach their decisions in opaque ways, and because they are learning instruments, their guidance systems change as they carry out their missions.
For example, a California policeman issued a warning to the passenger of a Google self-driving car because the car impeded traffic by traveling too slowly. But whom should the policeman have cited? The passenger? The owner? The programmer? The car’s computer? Similarly, Google faced charges that its search engine discriminated against women by showing ads for well-paying jobs to men more frequently than to women, and that it favored its own shops in search results. The inability of mere mortals to trace how such biases come about illustrates the challenges that smart machines pose to the legal and moral order. The same questions apply to findings that ads for websites providing arrest records were “significantly more likely to show up on searches for distinctively black names or a historically black fraternity.” Was there intent? Who or what should be held liable for the resulting harm? How can the government deter repeat offenses by the same instruments? This chapter provides a preliminary response to these and several related questions, both in cases of limited harm (e.g., a program that causes a driverless car to crash into another) and with regard to greater potential harm (e.g., the fear that smart instruments may rebel against their makers and harm mankind).
This chapter focuses on the relationship between AI and the legal order; the relationship between AI and the moral order requires a separate analysis. Although both the legal and moral orders reflect the values of one and the same society, the chapter treats them separately because they choose and enforce values in different ways. In the legal realm, long-established institutions such as the legislature and the courts sort out which values to enforce; no such authoritative institutions exist in the social and moral realm.
- Type: Chapter
- Book: Law and Society in a Populist Age: Balancing Individual Rights and the Common Good, pp. 171–180
- Publisher: Bristol University Press
- Print publication year: 2018