12 - On the Morality of Artificial Agents
from PART III - ISSUES CONCERNING MACHINE ETHICS
Published online by Cambridge University Press: 01 June 2011
Summary
Introduction: Standard versus Nonstandard Theories of Agents and Patients
Moral situations commonly involve agents and patients. Let us define the class A of moral agents as the class of all entities that can in principle qualify as sources or senders of moral action, and the class P of moral patients as the class of all entities that can in principle qualify as receivers of moral action. A particularly apt way to introduce the topic of this paper is to consider how ethical theories (macroethics) interpret the logical relation between these two classes. There are five possible logical relations between A and P; see Figure 12.1.
It is possible, but utterly unrealistic, that A and P are disjoint (alternative 5). On the other hand, P can be a proper subset of A (alternative 3), or A and P can partially intersect (alternative 4). These two alternatives are only slightly more promising, because both require at least one moral agent that in principle could not qualify as a moral patient. Such a pure agent would be some sort of supernatural entity that, like Aristotle's God, affects the world but can never be affected by it. Yet, since such an entity would be in principle “unaffectable” and hence irrelevant in the moral game, it is unclear what role it could exercise with respect to the normative guidance of human actions.
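For reference, and as a sketch of what Figure 12.1 presumably depicts, the five possible logical relations between A and P can be written out in set notation. Only alternatives (3), (4), and (5) are named in the text above; which of the two remaining relations carries the label (1) and which the label (2) is an assumption here, since the figure itself is not reproduced.

```latex
% Sketch of the five logical relations between the class A of moral agents
% and the class P of moral patients (cf. Figure 12.1). Requires amsmath.
% Alternatives (3)-(5) follow the text; the numbering of (1) and (2) is assumed.
\begin{align*}
&\text{(1)}\; A = P            && \text{agents and patients coincide (assumed label)}\\
&\text{(2)}\; A \subset P      && \text{every agent is also a patient, but not vice versa (assumed label)}\\
&\text{(3)}\; P \subset A      && \text{every patient is also an agent, but not vice versa}\\
&\text{(4)}\; A \cap P \neq \emptyset,\ A \not\subseteq P,\ P \not\subseteq A && \text{the two classes partially overlap}\\
&\text{(5)}\; A \cap P = \emptyset && \text{the two classes are disjoint}
\end{align*}
```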
- Machine Ethics, pp. 184–212. Publisher: Cambridge University Press. Print publication year: 2011.