
Robots and Respect: Assessing the Case Against Autonomous Weapon Systems

Abstract

There is increasing speculation within military and policy circles that the future of armed conflict is likely to include extensive deployment of robots designed to identify targets and destroy them without the direct oversight of a human operator. My aim in this paper is twofold. First, I will argue that the ethical case for allowing autonomous targeting, at least in specific restricted domains, is stronger than critics have typically acknowledged. Second, I will attempt to defend the intuition that, even so, there is something ethically problematic about such targeting. I argue that an account of the nonconsequentialist foundations of the principle of distinction suggests that the use of autonomous weapon systems (AWS) is unethical by virtue of failing to show appropriate respect for the humanity of our enemies. However, the success of the strongest form of this argument depends upon understanding the robot itself as doing the killing. To the extent that we believe that, on the contrary, AWS are only a means used by combatants to kill, the idea that the use of AWS fails to respect the humanity of our enemy will turn upon an account of what is required by respect, which is essentially conventional. Thus, while the theoretical foundations of the idea that AWS are weapons that are “evil in themselves” are weaker than critics have sometimes maintained, they are nonetheless sufficient to demand a prohibition of the development and deployment of such weapons.


NOTES

1 Kenneth Anderson and Matthew C. Waxman, “Law and Ethics for Robot Soldiers,” Policy Review 176 (2012); Ronald C. Arkin, Governing Lethal Behavior in Autonomous Robots (Boca Raton, Fla.: CRC Press, 2009); Gary E. Marchant et al., “International Governance of Autonomous Military Robots,” Columbia Science and Technology Law Review 12 (2011); Department of Defense, Unmanned Systems Integrated Roadmap: FY2011–2036 (Washington, D.C.: Department of Defense, 2012); Michael N. Schmitt and Jeffrey S. Thurnher, “‘Out of the Loop’: Autonomous Weapon Systems and the Law of Armed Conflict,” Harvard National Security Journal 4, no. 2 (2013); Peter W. Singer, Wired for War: The Robotics Revolution and Conflict in the 21st Century (New York: Penguin Press, 2009); Robert O. Work and Shawn Brimley, 20YY: Preparing for War in the Robotic Age (Center for a New American Security, 2014). Elsewhere in the literature, AWS are sometimes referred to as Lethal Autonomous Robots (LARs).

2 Human Rights Watch, Losing Humanity: The Case against Killer Robots (2012), www.hrw.org/reports/2012/11/19/losing-humanity-0.

3 Campaign to Stop Killer Robots website, “Campaign to Stop Killer Robots: About Us,” www.stopkillerrobots.org/about-us/.

4 Charli Carpenter, “Beware the Killer Robots: Inside the Debate over Autonomous Weapons,” Foreign Affairs online, July 3, 2013, www.foreignaffairs.com/articles/139554/charli-carpenter/beware-the-killer-robots#.

5 I will leave the task of determining the legality of AWS under international humanitarian law to those better qualified to address it. However, insofar as the just war theory doctrine of jus in bello is developed and expressed in both legal and philosophical texts, I will occasionally refer to the relevant legal standards in the course of my argument, which concerns the ethics of autonomous targeting.

6 For the argument that this problem also impacts on the science and engineering of AWS, see the U.S. Department of Defense, Defense Science Board, The Role of Autonomy in DoD Systems (Washington, D.C.: Office of the Under Secretary of Defense for Acquisition, Technology and Logistics, 2012), pp. 23–24.

7 Robert Sparrow, “Killer Robots,” Journal of Applied Philosophy 24, no. 1 (2007).

8 Human Rights Watch, Losing Humanity, pp. 6–20; Schmitt and Thurnher, “Out of the Loop”; Armin Krishnan, Killer Robots: Legality and Ethicality of Autonomous Weapons (Farnham, U.K.: Ashgate Publishing, 2009), pp. 43–45.

9 Heather M. Roff, “Killing in War: Responsibility, Liability, and Lethal Autonomous Robots,” in Fritz Allhoff, Nicholas G. Evans, and Adam Henschke, eds., Routledge Handbook of Ethics and War: Just War Theory in the Twenty-First Century (Milton Park, Oxon: Routledge, 2013); Sparrow, “Killer Robots.”

10 I have discussed this question at more length in Sparrow, “Killer Robots.”

11 U.S. Department of Defense, “DoD Directive 3000.09: Autonomy in Weapon Systems,” Washington, D.C., November 21, 2012.

12 The Phalanx Close-In Weapon System is fitted to U.S. (and U.S. ally) ships to defend them from missiles and aircraft. It uses a sophisticated radar to identify, track, and target incoming threats and a high-speed Gatling gun to attempt to destroy them. While the system allows for a manual override, it is designed to engage threats automatically due to the high speed at which engagements must occur.

13 Daniel C. Dennett, The Intentional Stance (Cambridge, Mass.: MIT Press, 1987).

14 Andreas Matthias, “The Responsibility Gap: Ascribing Responsibility for the Actions of Learning Automata,” Ethics and Information Technology 6, no. 3 (2004).

15 Anderson and Waxman, “Law and Ethics for Robot Soldiers”; Arkin, Governing Lethal Behavior in Autonomous Robots; Marchant et al., “International Governance of Autonomous Military Robots”; Work and Brimley, 20YY. The United States Department of Defense clarified its own policy in relation to the development and use of AWS in “DoD Directive 3000.09: Autonomy in Weapon Systems” (2012), which some (see, for instance, Spencer Ackerman, “Pentagon: A Human Will Always Decide When a Robot Kills You,” Wired, November 26, 2012) have read as prohibiting the use of AWS armed with lethal weapons against human targets. However, see Mark Gubrud, “US Killer Robot Policy: Full Speed Ahead,” Bulletin of the Atomic Scientists, September 20, 2013.

16 Kenneth Anderson and Matthew C. Waxman, “Law and Ethics for Autonomous Weapon Systems: Why a Ban Won't Work and How the Laws of War Can,” Hoover Institution, Jean Perkins Task Force on National Security and Law Essay Series (2013), p. 7; Schmitt and Thurnher, “Out of the Loop,” p. 238.

17 Thomas K. Adams, “Future Warfare and the Decline of Human Decisionmaking,” Parameters 31, no. 4 (2001/2002).

18 Schmitt and Thurnher, “Out of the Loop.” The costs of developing and fielding AWS are another matter entirely. Complex information technology systems are notoriously prone to running over budget and delivering less than was promised. Developing the computer software required for autonomy and debugging it effectively may be very expensive indeed.

19 Mark Gubrud, “Stopping Killer Robots,” Bulletin of the Atomic Scientists 70, no. 1 (2014); Human Rights Watch, Losing Humanity; International Committee for Robot Arms Control website, Mission Statement, icrac.net/statements/; Robert Sparrow, “Predators or Plowshares? Arms Control of Robotic Weapons,” IEEE Technology and Society 28, no. 1 (2009).

20 Ronald C. Arkin, “On the Ethical Quandaries of a Practicing Roboticist: A First-Hand Look,” in Adam Briggle, Katinka Waelbers, and Philip Brey, eds., Current Issues in Computing and Philosophy (Amsterdam: IOS Press, 2008); Arkin, Governing Lethal Behavior in Autonomous Robots.

21 Ronald C. Arkin, “The Case for Ethical Autonomy in Unmanned Systems,” Journal of Military Ethics 9, no. 4 (2010). See also Marchant et al., “International Governance of Autonomous Military Robots,” pp. 279–81; Department of Defense, Unmanned Systems Integrated Roadmap: FY2011–2036, pp. 43–51.

22 For a critical evaluation of these claims, see Ryan Tonkens, “The Case against Robotic Warfare: A Response to Arkin,” Journal of Military Ethics 11, no. 2 (2012).

23 Human Rights Watch, Losing Humanity; Noel E. Sharkey, “Autonomous Robots and the Automation of Warfare,” International Humanitarian Law Magazine, no. 2 (2012); Noel E. Sharkey, “The Evitability of Autonomous Robot Warfare,” International Review of the Red Cross 94, no. 886 (2012).

24 Light Detection and Ranging (LIDAR).

25 For a useful discussion of the ways in which applying the principle of distinction requires assessment of intention and of just how hard this problem is likely to be for a machine, see Marcello Guarini and Paul Bello, “Robotic Warfare: Some Challenges in Moving from Noncivilian to Civilian Theaters,” in Patrick Lin, Keith Abney, and George A. Bekey, eds., Robot Ethics: The Ethical and Social Implications of Robotics (Cambridge, Mass.: MIT Press, 2012). Again, as Guarini and Bello concede and I will discuss further below, in some—very restricted—circumstances it may be reasonable to treat every person carrying a weapon and every weaponized system, within a narrowly defined geographical area, as a combatant.

26 Protocol Additional to the Geneva Conventions of 12 August 1949, and Relating to the Protection of Victims of International Armed Conflicts (Protocol I), adopted at Geneva on June 8, 1977, www.icrc.org/Ihl.nsf/INTRO/470?OpenDocument.

27 Geoffrey S. Corn et al., The Law of Armed Conflict: An Operational Approach (New York: Wolters Kluwer Law & Business, 2012), pp. 115–17.

28 Ibid., pp. 165–66; Additional Protocol I (see note 26).

29 In the course of writing this article, I was fortunate enough to read a draft of a manuscript by Heather Roff addressing the prospects for autonomous weapons meeting the requirements of the principle of distinction. My discussion here has undoubtedly been influenced by her insightful treatment of the topic. Schmitt and Thurnher, “Out of the Loop” also contains a useful discussion of this question.

30 Schmitt and Thurnher, “Out of the Loop,” pp. 265–66; Markus Wagner, “Taking Humans Out of the Loop: Implications for International Humanitarian Law,” Journal of Law, Information and Science 21, no. 2 (2011).

31 Robert Sparrow, “Twenty Seconds to Comply: Autonomous Weapon Systems and the Recognition of Surrender,” International Law Studies 91 (2015).

32 The importance of this requirement is noted in Marchant et al., “International Governance of Autonomous Military Robots,” p. 282.

33 Arkin, Governing Lethal Behavior in Autonomous Robots.

34 For an account of what would be required to produce “ethical robots” that is more sympathetic to the idea than I am here, see Wendell Wallach and Colin Allen, Moral Machines: Teaching Robots Right from Wrong (New York: Oxford University Press, 2009).

35 Raimond Gaita, Good and Evil: An Absolute Conception, 2nd ed. (Abingdon, U.K.: Routledge, 2004), pp. 264–82.

36 Arkin, Governing Lethal Behavior in Autonomous Robots, pp. 203–209.

37 Donald P. Brutzman et al., “Run-Time Ethics Checking for Autonomous Unmanned Vehicles: Developing a Practical Approach” (paper presented at the 18th International Symposium on Unmanned Untethered Submersible Technology, Portsmouth, New Hampshire, 2013); Alex Leveringhaus and Tjerk de Greef, “Keeping the Human ‘in-the-Loop’: A Qualified Defence of Autonomous Weapons,” in Mike Aaronson et al., eds., Precision Strike Warfare and International Intervention: Strategic, Ethico-Legal, and Decisional Implications (Abingdon, U.K.: Routledge, 2015).

38 Adams, “Future Warfare and the Decline of Human Decisionmaking.”

39 Anderson and Waxman, “Law and Ethics for Autonomous Weapon Systems,” p. 7; Schmitt and Thurnher, “Out of the Loop,” p. 238; Work and Brimley, 20YY, p. 24.

40 See also Brutzman et al., “Run-Time Ethics Checking for Autonomous Unmanned Vehicles.” This is not to deny that there would be some military advantages associated with the development of such systems, as long as the communications infrastructure necessary to allow contact with a human operator as required was in place. For instance, removing the need for direct human supervision would multiply the number of systems that could operate within a given amount of bandwidth and would also make it possible for one human operator to oversee the activities of a number of robots.

41 Guarini and Bello, “Robotic Warfare”; Human Rights Watch, Losing Humanity; Sharkey, “The Evitability of Autonomous Robot Warfare.” Indeed, I have argued this myself elsewhere. See Robert Sparrow, “Robotic Weapons and the Future of War,” in Jessica Wolfendale and Paolo Tripodi, eds., New Wars and New Soldiers: Military Ethics in the Contemporary World (Surrey, U.K.: Ashgate, 2011).

42 Michael N. Schmitt, “Autonomous Weapon Systems and International Humanitarian Law: A Reply to the Critics,” Harvard National Security Journal online, February 5, 2013.

43 Brutzman et al., “Run-Time Ethics Checking for Autonomous Unmanned Vehicles.”

44 Guarini and Bello, “Robotic Warfare.”

45 Leveringhaus and De Greef, “Keeping the Human ‘in-the-Loop.’”

46 To the extent that this requirement is not recognized by the Law of Armed Conflict, the legal barriers to the ethical use of AWS in sufficiently restricted domains will be correspondingly lower.

47 Ronald C. Arkin, “Lethal Autonomous Systems and the Plight of the Non-Combatant,” AISB Quarterly 137 (2013).

48 Arkin, “The Case for Ethical Autonomy in Unmanned Systems.”

49 The argument in this paragraph owes much to remarks made by Daniel Brunstetter in a session on drone warfare at the International Studies Association Annual Convention in San Francisco in April 2013. See also Megan Braun and Daniel R. Brunstetter, “Rethinking the Criterion for Assessing CIA-Targeted Killings: Drones, Proportionality and Jus Ad Vim,” Journal of Military Ethics 12, no. 4 (2013). George Lucas subsequently brought it to my attention that he had in fact rehearsed this argument in a paper in 2011. See George R. Lucas Jr., “Industrial Challenges of Military Robotics,” Journal of Military Ethics 10, no. 4 (2011).

50 Lucas Jr., “Industrial Challenges of Military Robotics.”

51 Brian G. Williams, Predators: The CIA's Drone War on Al Qaeda (Washington, D.C.: Potomac Books, 2013).

52 Williams's argument proceeds by providing evidence of actual consent, but in most cases the argument will need to proceed by way of reference to “hypothetical consent”—that is, what civilians in the area of operations would prefer.

53 Sparrow, “Robotic Weapons and the Future of War.”

54 Singer, Wired for War, p. 319; Sparrow, “Predators or Plowshares?”

55 Paul W. Kahn, “The Paradox of Riskless Warfare,” Philosophy & Public Policy Quarterly 22, no. 3 (2002).

56 Bradley J. Strawser, “Moral Predators: The Duty to Employ Uninhabited Aerial Vehicles,” Journal of Military Ethics 9, no. 4 (2010).

57 Thomas Nagel, “War and Massacre,” Philosophy & Public Affairs 1, no. 2 (1972).

58 Ibid., p. 136.

59 Ibid., p. 138.

60 Peter Asaro, “On Banning Autonomous Weapon Systems: Human Rights, Automation, and the Dehumanization of Lethal Decision-Making,” International Review of the Red Cross 94, no. 886 (2012), p. 701.

61 Asaro, “On Banning Autonomous Weapon Systems.”

62 Mary Ellen O'Connell, “Banning Autonomous Killing: The Legal and Ethical Requirement That Humans Make Near-Time Lethal Decisions,” in Matthew Evangelista and Henry Shue, eds., The American Way of Bombing: Changing Ethical and Legal Norms, from Flying Fortresses to Drones (Ithaca, N.Y.: Cornell University Press, 2014).

63 Arkin, “Lethal Autonomous Systems and the Plight of the Non-Combatant”; Vik Kanwar, “Post-Human Humanitarian Law: The Law of War in the Age of Robotic Weapons,” Harvard National Security Journal 2, no. 2 (2011), pp. 619–20; Schmitt and Thurnher, “Out of the Loop,” p. 268.

64 She should, of course, be appropriately confident that the persons at the facility or location she is attacking are legitimate targets.

65 Indeed, the stockpiling and use of antipersonnel mines at least is prohibited by the Ottawa Treaty.

66 Defense Science Board, The Role of Autonomy, pp. 1–2.

67 Sparrow, “Killer Robots.”

68 Roff, “Killing in War.”

69 Matthias, “The Responsibility Gap.”

70 Thomas Hellström, “On the Moral Responsibility of Military Robots,” Ethics and Information Technology 15, no. 2 (2013); Leveringhaus and De Greef, “Keeping the Human ‘in-the-Loop’”; Gert-Jan Lokhorst and Jeroen van den Hoven, “Responsibility for Military Robots,” in Lin, Abney, and Bekey, eds., Robot Ethics; Sparrow, “Killer Robots.”

71 Nagel, “War and Massacre,” p. 135, note 7.

72 Charli Carpenter, “US Public Opinion on Autonomous Weapons” (University of Massachusetts, 2013), www.duckofminerva.com/wp-content/uploads/2013/06/UMass-Survey_Public-Opinion-on-Autonomous-Weapons.pdf.

73 Gubrud, “Stopping Killer Robots,” p. 40; Aaron M. Johnson and Sidney Axinn, “The Morality of Autonomous Robots,” Journal of Military Ethics 12, no. 2 (2013); Sparrow, “Robotic Weapons and the Future of War.”

74 Asaro, “On Banning Autonomous Weapon Systems.”

75 Schmitt and Thurnher, “Out of the Loop,” p. 10.

76 Gubrud, “Stopping Killer Robots”; Human Rights Watch, Losing Humanity; Wendell Wallach, “Terminating the Terminator: What to Do About Autonomous Weapons,” Science Progress, January 29, 2013. The legal basis for doing so might be found in the “Martens Clause” in the Hague Convention, which prohibits weapons that are “contrary to the dictates of the public conscience.” Human Rights Watch, Losing Humanity.

77 Arkin, “Lethal Autonomous Systems and the Plight of the Non-Combatant”; Anderson and Waxman, “Law and Ethics for Robot Soldiers”; Anderson and Waxman, “Law and Ethics for Autonomous Weapon Systems”; Schmitt and Thurnher, “Out of the Loop.”

78 The moral status of nuclear weapons remains controversial in some quarters. However, the last two decades of wars justified with reference to states’ possession of “weapons of mass destruction” suggest that there is an emerging international consensus that such weapons are mala in se.

79 Johnson and Axinn, “The Morality of Autonomous Robots,” p. 137.

80 Should this campaign fail, it is possible that public revulsion at sending robots to kill people will be eroded as AWS come into use and become a familiar feature of war—as has occurred with a number of weapons, including artillery and submarines, in the past. In that case, the argument that such killing disrespects the humanity of our enemies will eventually lapse as the social conventions around respect for the humanity of combatants are transformed. It might be argued that even if a prohibition on AWS is achieved, conventional understandings of the appropriate relations between humans and robots may shift in the future as people become more familiar with robots in civilian life. While this cannot be ruled out a priori, I suspect that it is more likely that outrage at robots being allowed to kill humans will only intensify as a result of the social and psychological incentives to maintain the distinction between “us” and “them.”

81 Arkin, “Lethal Autonomous Systems and the Plight of the Non-Combatant.”

82 For cynicism about the prospect of such, see Anderson and Waxman, “Law and Ethics for Autonomous Weapon Systems”; Arkin, “Lethal Autonomous Systems and the Plight of the Non-Combatant”; Marchant et al., “International Governance of Autonomous Military Robots.” For a countervailing perspective, see O'Connell, “Banning Autonomous Killing.”

83 Wendell Wallach and Colin Allen, “Framing Robot Arms Control,” Ethics and Information Technology 15, no. 2 (2013); Schmitt and Thurnher, “Out of the Loop.”

84 For an important contribution to this project, see Jürgen Altmann, “Arms Control for Armed Uninhabited Vehicles: An Ethical Issue,” Ethics and Information Technology 15, no. 2 (2013).

* Thanks are due to Ron Arkin, Daniel Brunstetter, Ryan Jenkins, Mark Gubrud, Duncan Purves, Heather Roff, B. J. Strawser, Michael Schmitt, Ryan Tonkens, several anonymous referees, and the editors of Ethics & International Affairs for comments on drafts of this manuscript. Mark Howard ably assisted me with sources and with preparing the article for publication.
