
The evitability of autonomous robot warfare*

Abstract

This is a call for the prohibition of autonomous lethal targeting by free-ranging robots. This article will first point out the three main international humanitarian law (IHL)/ethical issues with armed autonomous robots and then move on to discuss a major stumbling block to their evitability: misunderstandings about the limitations of robotic systems and artificial intelligence. This is partly due to a mythical narrative from science fiction and the media, but the real danger is in the language being used by military researchers and others to describe robots and what they can do. The article will look at some anthropomorphic ways that robots have been discussed by the military and then go on to provide a robotics case study in which the language used obfuscates the IHL issues. Finally, the article will look at problems with some of the current legal instruments and suggest a way forward to prohibition.

Footnotes

* The title is an allusion to a short story by Isaac Asimov, ‘The evitable conflict’, where ‘evitable’ means capable of being avoided. Evitability means avoidability.

References

1 Noel Sharkey, ‘The automation and proliferation of military drones and the protection of civilians’, in Journal of Law, Innovation and Technology, Vol. 3, No. 2, 2011, pp. 229–240.

2 Noel Sharkey, ‘Cassandra or the false prophet of doom: AI robots and war’, in IEEE Intelligent Systems, Vol. 23, No. 4, 2008, pp. 14–17.

3 Article 50(1) of the Protocol Additional to the Geneva Conventions of 12 August 1949, and relating to the Protection of Victims of International Armed Conflicts, 8 June 1977 (hereinafter Additional Protocol I).

4 As a scientist I cannot exclude the notion that some black swan event could change my scepticism, but at present we certainly cannot rely on this as a credible option in discussions of lethal force and the protection of innocents.

5 See Noel E. Sharkey, ‘Grounds for Discrimination: Autonomous Robot Weapons’, in RUSI Defence Systems, Vol. 11, No. 2, 2008, pp. 86–89.

6 Ronald C. Arkin, Governing Lethal Behavior in Autonomous Robots, CRC Press, Taylor & Francis Group, Boca Raton, FL, 2009, pp. 47–48.

7 Mark Anderson, ‘How Does a Terminator Know When to Not Terminate’, in Discover Magazine, May 2010, p. 40.

8 Kenneth Anderson and Matthew Waxman, ‘Law and Ethics of Robot Soldiers’, in Policy Review, in press, 2012.

9 See R. C. Arkin, above note 6.

10 Robert Sparrow, ‘Building a Better WarBot: Ethical Issues in the Design of Unmanned Systems for Military Applications’, in Science and Engineering Ethics, Vol. 15, No. 2, 2009, pp. 169–187.

11 See K. Anderson and M. Waxman, above note 8.

12 Amanda Sharkey and Noel Sharkey, ‘Artificial Intelligence and Natural Magic’, in Artificial Intelligence Review, Vol. 25, No. 1–2, 2006, pp. 9–19.

13 Joel Garreau, ‘Bots on the Ground’, in Washington Post, 6 May 2007, available at: http://www.washingtonpost.com/wp-dyn/content/article/2007/05/05/AR2007050501009.html (last visited January 2012).

14 Mark Tilden, personal communication and briefly reported in ibid.

15 Tim Weiner, ‘New model army soldier rolls closer to battle’, in New York Times, 16 February 2005, available at: http://www.nytimes.com/2005/02/16/technology/16robots.html (last visited January 2012).

16 Gary E. Marchant, Braden Allenby, Ronald Arkin, Edward T. Barrett, Jason Borenstein, Lyn M. Gaudet, Orde Kittrie, Patrick Lin, George R. Lucas, Richard O'Meara and Jared Silberman, ‘International governance of autonomous military robots’, in The Columbia Science and Technology Law Review, Vol. 12, 2011, pp. 272–315.

17 See K. Anderson and M. Waxman, above note 8.

18 Drew McDermott, ‘Artificial Intelligence Meets Natural Stupidity’, in J. Haugeland (ed.), Mind Design, MIT Press, Cambridge, MA, 1981, pp. 143–160.

19 See R. C. Arkin, above note 6, pp. 47–48.

20 Ronald C. Arkin, ‘Ethical Robots in Warfare’, in IEEE Technology and Society Magazine, Vol. 28, No. 1, Spring 2009, pp. 30–33.

21 E.g. Robin Murphy and David Woods, ‘Beyond Asimov: the three laws of responsible robotics’, in IEEE Intelligent Systems, Vol. 24, No. 4, July–August 2009, pp. 14–20; Wendell Wallach and Colin Allen, Moral Machines: Teaching Robots Right from Wrong, Oxford University Press, New York, 2009.

22 See R. Murphy and D. Woods, ibid.

23 Task Force Report, ‘The Role of Autonomy in DoD Systems’, Department of Defense – Defense Science Board, July 2012, p. 48, available at: http://www.fas.org/irp/agency/dod/dsb/autonomy.pdf (last visited January 2012).

24 See R. C. Arkin, above note 6, p. 174.

25 Ibid., p. 91.

26 Ibid., p. 176.

27 Ibid., p. 172.

28 Ibid., p. 259.

29 Ibid., p. 174.

30 Ibid., p. 178.

31 Ibid., p. 47.

32 Justin McClelland, ‘The review of weapons in accordance with Article 36 of Additional Protocol I’, in International Review of the Red Cross, Vol. 85, No. 850, 2003, pp. 397–415.

33 I am uncomfortable with this expansion of the automation of killing for a number of other reasons that there is not space to cover in this critique. See, for example, Noel E. Sharkey, ‘Saying “No!” to Lethal Autonomous Targeting’, in Journal of Military Ethics, Vol. 9, No. 4, 2010, pp. 299–313.

34 See also the statement of the International Committee for Robot Arms Control (ICRAC), at the Berlin Expert Workshop, September 2010, available at: http://icrac.net/statements/ (last visited 1 June 2012).

35 Additional Protocol I. This has been signed but not ratified by the US.

36 There is a tip of the hat to the idea that there may be ethical issues in the ‘Unmanned Systems Integrated Roadmap 2009–2034’, but no detailed studies of the law or the issues are proposed.

37 United Nations Convention on Prohibitions or Restrictions on the Use of Certain Conventional Weapons Which May be Deemed to be Excessively Injurious or to Have Indiscriminate Effects, in force since 2 December 1983 and an annex to the Geneva Conventions of 12 August 1949. I thank David Akerson for discussions on this issue.

38 Walter J. Boyne, ‘How the Predator grew teeth’, in Air Force Magazine, Vol. 92, No. 7, July 2009, available at: http://bit.ly/RT78dP (last visited January 2012).

39 International Court of Justice (ICJ), Legality of the Threat or Use of Nuclear Weapons, Advisory Opinion of 8 July 1996, available at: http://www.icj-cij.org/ (last visited 16 May 2012).

40 Ibid., para. 105, subpara. 2E.

41 See N. Sharkey, above note 2.

42 Valerie Insinna, ‘Drone strikes in Yemen should be more controlled, professor says’, interview with Christopher Swift for National Defense Magazine, 10 October 2006, available at: http://tinyurl.com/8gnmf7q (last visited January 2012).


** Thanks for comments on earlier drafts go to Colin Allen, Juergen Altmann, Niall Griffith, Mark Gubrud, Patrick Lin, George Lucas, Illah Nourbakhsh, Amanda Sharkey, Wendell Wallach, Alan Winfield, and to editor-in-chief Vincent Bernard and the team of the International Review of the Red Cross, as well as others who prefer to remain unnamed.
