On banning autonomous weapon systems: human rights, automation, and the dehumanization of lethal decision-making


This article considers the recent literature concerned with establishing an international prohibition on autonomous weapon systems. It seeks to address concerns expressed by some scholars that such a ban might be problematic for various reasons. It argues in favour of a theoretical foundation for such a ban based on human rights and humanitarian principles that are not only moral, but also legal ones. In particular, an implicit requirement for human judgement can be found in international humanitarian law governing armed conflict. Indeed, this requirement is implicit in the principles of distinction, proportionality, and military necessity that are found in international treaties, such as the 1949 Geneva Conventions, and firmly established in international customary law. Similar principles are also implicit in international human rights law, which ensures certain human rights for all people, regardless of national origins or local laws, at all times. I argue that the human rights to life and due process, and the limited conditions under which they can be overridden, imply a specific duty with respect to a broad range of automated and autonomous technologies. In particular, there is a duty upon individuals and states in peacetime, as well as combatants, military organizations, and states in armed conflict situations, not to delegate to a machine or automated process the authority or capability to initiate the use of lethal force independently of human determinations of its moral and legal legitimacy in each and every case. I argue that it would be beneficial to establish this duty as an international norm, and express this with a treaty, before the emergence of a broad range of automated and autonomous weapon systems that are likely to pose grave threats to the basic rights of individuals.


2 Jürgen Altmann, Peter Asaro, Noel Sharkey and Robert Sparrow, Mission Statement of the International Committee for Robot Arms Control, 2009, available at: (this and all links last visited June 2012).

3 Ronald C. Arkin, Governing Lethal Behavior in Autonomous Robots, CRC Press, 2009; Gary Marchant, Braden Allenby, Ronald C. Arkin, Edward T. Barrett, Jason Borenstein, Lyn M. Gaudet, Orde F. Kittrie, Patrick Lin, George R. Lucas, Richard M. O'Meara and Jared Silberman, ‘International governance of autonomous military robots’, in Columbia Science and Technology Law Review, 30 December 2010, available at:; Kenneth Anderson and Matthew C. Waxman, ‘Law and ethics for robot soldiers’, in Policy Review, 28 April 2012, available at:

4 The human rights currently recognized in international law include, but are not limited to, the rights enshrined in the United Nations International Bill of Human Rights, which contains the Universal Declaration of Human Rights, the International Covenant on Civil and Political Rights, and the International Covenant on Economic, Social and Cultural Rights.

5 The term ‘autonomous’ is used by engineers to designate systems that operate without direct human control or supervision. Engineers also use the term ‘automated’ to distinguish unsupervised systems or processes that involve repetitive, structured, routine operations without much feedback information (such as a dishwasher), from ‘robotic’ or ‘autonomous’ systems that operate in dynamic, unstructured, open environments based on feedback information from a variety of sensors (such as a self-driving car). Regardless of these distinctions, all such systems follow algorithmic instructions that are almost entirely fixed and deterministic, apart from their dependencies on unpredictable sensor data, and narrowly circumscribed probabilistic calculations that are sometimes used for learning and error correction.

6 I use the term ‘autonomous weapon system’ rather than simply ‘autonomous weapon’ to indicate that the system may be distributed amongst disparate elements that nonetheless work together to form an autonomous weapon system. For instance, a computer located almost anywhere in the world could receive information from a surveillance drone, and use that information to initiate and direct a strike from a guided weapon system at yet another location, all without human intervention or supervision, thereby constituting an autonomous weapon system. That is, the components of an autonomous weapon system – the sensors, autonomous targeting and decision-making, and the weapon – need not be directly attached to each other or co-located, but merely connected through communications links.

7 Jakob Kellenberger, ‘Keynote Address’, International Humanitarian Law and New Weapon Technologies, 34th Round Table on Current Issues of International Humanitarian Law, San Remo, Italy, 8–10 September 2011, pp. 5–6, available at:

8 Paul Scharre, ‘Why unmanned’, in Joint Force Quarterly, Issue 61, 2nd Quarter, 2011, p. 92.

9 Peter Asaro, ‘How just could a robot war be?’, in Adam Briggle, Katinka Waelbers and Philip A. E. Brey (eds), Current Issues in Computing and Philosophy, IOS Press, Amsterdam, 2008, pp. 50–64, available at:

10 By analogy, one should consider the stock market ‘Flash Crash’ of 6 May 2010, in which automated high-frequency trading systems escalated and accelerated a 1,000-point drop in the Dow Jones Industrial Average (about 9%), at the time the largest intraday point drop in its history. See Wikipedia, ‘Flash Crash’, available at:

11 Noel Sharkey, ‘Death strikes from the sky: the calculus of proportionality’, in IEEE Technology and Society Magazine, Vol. 28, No. 1, 2009, pp. 16–19; Noel Sharkey, ‘Saying “no!” to lethal autonomous targeting’, in Journal of Military Ethics, Vol. 9, No. 4, 2010, pp. 369–383; Markus Wagner, ‘Taking humans out of the loop: implications for international humanitarian law’, in Journal of Law, Information and Science, Vol. 21, 2011, available at:; Matthew Bolton, Thomas Nash and Richard Moyes, ‘Ban autonomous armed robots’, 5 March 2012, available at:

12 See in particular Articles 51 and 57 of Additional Protocol I to the Geneva Conventions, which address the protection of the civilian population and precautions in attack. Protocol Additional to the Geneva Conventions of 12 August 1949, and Relating to the Protection of Victims of International Armed Conflicts, 8 June 1977, 1125 UNTS 3 (entered into force 7 December 1978), available at:

13 The full text of Article 36 of Additional Protocol I on New Weapons reads: ‘In the study, development, acquisition or adoption of a new weapon, means or method of warfare, a High Contracting Party is under an obligation to determine whether its employment would, in some or all circumstances, be prohibited by this Protocol or by any other rule of international law applicable to the High Contracting Party’.

14 Richard M. O'Meara, ‘Contemporary governance architecture regarding robotics technologies: an assessment’, in Patrick Lin, Keith Abney and George Bekey (eds), Robot Ethics, MIT Press, Cambridge MA, 2011, pp. 159–168.

15 Robert Sparrow, ‘Killer robots’, in Journal of Applied Philosophy, Vol. 24, No. 1, 2007, pp. 62–77.

16 M. Wagner, above note 11, p. 5.

17 In the language of Article 36 of Additional Protocol I to the Geneva Conventions, autonomous weapon systems are subject to review on the basis of being a ‘new weapon, means or method of warfare’. This implies that using an existing approved weapon in a new way, i.e. with autonomous targeting or firing, is itself subject to review as a new means or method.

18 Julian C. Cheater, ‘Accelerating the kill chain via future unmanned aircraft’, Blue Horizons Paper, Center for Strategy and Technology, Air War College, April 2007, p. 5, available at:

19 R. C. Arkin, above note 3, pp. 71–91.

20 J. Kellenberger, above note 7, p. 6.

21 Indeed, there is a tendency in the literature on autonomous weapons to refer to ‘discrimination’ rather than the principle of distinction, which connotes the ‘discrimination task’ in cognitive psychology and artificial intelligence. See Noel Sharkey's opinion note in this volume.

22 Nils Melzer, Interpretive Guidance on the Notion of Direct Participation in Hostilities Under International Humanitarian Law, ICRC, Geneva, 2009, p. 20, available at:

23 Idem., pp. 50–64.

24 Brian Burridge, ‘UAVs and the dawn of post-modern warfare: a perspective on recent operations’, in RUSI Journal, Vol. 148, No. 5, October 2003, pp. 18–23.

25 Danielle Keats Citron, ‘Technological due process’, in Washington University Law Review, Vol. 85, 2008, pp. 1249–1292.

26 Ronald C. Arkin, ‘Governing lethal behavior: embedding ethics in a hybrid deliberative/reactive robot architecture’, Georgia Institute of Technology, Technical Report GIT-GVU-07-11, 2007, p. 11.

27 Human Rights Watch, ‘International humanitarian law issues in the possible U.S. invasion of Iraq’, in Lancet, 20 February 2003.

28 Bradley Jay Strawser, ‘Moral predators: the duty to employ uninhabited aerial vehicles’, in Journal of Military Ethics, Vol. 9, No. 4, 2010, pp. 342–368.

29 Peter Asaro, ‘Modeling the moral user: designing ethical interfaces for tele-operation’, in IEEE Technology and Society Magazine, Vol. 28, No. 1, 2009, pp. 20–24, available at:

30 K. Anderson and M. C. Waxman, above note 3, p. 13.

31 Idem., p. 2.

32 M. Wagner, above note 11, pp. 5–9.

33 For comparison, consider electric cars, a technology that has existed for a century. Even with the recent popularity of hybrid gas/electric cars, and some highly capable electric cars, few people would endorse the claim that our transition to electric cars is inevitable. And this is a technology that is already possible, i.e. it exists.

34 P. Asaro, above note 29, pp. 20–24.

35 K. Anderson and M. C. Waxman, above note 3, p. 2.

36 Idem., p. 11.

37 M. Bolton, T. Nash and R. Moyes, above note 11.

International Review of the Red Cross
  • ISSN: 1816-3831
  • EISSN: 1607-5889