[SystemSafety] Drones [ was: Short and profound]

From: Peter Bernard Ladkin < >
Date: Thu, 28 May 2015 08:24:03 +0200



On 2015-05-28 02:11 , Les Chambers wrote:
> .... I turn on my radio this morning and hear Stuart Russell, Professor of Computer Science
> at the University of California, Berkley speaking about lethal autonomous weapons systems
> (LAWS).
That would be Berkeley, as in Bishop George, but pronounced differently. And that would be Stuart, old pal and second signature on my Ph.D. dissertation (first one he ever signed, shortly after getting his Berkeley job).

> He has just published an article in the journal Nature.

http://www.nature.com/news/robotics-ethics-of-artificial-intelligence-1.17611

It's a discussion amongst four contributors of the ethics of robotics/AI.

I think he addresses a pressing issue. The can of worms having been opened, let me select choice ones for connoisseurs.

Let me recommend Drone Theory by Gregoire Chamayou (Penguin), as it is titled in the UK; Theory of the Drone (Free Press), as it is titled in the US; a translation by Janet Lloyd from the French. Chamayou is a philosopher with the French state research organisation CNRS.

I am strongly in favor of transparent, well-informed public debate on most issues. Chamayou's book aids this enormously because of his obvious talent, apparent even in translation, for writing lucidly. His book is easier to read than most newspaper articles, even while exhibiting appropriate intellectual rigor.

He considers many of the issues in 23 short chapters of 5-9pp each. He examines various "principles of war", principles which have been advocated through the ages to justify humans killing other humans without it being considered criminal (i.e. slaughter that is not murder), and to legitimise states requiring them to do so. One of the justifications is the special status of battle, in which soldiers are placed in an aggressive environment at risk to themselves, where the way to minimise that risk is to prevail. This mitigates any attribution of murder to the individual fighter, in that he or she did not voluntarily choose that situation and is generally entitled to act in self-defence (this is also a defence against a civilian charge of murder). But it is hard to see how it legitimises a state decision to put large numbers of people in that situation involuntarily; indeed it doesn't. That action is legitimised, if at all, by considerations of statecraft and the state's legitimacy as a general defender of its citizens - as, if you like, a tactic serving the larger obligation to defend one's citizens.

Most of us, even most of us philosophers, are not familiar with the philosophy of war. There is a tradition which runs from Kant and Clausewitz through Michael Walzer's 20th-century classic Just and Unjust Wars to the present day.

Chamayou points out many of the ways in which targeted killings by drones do not fulfil the criteria for legitimated killing within war in this philosophical tradition.

The reality of drone operations is a long way from fitting this high-falutin' intellectual tradition. Operators are not at any personal risk, and conduct assassinations of often unidentified persons who become targets merely through general patterns of behavior, described with examples by Chamayou. It is rather like the old police practice of harassing people hanging around in groups on the street corners of US cities - except that this time the targets are citizens of a foreign country with which the US is not at war, and the action is not harassment but killing.

In other words, there is a lot of material here to be subject to questioning by the thoughtful citizen who might wish to exercise some kind of democratic influence over a government which is engaging in this kind of activity, such as the USA, the UK and possibly Israel. One influence is public debate. Another influence is the organised expression of strong opinion.

> LAWS are the antithesis of functional safety, designed to kill people rather than protect them.
> He says the Israelis already have a LAW trained to loiter in the vicinity of the enemy, lock
> onto radar signatures and take out the installation without further involvement from a human
> being (even if it's set up in a preschool). Of course he'd like to see LAWS banned.
>
> Applying Okri's sentence, I'd offer that: engineering involvement in LAWS development is a
> classical bad idea. Further, the profession should have a code of ethics that prohibits it.

So, let me offer the standard response.

  1. Suppose there is a future war. Let us suppose it is classic, between states X and Y.
  2. State X has a duty to defend and protect its citizens.
  3. It follows from this that X might need to undertake aggressive action which risks some of its citizens/soldiers.
  4. It is generally agreed to be legitimate to undertake such action while minimising the risks to own combatants.
  5. Eliminating specific agents of state Y, that is, assassination, can be one way of fulfilling 2.
  6. Eliminating specific resources of state Y, weapons or munitions, can be another way of fulfilling 2.
  7. Using drones to perform tasks 5 and 6 fulfils 4; using human soldiers cannot fulfil 4 as well.

One of the main problems with the current use of drones, as Chamayou points out, is that their targeting is far less exact than that of humans. Humans use rifles, which take out individual persons, or grenades, which have a destructive radius of 2-3 meters, whereas Hellfire missiles, used by the current generation of drones, have a destructive radius ten times that. So there is inevitably more "collateral damage", violating point 8:

  8. It is generally agreed to be legitimate to undertake such action only under the obligation to minimise the risk to "non-combatants".

The use of drone attacks fulfils 4 well, but at the same time is delegitimised by not minimising the risk to "non-combatants"; other means perform far better.

(What this means is that there is motivation to redefine the notion of "combatant". And that's been happening, as Chamayou shows. Which suggests a faintly rosy lining, namely that those engaging in this behavior often acknowledge the legitimacy of point 8. Some don't, however. From Chamayou I understand that Asa Kasher and Amos Yadlin revise point 8 (Military Ethics of Fighting Terror, J. Military Ethics 4(1):3-32, 2005), but I haven't read the source. Chamayou says Avishai Margalit and Michael Walzer strongly disagree with Kasher and Yadlin's theory (Israel: Civilians and Combatants, NYRB, May 14, 2009); again, I haven't read the source.)

It doesn't obviously follow from any of this that engineering development of aggressive drones is morally wrong. Just as it doesn't obviously follow that engineering development of tanks or armored personnel carriers is morally wrong. Indeed, one could note that the debate on drones echoes two debates of almost exactly a hundred years ago. One is the use of tanks (seen as enabling enormous destructive power at little risk to operators); another is the use of aircraft against not-obviously-military targets (seen as aggression against non-combatants).

This is all tough stuff, and some of the now-public examples are gut-wrenching. Readers please note that I have largely suppressed my view of the rightness or wrongness of any specific behavior or device - I have tried rather to introduce some issues, in line with my view of the value of well-informed public debate.

There are quite a lot of people now involved with the ethics of drone warfare. Here's another philosopher: http://bradleystrawser.com. I am not familiar with much of this work, but might become so.

Another pressing issue on which I still think there has been a dearth of well-informed public debate, especially in the UK, is surveillance. Here let me recommend Bruce Schneier's latest, Data and Goliath, W. W. Norton & Co., 2015.

PBL Prof. Peter Bernard Ladkin, Faculty of Technology, University of Bielefeld, 33594 Bielefeld, Germany Je suis Charlie
Tel+msg +49 (0)521 880 7319 www.rvs.uni-bielefeld.de




The System Safety Mailing List
systemsafety_at_xxxxxx Received on Thu May 28 2015 - 08:24:12 CEST

This archive was generated by hypermail 2.3.0 : Sat Feb 16 2019 - 18:17:07 CET