New MoD document recognises legal, ethical and moral issues raised by use of armed drones

The UK Approach to Unmanned Aircraft Systems (PDF)

The UK Ministry of Defence has published a new document to “inform and prompt wider debate” on military unmanned aerial vehicles (UAVs), commonly known as drones. The UK Approach to Unmanned Aircraft Systems is a Joint Doctrine Note (JDN) that examines technological and scientific issues related to the current and future use of armed and unarmed drones. It also sets out, for the first time, what the MoD sees as the legal, moral and ethical issues that arise from using such systems.

Arguing that unmanned aircraft now hold a central role in modern warfare, it states “there is a real possibility that, after many false starts and broken promises, a technological tipping point is approaching that may well deliver a genuine revolution in military affairs.”

The publication of this report is very much to be welcomed, in particular its recognition of the serious moral, ethical and legal issues at stake in the growing use of unmanned drones and autonomous systems. At just over 100 pages long, the document covers a lot of ground, but in this initial review I want to focus on three particular issues.

Framing the Debate: On not calling a spade a spade

As has been the case for some time, the term ‘drone’ is an absolute ‘no-no’ within the military. While ‘unmanned aircraft’ or ‘unmanned aerial system’ is seen as acceptable, the document suggests that ‘remotely piloted aircraft’ is the appropriate term “when talking to the media.” While this may well be in part to avoid confusion, it is also intended to counter one of the key weaknesses in the future development of UAVs identified by the document: the “public perception issue”. By avoiding the term ‘drone’, it is perhaps hoped that negative perceptions of the ‘killer drones’ variety can simply be avoided.

The document also argues strongly against the idea that any drones currently under development could or should be called ‘autonomous’, suggesting instead that they are in fact merely ‘automated’. “Those [drones] that carry out their entire mission from take-off to landing without human intervention may be said to be fully automated,” it argues. Taking what could be called a maximalist approach to the question of autonomy, the document argues that machines or systems can only truly be called autonomous when they are self-aware or their understanding is indistinguishable from a human’s:

“Autonomous systems will, in effect, be self-aware and their response to inputs indistinguishable from, or even superior to, that of a manned aircraft. As such, they must be capable of achieving the same level of situational understanding as a human” says the document.  

This is a substantially different definition of ‘autonomy’ from that used by many of the scientists and companies involved in developing autonomous systems, as the document itself recognises: “Companies may describe their systems to be autonomous even though they would not be considered as such under the military definition.”
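To see how sharp a line the document draws, here is a minimal sketch (in Python; the enum and its labels are my own illustration of the JDN’s usage, not anything taken from the document itself) of the three categories of control it effectively distinguishes:

```python
from enum import Enum, auto

class ControlMode(Enum):
    """Illustrative taxonomy only; labels paraphrase the JDN's usage."""
    REMOTELY_PILOTED = auto()  # a human flies the aircraft from the ground
    AUTOMATED = auto()         # flies a pre-planned mission from take-off
                               # to landing without human intervention
    AUTONOMOUS = auto()        # per the JDN: self-aware, with human-level
                               # situational understanding; no such system exists
```

On this taxonomy, every system an industry brochure describes as ‘autonomous’ would be filed under AUTOMATED.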

I imagine this position is taken, again, partly for public perception reasons. However, there are other key reasons for not wanting to label drones as autonomous, as the document clearly recognises: “The distinction between autonomous and automated is important as there are moral, ethical and legal implications regarding the use of autonomous unmanned aircraft.”

While this new document is an important step forward by the MoD in acknowledging that there are legal, ethical and moral issues associated with the growing use of drones, it is also clear that the MoD wants to frame the debate and keep it on its own terms.

Humans: In, on, or out of the loop?

Legally, humans are required to make the final decision on the firing of weapons from drones. This is known as humans being ‘in the loop’. However, we know that industry is developing systems that will move humans from being ‘in the loop’ to being ‘on the loop’, that is, monitoring several armed drones at the same time. The new document notes this change and acknowledges that the growing development of autonomous (sorry, automated) drones means that this legal requirement is “being eroded”.
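To make the jargon concrete, here is a purely illustrative sketch (Python; every name is my own invention, not drawn from any real weapon system) of the three arrangements as a control-flow question: who holds the default?

```python
from dataclasses import dataclass

@dataclass
class Engagement:
    """A proposed weapon release generated by a drone's sensors."""
    target_id: str
    operator_approved: bool = False  # a human has positively authorised this
    operator_vetoed: bool = False    # a monitoring human has intervened

def release_in_the_loop(e: Engagement) -> bool:
    # 'In the loop': the default is NO; nothing fires without a specific
    # human authorisation for this engagement.
    return e.operator_approved

def release_on_the_loop(e: Engagement) -> bool:
    # 'On the loop': the default flips to YES; a human monitoring several
    # aircraft at once can only veto, and only if they notice in time.
    return not e.operator_vetoed

def release_out_of_the_loop(e: Engagement) -> bool:
    # 'Out of the loop': the machine decides on its own sensors alone.
    return True  # no human input consulted at all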
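```

The erosion the document describes is exactly the move from the first function to the second: the human’s role shifts from granting permission to failing to object.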

At one point the document clearly states that the MoD has no plans to enable drones to make independent decisions about firing their weapons:

“It should be noted that the MOD currently has no intention to develop systems that operate without human intervention in the weapon command and control chain, but it is looking to increase levels of automation where this will make systems more effective”.

But the issue of drones deciding for themselves whether to launch weapons is not completely ruled out, as this key passage shows:

“A human-authorised [drone] attack would be no different to that by a manned aircraft and would be fully compliant with the LOAC [Laws of Armed Conflict], provided the human believed that, based on the information available, the attack met LOAC requirements and extant ROE [Rules of Engagement].  From this position, it would be only a small technical step to enable an unmanned aircraft to fire a weapon based solely on its own sensors, or shared information, and without recourse to higher, human authority. Provided it could be shown that the controlling system appropriately assessed the LOAC principles (military necessity; humanity; distinction and proportionality) and that ROE were satisfied, this would be entirely legal.

 In practice, such operations would present a considerable technological challenge and the software testing and certification for such a system would be extremely expensive as well as time consuming. Meeting the requirement for proportionality and distinction would be particularly problematic, as both of these areas are likely to contain elements of ambiguity requiring sophisticated judgement. Such problems are particularly difficult for a machine to solve and would likely require some form of artificial intelligence to be successful. Estimates of when artificial intelligence will be achieved (as opposed to complex and clever automated systems) vary, but the consensus seems to lie between more than 5 years and less than 15 years, with some outliers far later than this…  Until such a capability is achieved it is likely that, apart from some niche tasks, human intervention will continue to be required at key stages of an unmanned aircraft’s mission if it involves weapon-delivery.”
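The “small technical step” in that passage amounts to relocating the LOAC judgement from an operator into software. A deliberately crude sketch (again Python, every name invented for illustration; the ‘humanity’ principle is omitted for brevity) shows where that step breaks down: necessity and ROE compliance might plausibly be checked against mission data, but distinction and proportionality are judgements, not thresholds.

```python
def is_militarily_necessary(target: dict) -> bool:
    # Necessity looks tractable: it can be checked against mission data
    # and pre-authorised target lists.
    return target.get("on_approved_target_list", False)

def within_roe(target: dict) -> bool:
    # ROE compliance can likewise be framed as a rule check.
    return target.get("inside_authorised_engagement_area", False)

def is_distinct_combatant(target: dict) -> bool:
    # Distinction: is this a combatant and not a civilian? There is no
    # computable test; the document itself says this ambiguity requires
    # "sophisticated judgement" and likely artificial intelligence.
    raise NotImplementedError("judgement, not a threshold")

def is_proportionate(target: dict) -> bool:
    # Proportionality: would incidental harm be excessive relative to the
    # anticipated military advantage? The same problem.
    raise NotImplementedError("judgement, not a threshold")

def machine_assess_loac(target: dict) -> bool:
    """The hypothetical gate that would replace 'recourse to higher,
    human authority' in the passage above."""
    return (is_militarily_necessary(target)
            and is_distinct_combatant(target)
            and is_proportionate(target)
            and within_roe(target))
```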

There are very serious legal, not to mention ethical and moral, issues raised by the prospect of unmanned systems deciding for themselves whether to launch their weapons. The MoD’s assurance that it has “no intention” to develop such systems is unhelpful while it simultaneously blurs the distinction between ‘autonomous’ and ‘automated’. This, together with the fact that exploration of the “considerable technological challenge” of achieving such a capability appears to be continuing, is extremely worrying. It would be helpful if the MoD simply, clearly and unambiguously ruled out the idea of humans being ‘out of the loop’ when it comes to launching weapons.

Will Remote War Mean More War?

We have argued for some time that the geographical and psychological distance between the drone operator launching weapons and the point of attack may, in practice, lower the threshold for launching weapons. In addition, the fact that remote war is undertaken at no risk to one’s own forces may create a greater temptation to undertake armed attacks and assassinations. The authors of the document raise this issue too, in a section on ethical and moral issues:

“One of the contributory factors in controlling and limiting aggressive policy is the risk to one’s own forces. It is essential that, before unmanned systems become ubiquitous (if it is not already too late) that we consider this issue and ensure that, by removing some of the horror, or at least keeping it at a distance, that we do not risk losing our controlling humanity and make war more likely.”

However, the document also argues that this negative must be “tempered” by the fact that “the use of unmanned aircraft prevents the potential loss of aircrew lives and is thus in itself morally justified.”

The authors argue that “what is needed is a clear understanding of the issues involved so that informed decisions can be made.” We would, of course, support this point, and would argue that in a democratic society these important decisions should not be a matter for military or legal experts alone: there needs to be a genuine public debate.

Further comments

I will be reflecting further on this fascinating insight into the MoD’s thinking on drones over the next few weeks. I’d be really interested in your comments too!

5 thoughts on “New MoD document recognises legal, ethical and moral issues raised by use of armed drones”

  • excellent site; the issue, though, is economic. As we have been explaining for 20 years now (www.economicstruth.com), the Industrial r=evolution of machines is geared to create organic machines: we made ‘bodies of machines’ in the XIX c., heads of machines (chips=brain, cameras=eyes and mobiles=ears) in the XX c., and the XXI c. will see company-mothers of machines fuse both and make organic robots. Each of those cycles is a generational 72-year cycle:
    – The age of steam and Britain, 1784-1857 (train crash)
    – The age of electrochemical engines and Germany, 1857-1929/37 crash
    – The age of America and electronic machines, 1929-2001/08 crash
    Then companies switch to war machines and start an age of Keynesian militarism (1860s wars and colonialism with armored trains and vaporettes; 1930s fascism with tanks instead of cars; 2000s onwards with terminators instead of PCs).
    Moreover, at any time in history the top predator machine, aka the weapon, has been the most expensive and advanced one, so tanks are advanced cars, and after a crash the same companies turn to Keynesian militarism and make weapons. So industrial corporations are now moving to create blue-collar robots and drones that throw humans out of the labor and war fields. The biological models I discovered when studying at Columbia U. 20 years ago predicted the 2001-08 crisis and the present Keynesianism, but they are basically censored/ignored, as money is not interested: robots make profits both in labor and war fields. And marketing will sell them till we arrive at a terminator situation, with robots having survival programs and being designed to kill man. And they will. If there were no censorship of sustainable economics and alternative ideologies of humanity, maybe we could survive this 72-year cycle, which, like the previous 72-year cycles of machine evolution, means that by 2073 the planet will have fully armed, conscious robots and we will probably be extinct.
    regards
    luis sancho

  • As a user of small UAS for research, I can say there is a very distinct difference between automated and autonomous. Once you’ve spent hours planning a flight and simulating it before uploading it to fly an automated mission, it is very clear these platforms are not autonomous. The MoD is probably still well ahead of the curve in discussing potential weapons control without human intervention. Anybody working with feature-oriented remote sensing will be able to attest to the difficulty of accurately associating digital data with real-world features. The capacity to do this in real time is a long way off in a fluid, real-world analogue situation.

  • Very interesting website. Indeed, there are ethical issues to consider as wars are conducted at a distance, but the issues are surely no different from those raised when the first tanks appeared in battle during WW1. If you take the argument right back, we would only be fighting each other with fists.

    I think it is inevitable that we are on an unstoppable journey towards robot armies, and the first stage of this process is the drones that we are now seeing used in Afghanistan and elsewhere. Another type of robot that I think we will see in the future is a half-human, half-robot soldier. The Terminators in the James Cameron films may not have arrived in 2011 as he said, but I think he only got his dates wrong by about 20 years!
