Drones everywhere

Obama: Sending Drones to Bomb Libya

It’s been a busy week! On Monday the Guardian highlighted the MoD’s new document on the need for serious discussion of the ethics and legality of using drones (see post below), a story which was taken up by the Daily Mail, The Telegraph and many others.

On Wednesday evening James Cameron tweeted that this was the week in which, in his Terminator movies, autonomous armed machines rose up against humanity. This sparked a lot of interest in autonomous weapon systems, including an article on drones on the BBC website.

And then on Thursday night Barack Obama approved the use of armed Predator drones in Libya. In between media interviews I’ve been watching this blog’s stats go through the roof!

Meanwhile, drone strikes continue in Afghanistan and Pakistan. The MoD confirmed last week that UK drones have now conducted 167 armed strikes in Afghanistan. US strikes continue intermittently in Pakistan, with the latest strike killing 25 people, including, according to some reports, five children. This strike comes almost a month after a strike on March 17th killed 44 people, including many civilians. Relations between the US and Pakistan have reached crisis point over the drone strikes, with many Pakistanis calling for the air force to shoot the drones down. Coincidentally, Col Grant Webb, Commander of the US Joint UAS Centre of Excellence, announced that Operation Blue Knight, a regular drone training exercise, will this year feature F-15s and F-16s trying to shoot down drones.

UK Drone Strikes in Afghanistan

However, a story that has passed almost unnoticed in the mass of recent drone stories is the killing of two US servicemen by a US drone in Afghanistan in the first week of April. A key argument of those who support drone strikes is that they do not make mistakes, as the drone’s incredibly accurate cameras can show the target in great detail, allowing strikes to be made with pin-point accuracy. US spokespeople, when denying that civilians have been killed, use this argument time and again. The killing of two US soldiers by a US drone in a so-called ‘friendly fire’ incident shows that drones are far from infallible.

Talking of fallibility, another Reaper drone has crashed – this time on a training mission in New Mexico.  Time to update the drone crash database.

I’ll be back…..

New MoD document recognises legal, ethical and moral issues raised by use of armed drones

The UK Approach to Unmanned Aircraft Systems

The UK Ministry of Defence has published a new document to “inform and prompt wider debate” on military unmanned aerial vehicles (UAVs), commonly known as drones. The UK Approach to Unmanned Aircraft Systems is a Joint Doctrine Note (JDN) that examines technological and scientific issues related to the current and future use of armed and unarmed drones. It also sets out, for the first time, what the MoD sees as the legal, moral and ethical issues that arise from using such systems.

Arguing that unmanned aircraft now hold a central role in modern warfare, it states “there is a real possibility that, after many false starts and broken promises, a technological tipping point is approaching that may well deliver a genuine revolution in military affairs.”

The publication of this report is very much to be welcomed, in particular its recognition of the serious moral, ethical and legal issues at stake in the growing use of unmanned drones and autonomous systems. At just over 100 pages long the document covers a lot of ground, but in this initial review I want to focus on three particular issues.

Framing the Debate: On not calling a spade a spade

As has been the case for some time, the use of the term ‘drone’ is an absolute ‘no no’ within the military. While ‘unmanned aircraft’ or ‘unmanned aerial system’ is seen as acceptable, the document suggests that ‘remotely piloted aircraft’ is the appropriate term “when talking to the media.” While this may in part be to avoid confusion, it is also intended to counter one of the key weaknesses in the future development of UAVs identified by the document: the “public perception issue”. By avoiding the term ‘drone’ it is perhaps hoped that negative perceptions of the ‘killer drones’ variety can simply be sidestepped.

The document also argues strongly against the idea that any drones currently under development could or should be called ‘autonomous’, suggesting instead that they are in fact merely ‘automated’. “Those [drones] that carry out their entire mission from take-off to landing without human intervention may be said to be fully automated,” it argues. Taking what could be called a maximalist approach to the issue of autonomy, the document argues that machines or systems can only truly be called autonomous when they are self-aware or their understanding is indistinguishable from that of humans:

“Autonomous systems will, in effect, be self-aware and their response to inputs indistinguishable from, or even superior to, that of a manned aircraft. As such, they must be capable of achieving the same level of situational understanding as a human” says the document.  

This would be a substantially different definition of ‘autonomy’ from the one used by many scientists and companies involved in developing autonomous systems, as the document itself recognises: “Companies may describe their systems to be autonomous even though they would not be considered as such under the military definition.”

I imagine this position is taken, again, partly for reasons of public perception. However, there are other key reasons for not wanting to label drones as autonomous, as the document clearly recognises: “The distinction between autonomous and automated is important as there are moral, ethical and legal implications regarding the use of autonomous unmanned aircraft.”

While this new document is an important step forward by the MoD in acknowledging that there are legal, ethical and moral issues associated with the growing use of drones, at the same time it is an attempt to frame the debate and keep it on the MoD’s own terms.

Humans:  In, on, or out of the loop?

Legally, humans are required to make the final decision on the firing of weapons from drones. This is known as humans being ‘in the loop’. However, we know that industry is developing systems that will mean humans moving from being ‘in the loop’ to being ‘on the loop’, that is, monitoring several armed drones at the same time. The new document notes this change and acknowledges that the growing development of autonomous (sorry, automated) drones means that the legal requirement is “being eroded”.

At one point the document states clearly that the MoD has no plans to enable drones to make decisions independently about firing their weapons:

“It should be noted that the MOD currently has no intention to develop systems that operate without human intervention in the weapon command and control chain, but it is looking to increase levels of automation where this will make systems more effective”.

But the issue of drones deciding for themselves whether to launch weapons is not completely ruled out, as this key passage shows:

“A human-authorised [drone] attack would be no different to that by a manned aircraft and would be fully compliant with the LOAC [Laws of Armed Conflict], provided the human believed that, based on the information available, the attack met LOAC requirements and extant ROE [Rules of Engagement].  From this position, it would be only a small technical step to enable an unmanned aircraft to fire a weapon based solely on its own sensors, or shared information, and without recourse to higher, human authority. Provided it could be shown that the controlling system appropriately assessed the LOAC principles (military necessity; humanity; distinction and proportionality) and that ROE were satisfied, this would be entirely legal.

“In practice, such operations would present a considerable technological challenge and the software testing and certification for such a system would be extremely expensive as well as time consuming. Meeting the requirement for proportionality and distinction would be particularly problematic, as both of these areas are likely to contain elements of ambiguity requiring sophisticated judgement. Such problems are particularly difficult for a machine to solve and would likely require some form of artificial intelligence to be successful. Estimates of when artificial intelligence will be achieved (as opposed to complex and clever automated systems) vary, but the consensus seems to lie between more than 5 years and less than 15 years, with some outliers far later than this… Until such a capability is achieved it is likely that, apart from some niche tasks, human intervention will continue to be required at key stages of an unmanned aircraft’s mission if it involves weapon-delivery.”

There are very serious legal, not to mention ethical and moral, issues raised by the prospect of unmanned systems deciding for themselves whether to launch their weapons. The MoD’s assurance that it is not ‘currently’, as it puts it, developing such systems, while at the same time blurring the distinction between ‘autonomous’ and ‘automated’, is unhelpful. This, together with the fact that exploration of the “technological challenge” of achieving such a capability appears to be continuing, is extremely worrying. It would be helpful if the MoD simply, clearly and unambiguously ruled out the idea of humans being ‘out of the loop’ when it comes to launching weapons.

Will Remote War Mean More War?

We have argued for some time that the geographical and psychological distance between the drone operator launching weapons and the point of attack may in practice lower the threshold for launching weapons. In addition, the fact that remote war is undertaken at no risk to one’s own forces may also mean that there is a greater temptation to undertake armed attacks and assassinations. The authors of the document raise this issue too, in a section on ethical and moral issues:

“One of the contributory factors in controlling and limiting aggressive policy is the risk to one’s own forces. It is essential that, before unmanned systems become ubiquitous (if it is not already too late) that we consider this issue and ensure that, by removing some of the horror, or at least keeping it at a distance, that we do not risk losing our controlling humanity and make war more likely.”

However, the document also argues that this negative must be “tempered” by the fact that “the use of unmanned aircraft prevents the potential loss of aircrew lives and is thus in itself morally justified.”

The authors argue that “what is needed is a clear understanding of the issues involved so that informed decisions can be made.” We would, of course, support this point, and would argue that in a democratic society these important decisions should not be left to military or legal experts alone; there needs to be a genuine public debate.

Further comments

I will be reflecting further on this fascinating insight into the MoD’s thinking on drones over the next few weeks. I’d be really interested in your comments too!

UK Reapers notch up 20,000 flying hours

The UK MoD has announced today that UK Reaper drones have now notched up over 20,000 hours flying over Afghanistan since they were first deployed in October 2007.

In the self-congratulatory announcement Air Vice-Marshal Phil Osborn makes the point that by flying over Afghanistan, the Reaper drone is “saving lives” and “making a real difference.” He is, of course, lest there be any confusion, referring to the lives of British troops, not ordinary Afghans:

“The real-time, day and night video coverage of the battle space, combined with the extensive use of onboard radar, provides a unique, cost effective and sustained capability that enhances the safety of troops on the ground. This cutting-edge remotely-piloted aircraft provides an impressive range of capabilities that are saving lives and making a real difference to the troops in Afghanistan.”

Not a word, of course, about the casualties of Britain’s drone wars, whether civilians or ‘militants’. The only indication that there have been victims came from David Cameron’s boast to journalists in December that more than 124 insurgents had been killed in British drone attacks. All questions about these 124 ‘insurgents’, and about whether there have been any other civilian casualties, have simply gone unanswered.

Meanwhile, the RAF’s Project Daedalus, which investigated whether non-pilots could be trained to fly unmanned aerial systems as well as fully trained pilots, has been ‘completed’. According to the RAF, the programme

“has successfully demonstrated that selection and training can generate remote pilots who, despite undergoing a different sort of training, are as highly trained and equally skilled as traditional pilots in that field.”

This could have far-reaching implications, given that in both the US and the UK current drone pilots are drawn only from those with previous fast-jet flying experience. An interview, from April 2010, with those involved in the training can be read here.

Must mention, finally,  Steve Bell’s wonderful cartoon comment on David Cameron’s trip this week to Pakistan:

© Steve Bell

Reaper and Predator drone manufacturer opens UK office

Neal Blue, CEO and owner of General Atomics, with a Predator drone

The makers of the Predator and Reaper drones, General Atomics, have announced that they have opened an office in London (although they omit to say exactly where it is!). Their brief press notice says:

General Atomics Aeronautical Systems, Inc. (GA‑ASI), a leading manufacturer of unmanned aircraft systems (UAS), tactical reconnaissance radars, and surveillance systems, today announced that General Atomics Aeronautical Systems UK Ltd (GA-UK), an affiliated entity, has been established with an office in London.  The office will be managed by Dr. Jonny King. 

 “We are pleased that the London office will provide dedicated support for the Ministry of Defence’s [MoD’s] Remotely Piloted Air Systems [RPAS] requirements,” said Neal Blue, Chairman and CEO of GA-ASI.

GA-ASI has delivered a total of six aircraft to the MoD since the first UK Predator® B/ MQ-9 Reaper UAS was deployed to Afghanistan in October 2007, with the fleet expected to nearly double in size over the next few years.  The aircraft have logged over 17,000 flight hours to date in support of UK forces on the ground.

A brief web search on Jonny King reveals that he previously worked at QinetiQ on Autonomous Systems and Intelligent Vehicles, and most recently at Cobham, where he was responsible for developing Cobham’s Unmanned Systems business. Cobham has been a long-time partner of General Atomics on the Reaper programme.

Last summer General Atomics announced that it was looking for a UK partner for its bid on the future replacement for the Reaper UAV, for which it is offering its Avenger drone in competition with BAE Systems’ Mantis.