MoD unit seeks hunter-killer tech for UK drones

CDE presentation on ISTAR requirements (click to download)

The Centre for Defence Enterprise (CDE) is a bit like the Ministry of Defence’s very own Dragons’ Den. It bills itself as “the first point of contact for anyone with a disruptive technology, new process or innovation that has a potential defence application”. In other words, if a boffin, entrepreneur or small company out there thinks they have an idea or design for, say, a new weapons system, they get steered towards the CDE and, if it’s good enough, they get funding.

Over the past couple of years the CDE has begun to host events to nudge inventors, academics and small companies into undertaking research into particular technologies or areas with specific aims in mind. Earlier this month the CDE held a day-long seminar at Cardiff University entitled ‘The Military Challenge for Science and Technology’. The programme stated that the event “was split with a morning session looking, in general, at the opportunities for new science and technology to impact on military capability and an afternoon session presenting two current calls for research proposals in the areas of ISTAR (Intelligence, Surveillance, Target Acquisition and Reconnaissance) and Sensors.”

A presentation from the morning session – giving an overview of the work – is available here.

The afternoon session was much more focused and of particular interest was the call for equipment and sensors that can undertake “automatic (assisted) target recognition of vehicles and people” (slide 22) and the “assisted detection and recognition of people and gestures in urban scenarios” (slide 25).  In a scenario envisaged earlier in the presentation, the companies are told to assume “High Value Target list agreed and maintained” and that the “TOI [Target of Interest] trigger is of sufficient priority to enable priority asset tracking” (slide 6). Click the above images to download the full presentation.

The individual-recognition sensors that the MoD is interested in developing should be capable of being mounted on mobile platforms (presumably including drones) and need to be able to combine “face, gait and shape features” to “identify individuals or reacquire targets from their known signature.” Bizarrely, the presentation also seems to suggest that “X-box ‘kinect’ sensors” may be useful for this work. Video-games warfare indeed! The MoD’s deadline for responses from industry is very short – the closing date for proposals is 27 September, with a demonstration event set for February 2012.
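To give a concrete (and purely illustrative) sense of what combining “face, gait and shape features” might involve at its very simplest, here is a minimal Python sketch of score-level fusion, where independent matchers each score a candidate against a known signature and the scores are combined. All names, weights and the threshold here are my own assumptions for illustration; nothing of the sort is specified in the CDE presentation.

```python
# Hypothetical sketch of score-level fusion for multi-modal recognition.
# The modality names, weights and threshold are illustrative assumptions,
# not taken from the MoD call or any real system.

def fuse_scores(scores, weights):
    """Weighted average of per-modality match scores, each in [0, 1]."""
    total = sum(weights[m] for m in scores)
    return sum(scores[m] * weights[m] for m in scores) / total

def reacquire_target(candidate_scores, weights, threshold=0.8):
    """True if the fused score suggests the candidate matches a known signature."""
    return fuse_scores(candidate_scores, weights) >= threshold

# Example: one candidate scored independently by face, gait and shape matchers.
scores = {"face": 0.9, "gait": 0.7, "shape": 0.8}
weights = {"face": 0.5, "gait": 0.25, "shape": 0.25}
fused = fuse_scores(scores, weights)  # 0.9*0.5 + 0.7*0.25 + 0.8*0.25 = 0.825
```

The point of fusing modalities is that a weak score in one channel (say, a partly obscured face) can be offset by the others – which is presumably why the call asks for combined features rather than face recognition alone.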

See also Nick Hopkins’ Guardian article ‘Updated drones to pinpoint targets sought by MoD’.

Debating drone use as pursuit of autonomous drones continues

Earlier this month the Foreign Policy magazine website ran an article called ‘Don’t Fear the Reaper’, arguing that those opposing drones are “misleading the public” and “distracting attention away from some more important and bigger issues.” Strong accusations indeed. While far from being the first article to argue that drones are no worse than other weaponry (or are even a good thing), this was a seemingly well-argued piece from two academics working in international relations, and as such it deserved something of a response.

Luckily Brian Terrell of Voices for Creative Nonviolence has taken up the challenge and responded with “Four Realities About Drones: War of the Killer Robots.” As a part-time student myself, I appreciate the academic temptation to disregard the messy reality of real life and simply focus on the underlying ethical issues. However, drones are the concrete embodiment of those ethical issues – and are being used each day to tear flesh and families apart. As such it is our human duty to engage with both the ethical issues and the day-to-day reality. Academics castigating and sneering at those who engage in public education and action on the issue is not helpful.

Meanwhile… the ‘insatiable’ demand for drones will inevitably lead to greater autonomy, according to a senior US military official this week. The Air Force Times reported Colonel James Gear, director of the Air Force’s unmanned aircraft task force, as saying that ‘unmanned aircraft should be almost entirely automated so the humans can be productively engaged in tasks the machines aren’t good at’. We have reported before that research work is being undertaken in anticipation of this, and a short report this week shows that researchers from the USAF at Wright-Patterson Air Force Base in Ohio are developing systems to allow a single human operator to oversee multiple UAVs at once – often seen as the next stage towards autonomy. The ‘never ending’ and ‘sky rocketing’ demand for drones is also pushing the need for more satellite bandwidth to cope with the communications and intelligence from UAVs. According to Boeing’s Jim Simpson, vice president of business development for the space and intelligence systems sector, “UAVs are standing down because there is not enough communications to utilize them.”


The push to autonomy is nicely satirised in this spoof presentation of ‘The Ethical Governor’, a fictional key component in autonomous drones, by animator John Butler.   You can read more about it here.

Drone Laws

An article by two senior MoD scientists in an obscure US military journal suggests that, while the UK military says it has no intention of developing a fully autonomous armed drone, background research work that would enable such a system continues.

Just a few weeks ago a report released by the UK Ministry of Defence (The UK Approach to Unmanned Aerial Systems) declared that while it was not currently (as it put it) developing an autonomous armed drone it was “looking to increase levels of automation where this will make systems more effective.”   

Using almost exactly the same wording, Tony Gillespie and Robin West, who both work at the Defence Science and Technology Laboratory, wrote in an article for The International C2 Journal that “The UK Ministry of Defence (MOD) has no intention to develop systems with no human intervention in the [Command and Control] chain, but there is the desire to raise the autonomy level of its Unmanned Aerial Systems.”                                    

The 20-page article goes on to outline four key underlying legal principles of the Laws of Armed Conflict (Necessity, Humanity, Distinction and Proportionality) before suggesting that “the problem… is to identify which of the authorized entities in the UAS Command and Control chain can become non-human and still meet the [legal] requirements.”  The authors make the argument that “humans are well adapted to make subjective, qualitative decisions whereas machines make good quantitative ones….” therefore “the next step in the systems engineering process requires an approach that turns qualitative criteria into quantitative ones.”       

To be scrupulously fair to the authors, they make clear that a fully autonomous weaponized system may never be acceptable, and they do not suggest that such a weapon should be built. However, the background work on “raising the level of autonomy” of drones continues in military institutions around the world (see for example this 2008 study into military robotics for the US Department of the Navy) and in corporate research laboratories (see for example details of BAE Systems’ autonomous programme here).

The legality of putting an autonomous armed drone system into service would, to say the least, be seriously questioned. However, it is not just future drones that are raising legal questions, as we have reported many times. The various articles here by Chris Rogers provide a good introduction and overview of the legal issues.

We are pleased therefore to see a number of initiatives beginning to take shape that may well lead to concerted legal challenges – and eventually perhaps stronger international laws – on the use of armed unmanned systems.    These include the work of human rights group Reprieve working with local lawyers in Pakistan on potential legal action and several international conferences on the issue, including one jointly organised by the International Institute of Humanitarian Law and the International Committee of the Red Cross.

Recent protest vigil at UAV Engines, Shenstone

Of course it is not just in the courts or the conference room that pressure is needed to ensure that illegal drone strikes cease. Public action – such as the recent vigil at UAV Engines in Shenstone (photo) and the protest at Hancock Air Force Base in New York State, from where drones are controlled, in which almost 40 people were arrested – is also very much needed to bring about an end to current and future drone wars.

New MoD document recognises legal, ethical and moral issues raised by use of armed drones

The UK Approach to Unmanned Aircraft Systems (click image to open PDF)

The UK Ministry of Defence has published a new document to “inform and prompt wider debate” on military unmanned aerial vehicles (UAVs), commonly known as drones. The UK Approach to Unmanned Aircraft Systems is a Joint Doctrine Note (JDN) that examines technological and scientific issues related to current and future use of armed and unarmed drones. It also sets out, for the first time, what the MoD sees as the legal, moral and ethical issues that arise from using such systems.

Arguing that unmanned aircraft now hold a central role in modern warfare, it states “there is a real possibility that, after many false starts and broken promises, a technological tipping point is approaching that may well deliver a genuine revolution in military affairs.”

The publication of this report is very much to be welcomed, in particular its recognition of the serious moral, ethical and legal issues at stake with the growing use of unmanned drones and autonomous systems. At just over 100 pages the document covers a lot of ground, but in this initial review I want to focus on three particular issues.

Framing the Debate: On not calling a spade a spade

As has been the case for some time, use of the term ‘drone’ is an absolute no-no within the military. While ‘unmanned aircraft’ or ‘unmanned aerial system’ is seen as acceptable, the term ‘remotely piloted aircraft’ is suggested as appropriate, the document says, “when talking to the media.” While this may well be in part to avoid confusion, it is also to counter one of the key weaknesses for the future development of UAVs identified by the document: the “public perception issue”. By avoiding the term ‘drone’ it is perhaps hoped that negative perceptions of the ‘killer drones’ variety can simply be avoided.

The document also argues strongly against the idea that any drones currently under development could or should be called ‘autonomous’, suggesting instead that they are in fact merely ‘automated’. “Those [drones] that carry out their entire mission from take-off to landing without human intervention may be said to be fully automated”, it argues. Taking what could be called a maximalist approach to the issue of autonomy, the document argues that machines or systems can only truly be called autonomous when they are self-aware or their understanding is indistinguishable from that of humans:

“Autonomous systems will, in effect, be self-aware and their response to inputs indistinguishable from, or even superior to, that of a manned aircraft. As such, they must be capable of achieving the same level of situational understanding as a human” says the document.  

This is a substantially different definition of ‘autonomy’ from the one used by many scientists and companies involved in developing autonomous systems, as the document itself recognizes: “Companies may describe their systems to be autonomous even though they would not be considered as such under the military definition.”

I imagine the reason for taking this position is again partly one of public perception. However, there are other key reasons for not wanting to label drones as autonomous, as the document clearly recognizes: “The distinction between autonomous and automated is important as there are moral, ethical and legal implications regarding the use of autonomous unmanned aircraft.”

While this new document is an important step forward by the MoD in acknowledging that there are legal, ethical and moral issues associated with the growing use of drones, at the same time the document wants to frame the debate and keep it on its own terms.  

Humans:  In, on, or out of the loop?

Legally, humans are required to make the final decision on the firing of weapons from drones. This is known as humans being ‘in the loop’. However, we know that industry is developing systems that will move humans from being ‘in the loop’ to being ‘on the loop’ – that is, monitoring several armed drones at the same time. The new document notes this change and acknowledges that the growing development of autonomous (sorry, automated) drones means the legal requirement is “being eroded”.

At one point the document states clearly that the MoD has no plans to enable drones to make decisions independently about firing their weapons:

“It should be noted that the MOD currently has no intention to develop systems that operate without human intervention in the weapon command and control chain, but it is looking to increase levels of automation where this will make systems more effective.”

But the issue of drones deciding for themselves whether to launch weapons is not completely ruled out, as this key passage shows:

“A human-authorised [drone] attack would be no different to that by a manned aircraft and would be fully compliant with the LOAC [Laws of Armed Conflict], provided the human believed that, based on the information available, the attack met LOAC requirements and extant ROE [Rules of Engagement].  From this position, it would be only a small technical step to enable an unmanned aircraft to fire a weapon based solely on its own sensors, or shared information, and without recourse to higher, human authority. Provided it could be shown that the controlling system appropriately assessed the LOAC principles (military necessity; humanity; distinction and proportionality) and that ROE were satisfied, this would be entirely legal.

 In practice, such operations would present a considerable technological challenge and the software testing and certification for such a system would be extremely expensive as well as time consuming. Meeting the requirement for proportionality and distinction would be particularly problematic, as both of these areas are likely to contain elements of ambiguity requiring sophisticated judgement. Such problems are particularly difficult for a machine to solve and would likely require some form of artificial intelligence to be successful. Estimates of when artificial intelligence will be achieved (as opposed to complex and clever automated systems) vary, but the consensus seems to lie between more than 5 years and less than 15 years, with some outliers far later than this…  Until such a capability is achieved it is likely that, apart from some niche tasks, human intervention will continue to be required at key stages of an unmanned aircraft’s mission if it involves weapon-delivery.”

There are very serious legal, not to mention ethical and moral, issues raised by the prospect of unmanned systems deciding for themselves whether to launch their weapons. The MoD’s assurance that it is not currently, as it puts it, developing these systems, while at the same time blurring the distinction between ‘autonomous’ and ‘automated’, is unhelpful. This, together with the fact that exploration of the “technological challenge” of achieving such a capability appears to be continuing, is extremely worrying. It would be helpful if the MoD simply, clearly and unambiguously ruled out the idea of humans being ‘out of the loop’ when it comes to launching weapons.

Will Remote War Mean More War?

We have argued for some time that the geographical and psychological distance between the drone operator launching weapons and the point of attack may, in practice, lower the threshold for launching weapons. In addition, the fact that remote war is undertaken at no risk to one’s own forces may mean there is a greater temptation to undertake armed attacks and assassinations. The authors of the document raise this issue too, in a section on ethical and moral issues:

“One of the contributory factors in controlling and limiting aggressive policy is the risk to one’s own forces. It is essential that, before unmanned systems become ubiquitous (if it is not already too late) that we consider this issue and ensure that, by removing some of the horror, or at least keeping it at a distance, that we do not risk losing our controlling humanity and make war more likely.”

However, the document also argues that this negative must be “tempered” by the fact that “the use of unmanned aircraft prevents the potential loss of aircrew lives and is thus in itself morally justified.”

The authors argue that “what is needed is a clear understanding of the issues involved so that informed decisions can be made.” We would of course support this point, and would argue that in a democratic society these important decisions should not be a matter for military or legal experts alone; there needs to be a genuine public debate.

Further comments

I will be reflecting further on this fascinating insight into the MoD’s thinking on drones over the next few weeks. I’d be really interested in your comments too!

Drone Wars 2010: Proliferation, Pushing Autonomy and Prangs

Northrop Grumman's X-47B Drone: first flight due before end of year

As the year draws to a close against a background of increasing drone strikes in Afghanistan and Pakistan – between 50 and 60 people were killed in a number of separate drone strikes in Pakistan’s Khyber region this week – the development of drone wars continues right around the world. Three of the key themes that have emerged on this blog in the past six months – proliferation, the push to increased autonomy and drone crashes – are illustrated by developments this week.

Proliferation: As was clear from the Wikileaks cables, every dictator and military leader has the latest drones on their Christmas wish list, and many companies are happy to oblige. This week we learned that Israel is bidding to sell various drones to Chile and India, while Peru has acquired micro drones from Israeli company Innocon.

The push to greater autonomy in drones has also been a regular theme this year, and the year ends with Northrop Grumman announcing that its X-47B combat drone is about to make its first flight. The X-47B is designed to fly from an aircraft carrier and, unlike current drones in service, will fly mostly autonomously once aloft; indeed it is planned that it will be able to refuel in-flight autonomously.

Prangs! Finally, the Drone Crash database illustrates how often drones, for all their supposed ‘smart technology’, sometimes simply fall out of the sky. Two more examples have been added to the database. Two Australian drone crashes in Afghanistan have recently been revealed, while a Mexican drone crashed in Texas last week. The drone, an Israeli-made Orbiter mini UAV, crossed the border into the US and crashed into the backyard of an El Paso resident.

And finally… Channel Four has a very good report on 2010 – The Year of the Drone.

SDSR, Drones and Autonomy

“There is extra money for unmanned aerial vehicles, and I think that anyone who has been to Afghanistan and seen the incredible work that is being done there knows that is a capability in which we should be investing” 

David Cameron’s statement on the Strategic Defence Review, 19th October 2010  (Hansard Column 817)

 

Prime Minister David Cameron’s vision of a “growing fleet” of drones, together with a commitment to extra money for drones in his statement on the Strategic Defence and Security Review (SDSR) this week will have delighted the drone industry.  While there is little detail at this stage, the financial commitment together with recent noises about greater cooperation on military projects with France will have boosted the idea of a new joint Anglo-French drone.  

As we reported in June, the MoD has confirmed that a study into the possibility of a joint Anglo-French drone is underway (MoD confirms joint UK/France study into future drone). This week French executives met with General Atomics after French Defence Minister Hervé Morin told a government committee that his favoured way forward was to purchase Reaper drones in the short term and to “build a European system” in the medium term.

A British-French military summit has been announced for November  and no doubt an announcement will be made then.

Meanwhile the push towards greater autonomy for drones continues. At this week’s C4ISR conference in Washington, US Air Force Colonel JR Gear, director of the USAF Remotely Piloted Aircraft Task Force, urged us to embrace drone autonomy. According to Henry Kenyon, writing for Defense Systems, Gear said:

Multi-aircraft control technology allows a pilot to manage several UAVs, while autonomous flight software can provide robot aircraft with the ability to carry out their missions with minimal supervision. The two capabilities could dramatically cut the number of personnel required to maintain an airborne presence in the region. Some 570 pilots are currently required to manage 50 UAV orbits. The new technology could cut this number to 150 pilots.

Kenyon’s excellent article goes on to look at the recent US military document, Technology Horizons, which examines key science and technology needs for the USAF over the next 20 years. On the issue of drones, it states:

 By 2030 technology will have reached the point that humans will be the weakest part of the system. Humans and machines will have to work more closely through new types of interfaces and by directly augmenting human performance. This could include drugs or implants to improve memory, alertness and cognition. The service is even considering the use of human brain waves or genetics to control and manage systems.