Although some continue to insist that armed drones are in effect no different from other military aircraft, there seems to be increasing acceptance that the technology may lower the threshold for the use of force. Stanley McChrystal, former commander of US and NATO forces in Afghanistan, for example, told a conference in London late last year that he believed the capabilities of drones could make them more palatable to military decision-makers and “lower the threshold” for lethal force, while a recently released MoD policy document, ‘Future Operating Environment 2035’, asserts that:
“increased use [of remote and automated systems] in combat and support functions will reduce the risk to military personnel and thereby potentially change the threshold for the use of force. Fewer casualties may lower political risk and any public reticence for a military response…”
An article by two senior MoD scientists in an obscure US military journal suggests that, while the UK military says it has no intention of developing a fully autonomous armed drone, background research work that would enable such a system continues.
Just a few weeks ago a report released by the UK Ministry of Defence, ‘The UK Approach to Unmanned Aircraft Systems’, declared that, while it was not currently (as it put it) developing an autonomous armed drone, it was “looking to increase levels of automation where this will make systems more effective.”
Using almost exactly the same wording, Tony Gillespie and Robin West, who both work at the Defence Science and Technology Laboratory, wrote in an article for The International C2 Journal that “The UK Ministry of Defence (MOD) has no intention to develop systems with no human intervention in the [Command and Control] chain, but there is the desire to raise the autonomy level of its Unmanned Aerial Systems.”
The 20-page article goes on to outline the four key underlying legal principles of the Laws of Armed Conflict (Necessity, Humanity, Distinction and Proportionality) before suggesting that “the problem… is to identify which of the authorized entities in the UAS Command and Control chain can become non-human and still meet the [legal] requirements.” The authors argue that “humans are well adapted to make subjective, qualitative decisions whereas machines make good quantitative ones…” and that therefore “the next step in the systems engineering process requires an approach that turns qualitative criteria into quantitative ones.”
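It is worth pausing on what ‘turning qualitative criteria into quantitative ones’ would actually mean in practice. Here is a purely illustrative Python sketch (every name, score and threshold is my own invention, not anything taken from the article) of what a quantified version of the LOAC principles might look like:

```python
# Purely illustrative: what 'quantified' LOAC criteria might look like once
# qualitative legal judgements are reduced to numbers, as the article
# proposes. Every name, score and threshold here is hypothetical.

from dataclasses import dataclass

@dataclass
class EngagementAssessment:
    target_confidence: float    # 0.0-1.0: confidence the target is a lawful military objective
    military_advantage: float   # 0.0-1.0: estimated military value of striking it
    collateral_estimate: float  # 0.0-1.0: estimated incidental harm to civilians

def meets_quantified_loac(a: EngagementAssessment) -> bool:
    """Toy numeric stand-in for the four LOAC principles."""
    distinction = a.target_confidence > 0.95                        # Distinction
    necessity = a.military_advantage > 0.5                          # Military necessity
    proportionality = a.collateral_estimate < a.military_advantage  # Proportionality
    # Humanity (avoiding superfluous injury) has no obvious numeric form
    # at all, which is the nub of the problem.
    return distinction and necessity and proportionality
```

Even in this toy form the difficulty is plain: the scores and thresholds have to come from somewhere, and the principle of humanity resists numeric encoding altogether, which is precisely why the authors concede that humans remain better at subjective, qualitative decisions.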
Of course it is not just in the courts or the conference room that pressure is needed to ensure that illegal drone strikes cease. Public actions such as the recent vigil at UAV Engines in Shenstone and the protest at Hancock Air Force Base in New York State, from where drones are controlled – in which almost 40 people were arrested – are also very much needed to bring about an end to current and future drone wars.
The UK Approach to Unmanned Aircraft Systems
The UK Ministry of Defence has published a new document, to “inform and prompt wider debate” on military unmanned aerial vehicles (UAVs), commonly known as drones. The UK Approach to Unmanned Aircraft Systems is a Joint Doctrine Note (JDN) that examines technological and scientific issues related to the current and future use of armed and unarmed drones. It also sets out, for the first time, what it sees as the legal, moral and ethical issues that arise from using such systems.
Arguing that unmanned aircraft now hold a central role in modern warfare, it states “there is a real possibility that, after many false starts and broken promises, a technological tipping point is approaching that may well deliver a genuine revolution in military affairs.”
The publication of this report is very much to be welcomed, in particular its recognition of the serious moral, ethical and legal issues at stake with the growing use of unmanned drones and autonomous systems. At just over 100 pages, the document covers a lot of ground, but in this initial review I want to focus on three particular issues.
Framing the Debate: On not calling a spade, a spade
As has been the case for some time, the term ‘drone’ is an absolute ‘no no’ within the military. While ‘unmanned aircraft’ or ‘unmanned aerial system’ is seen as acceptable, the term ‘remotely piloted aircraft’ is suggested as appropriate, the document says, “when talking to the media.” While it may well be true that this is in part to avoid confusion, it is also intended to counter one of the key weaknesses to the future development of UAVs identified by the document: the “public perception issue”. By avoiding the term ‘drone’ it is perhaps hoped that negative perceptions of the ‘killer drones’ variety can simply be avoided.
The document also argues strongly against the idea that any drones currently under development could or should be called ‘autonomous’, suggesting instead that they are in fact merely ‘automated’. “Those [drones] that carry out their entire mission from take-off to landing without human intervention may be said to be fully automated” it argues. Taking what could be called a maximalist approach to the issue of autonomy, the document argues that machines or systems can only truly be called autonomous when they are self-aware or their understanding is indistinguishable from that of humans:
“Autonomous systems will, in effect, be self-aware and their response to inputs indistinguishable from, or even superior to, that of a manned aircraft. As such, they must be capable of achieving the same level of situational understanding as a human” says the document.
This is a substantially different definition of ‘autonomy’ from the one used by many scientists and companies involved in developing autonomous systems, as the document itself recognizes: “Companies may describe their systems to be autonomous even though they would not be considered as such under the military definition.”
I imagine this position is taken, again, partly for reasons of public perception. However there are other key reasons for not wanting to label drones as autonomous, as the document clearly recognizes: “The distinction between autonomous and automated is important as there are moral, ethical and legal implications regarding the use of autonomous unmanned aircraft.”
While this new document is an important step forward by the MoD in acknowledging that there are legal, ethical and moral issues associated with the growing use of drones, at the same time it seeks to frame the debate and keep it on the MoD’s own terms.
Humans: In, on, or out of the loop?
Legally, humans are required to make the final decision with regard to the firing of weapons from drones. This is known as humans being ‘in the loop’. However, we know that industry is developing systems that will mean humans moving from being ‘in the loop’ to being ‘on the loop’, that is, monitoring several armed drones at the same time. The new document notes this change and acknowledges that the growing development of autonomous (sorry, automated) drones means that the legal requirement is “being eroded”.
At one point the document clearly states that the MoD has no plans to enable drones to make decisions independently about firing their weapons:
“It should be noted that the MOD currently has no intention to develop systems that operate without human intervention in the weapon command and control chain, but it is looking to increase levels of automation where this will make systems more effective”
But the issue of drones deciding themselves whether to launch weapons is not completely ruled out, as this key passage shows:
“A human-authorised [drone] attack would be no different to that by a manned aircraft and would be fully compliant with the LOAC [Laws of Armed Conflict], provided the human believed that, based on the information available, the attack met LOAC requirements and extant ROE [Rules of Engagement]. From this position, it would be only a small technical step to enable an unmanned aircraft to fire a weapon based solely on its own sensors, or shared information, and without recourse to higher, human authority. Provided it could be shown that the controlling system appropriately assessed the LOAC principles (military necessity; humanity; distinction and proportionality) and that ROE were satisfied, this would be entirely legal.
In practice, such operations would present a considerable technological challenge and the software testing and certification for such a system would be extremely expensive as well as time consuming. Meeting the requirement for proportionality and distinction would be particularly problematic, as both of these areas are likely to contain elements of ambiguity requiring sophisticated judgement. Such problems are particularly difficult for a machine to solve and would likely require some form of artificial intelligence to be successful. Estimates of when artificial intelligence will be achieved (as opposed to complex and clever automated systems) vary, but the consensus seems to lie between more than 5 years and less than 15 years, with some outliers far later than this… Until such a capability is achieved it is likely that, apart from some niche tasks, human intervention will continue to be required at key stages of an unmanned aircraft’s mission if it involves weapon-delivery.”
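The “small technical step” the document describes can be made concrete. The following hypothetical Python sketch (nothing in it comes from the JDN; it is simply my own illustration) shows ‘in the loop’, ‘on the loop’ and ‘out of the loop’ weapon release side by side. Moving from the first mode to the last is literally the deletion of one human condition from the code:

```python
# Hypothetical illustration of 'in the loop', 'on the loop' and 'out of the
# loop' weapon release. None of this is drawn from the JDN; the point is
# only how small the 'technical step' between the modes really is.

from enum import Enum, auto

class ControlMode(Enum):
    HUMAN_IN_THE_LOOP = auto()      # a human authorises every individual weapon release
    HUMAN_ON_THE_LOOP = auto()      # a human supervises (perhaps several aircraft) and can veto
    HUMAN_OUT_OF_THE_LOOP = auto()  # the system fires on its own assessment alone

def release_authorised(mode: ControlMode,
                       machine_assessment_ok: bool,
                       human_approval: bool = False,
                       human_veto: bool = False) -> bool:
    if mode is ControlMode.HUMAN_IN_THE_LOOP:
        return machine_assessment_ok and human_approval
    if mode is ControlMode.HUMAN_ON_THE_LOOP:
        return machine_assessment_ok and not human_veto
    # The 'small technical step': drop the human condition entirely.
    return machine_assessment_ok
```

Seen this way, the safeguard of human control is a single conditional that policy, rather than technology, keeps in place; which is why an assurance of ‘no intention’ to remove it carries so much weight, and so little guarantee.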
There are very serious legal, not to mention ethical and moral, issues raised by the prospect of unmanned systems deciding themselves whether to launch their weapons. The MoD’s assurance that it is not currently (as it puts it) developing these systems, while at the same time blurring the distinction between ‘autonomous’ and ‘automated’, is unhelpful. This, together with the fact that exploration of the “technological challenge” of achieving such a capability appears to be continuing, is extremely worrying. It would be helpful if the MoD simply, clearly and unambiguously ruled out the idea of humans being ‘out of the loop’ when it comes to launching weapons.
Will Remote War Mean More War?
We have argued for some time that the geographical and psychological distance between the drone operator launching weapons and the point of attack may in practice lower the threshold for launching those weapons. In addition, the fact that remote war is undertaken at no risk to one’s own forces may mean that there is a greater temptation to undertake armed attacks and assassinations. The authors of the document raise this issue too in a section on ethical and moral issues:
“One of the contributory factors in controlling and limiting aggressive policy is the risk to one’s own forces. It is essential that, before unmanned systems become ubiquitous (if it is not already too late) that we consider this issue and ensure that, by removing some of the horror, or at least keeping it at a distance, that we do not risk losing our controlling humanity and make war more likely.”
However, the document also argues that this negative must be “tempered” by the fact that “the use of unmanned aircraft prevents the potential loss of aircrew lives and is thus in itself morally justified.”
The authors argue that “what is needed is a clear understanding of the issues involved so that informed decisions can be made.” We would, of course, support this point, and would argue that in a democratic society these important decisions should not be a matter for military or legal experts alone: there needs to be a genuine public debate.
Further comments
I will be reflecting further on this fascinating insight into the MoD’s thinking on drones over the next few weeks. I’d be really interested in your comments too!
The UK MoD has issued a pre-‘invitation to tender’ notice to the military industry for a new ‘nano’ drone to be used in Afghanistan. The document, revealed by Flight Global and the Guardian, calls for a drone of no more than 7oz which is capable of flying for between 20 and 40 minutes at a range of 1km. The contract, estimated to be worth between £10m and £20m, states that the drones should be available “off the shelf”, powered by a rotary wing, weigh less than 1.7kg, and be able to operate in “typical conditions found in Afghanistan and the UK”.
Am I being a Luddite to be concerned about so much research energy, talent and resources being focused on drones? The military and the military industry will always, of course, talk about the benefits to civil society of military research and how ‘spin-offs’ create civilian jobs. Witness the Pentagon’s recent promotion of the idea that its drones can prevent genocide; and no doubt drones can be useful in all sorts of civil applications, not least relief and rescue work. Yet the focus of drone research remains on enabling unmanned systems to be increasingly lethal and to support military operations.
Those involved in research on drones and unmanned systems need to consider how their work could be used in practice. We highly recommend Scientists for Global Responsibility and in particular their recent report Behind Closed Doors.
BBC Urdu has published new research into the undeclared war in Pakistan. They report that since January 2009 nearly 2,500 people have been killed in Pakistan as a result of US drone strikes and Islamic militant attacks. They attribute 746 deaths (30%) to US drone strikes and 1,713 deaths (70%) to Islamic militant attacks.
What will hit the headlines, though, is that in response a Taleban spokesman, Muhammed Umer, has said that “In the short term, yes, you can say it [drone strikes] has caused us some difficulties because of the martyrdoms and realignment of our ranks.” The Guardian have already reported the story as ‘Taliban says US drone attacks ‘temporarily’ hindering insurgency’. What Muhammed Umer goes on to say – and what probably won’t get so much coverage – is that the drone strikes are also bringing in new volunteers and recruits.
An unnamed ‘senior US official’ is also quoted in the report as saying that since Obama took office, 650 militants and 20 non-combatants have been killed by drone strikes. Tactfully, the BBC say:
“Research by the BBC’s Urdu service puts the number of those killed considerably higher, and says there have been many cases where there has been no positive identification of those killed at all”.
While the CIA, as usual, refused to comment on its drone strikes, the same unnamed ‘senior US official’ said that drones are “the most precise weapons system in the history of warfare.” No doubt as precise as his casualty figures.