Tribunal upholds MoD refusal to disclose details of UK Reaper drone missions outside of Op Shader

Fifteen months after hearing our appeal, an Information Tribunal handed down its decision this week rejecting our arguments that basic details about the deployment of armed Reaper drones outside of Operation Shader (Iraq/Syria) by the UK needed to be released to enable public and parliamentary oversight over such deployments.

Both Clive Lewis MP and Baroness Vivienne Stern, Vice-Chair of the All Party Parliamentary Group (APPG) on Drones and Modern Conflict, had submitted statements to the Tribunal supporting our appeal.  Clive Lewis argued that the refusal to answer these questions about the deployment of Reaper is “a serious backward step in terms of transparency and accountability.”  Baroness Stern stated:

“Despite repeated attempts by myself and colleagues to attain even the most basic information about the UK’s drone deployments, policy, and commitments, Parliament has not been provided with the accurate and timely information needed to meaningfully carry out its constitutional scrutiny role. Whilst certain details must be kept secret in order to ensure operational and national security, the current trend of withholding information about the use of drones purely because it is seen as an “intelligence” asset, as well as withholding vital information on the UK’s growing military capabilities and commitments is deeply concerning and unjustified.”

While insisting that it was neither confirming nor denying the deployment, the MoD argued against the release of the information on three broad grounds. As the Decision Notice states:

“the MOD’s key concern about the release of the requested information was that it could lead an adversary to infer the absence or presence of UK personnel. In his [The MoD’s witness’] opinion were the locations to be released or inferred from a combination of requested data and already published material (the “mosaic effect”), there would be an elevated risk to any potential personnel in that location and an increased risk of hostile acts against them.”

A second concern was that

“there would be an increased risk to any nation hosting the Reaper operations as an adversary may target a hostile act at the host nation rather than the UK which may be a more difficult target. Thereby undermining the UK’s relationship with that nation and undermining military operations conducted from that location.”

Finally, and most concerning from a scrutiny and oversight point of view, the MoD argued (again quoting the Decision Notice):

“The effectiveness of operations conducted using Reaper outside Operation Shader in future depend, in part, on a greater degree of ambiguity as to the employment of Reaper in order to be successful. It is important to retain a degree of ambiguity regarding the full extent of Reaper operations now in order to maintain this flexibility in the future.”

Drone Wars argued strongly that the information requested – a single figure for the number of sorties undertaken outside of Operation Shader and their broad geographic location (i.e. ‘the Middle East’) – was not capable of causing the prejudice alleged.  We also pointed out to the Tribunal that the MoD has previously released the number of sorties undertaken outside of Operation Shader (in response to our questions about the targeted killing of Naweed Hussain in 2018) without any of the prejudice or harm suggested, but that point seems to have been ignored by the Tribunal.

Cyborg Dawn?  Human-machine fusion and the future of warfighting

Soldiers who see in the dark, communicate telepathically, or fly a drone by thought alone all sound like characters from a science fiction film.  Yet research projects investigating all these possibilities are under way in laboratories and research centres around the globe as part of an upsurge of interest in the possibilities of human enhancement, enabled largely by expanding knowledge in the field of neuroscience: the study of the human brain and nervous system.

To help readers understand the possibilities and hazards posed by human enhancement technology, Drone Wars UK is publishing ‘Cyborg Dawn?’, a new study investigating the military use of human augmentation.

Human enhancement – a medical or biological intervention to the body designed to improve performance, appearance, or capability beyond what is necessary to achieve, sustain or restore health – may lead to fundamentally new concepts of warfare and can be expected to play a role in enabling the increased use of remotely operated and uncrewed systems in war.

Although military planners are eager to create ‘super soldiers’, the idea of artificially modifying humans to give them capabilities beyond their natural abilities presents significant moral, legal, and health risks.  The field of human augmentation is fraught with danger, and without stringent regulation, neurotechnologies and genetic modification will lead us to an increasingly dangerous future where technology encourages and accelerates warfare.  The difficulties are compounded by the dual-use nature of human augmentation, where applications with legitimate medical uses could equally be used to further the use of remote lethal military force.  There is currently considerable discussion about the dangers of ‘killer robot’ autonomous weapon systems, but it is also time to start discussing how to control the human enhancement and cyborg technologies which military planners intend to develop.

The UK and the Ukraine War: Drones vs Diplomacy

Custom-built British ‘suicide-drone’ reportedly bound for Ukraine.  Pic: QinetiQ

The UK is to supply Ukraine with “hundreds of new long-range attack drones”, a government spokesperson told the media on Monday, as Prime Minister Rishi Sunak welcomed President Volodymyr Zelenskiy to Britain for a brief visit.

“Today the prime minister will confirm the further UK provision of hundreds of air defence missiles and further unmanned aerial systems including hundreds of new long-range attack drones with a range of over 200km. These will all be delivered over the coming months as Ukraine prepares to intensify its resistance to the ongoing Russian invasion.”

It is not at all clear what these ‘long-range attack drones’ are, although there have been some reports of the UK funding the development of a ‘suicide-drone’ to supply to Ukraine.

This latest news comes on top of the announcement in the last few weeks that the UK is supplying Storm Shadow cruise missiles to Ukraine following the export of UK Challenger 2 tanks.

Some will no doubt welcome the supply of attack drones and cruise missiles to Ukraine as a counter to Russia’s military aggression. It goes without saying that Russia’s invasion of Ukraine and continuing use of lethal force is unlawful and must be resisted.  However, there are real questions to be asked now about how a strategy of supplying ever more lethal military hardware risks expanding rather than ending this war. It is becoming increasingly easy to see the UK and other NATO countries being drawn more directly into an armed conflict with Russia.  Any such escalation would be disastrous for the people of Ukraine and the wider region, as well as seriously risking a catastrophic nuclear event.

Rather than escalating the conflict by supplying ever more lethal arms, the UK should be urging negotiations to end the war, as it is inevitable that these will have to happen at some point.  While some western military analysts urge that the war should be prolonged in order to weaken Russia in the long term, it is Ukraine and its people who suffer.

Negotiations are of course a matter for the Ukrainian people, but it should be remembered that a settlement was seemingly very close last March, with a Turkish-backed plan for Russian forces to withdraw to their pre-24 February positions without Ukraine giving up its claim to any of its territory.  Unfortunately the moment passed, with suggestions that the then British PM Boris Johnson personally lobbied Zelenskiy to reject the plan (for more on this see Ukraine One Year On: Time to Negotiate Peace).

While it is easy for the current PM to grab a few headlines and play to the crowd by supplying lethal attack drones to Ukraine, the harder but more rewarding long-term work of diplomacy to end this awful war is being neglected.

MoD’s AI ethics panel expert tells Lords Committee: ‘More should be done’

L-R: Alexander Blanchard, Digital Ethics Research Fellow, Alan Turing Institute; Mariarosaria Taddeo, Associate Professor, Oxford Internet Institute; Verity Coyle, Senior Campaigner/Advisor, Amnesty UK

Almost a year ago the Ministry of Defence (MoD) launched its Defence Artificial Intelligence Strategy to explain how it would adopt and exploit artificial intelligence (AI) “at pace and scale”.  Among other things, the strategy set out the aspiration for MoD to be “trusted – by the public, our partners and our people, for the safety and reliability of our AI systems, and our clear commitment to lawful and ethical AI use in line with our core values”.

An accompanying policy document, titled ‘Ambitious, Safe, Responsible’, explained how MoD intended to win trust for its AI systems.  The document put forward five Ethical Principles for AI in Defence, and announced that MoD had convened an AI Ethics Advisory Panel: a group of experts from academia, industry, civil society and from within MoD itself to advise on the development of policy on the safe and responsible development and use of AI.

The AI Ethics Advisory Panel and its role were among the topics of interest to the House of Lords Select Committee on AI in Weapon Systems when it met for the fourth time recently to take evidence on the ethical and human rights issues posed by the development of autonomous weapons and their use in warfare.  Witnesses giving evidence at the session were Verity Coyle from Amnesty International, Professor Mariarosaria Taddeo from the Oxford Internet Institute, and Dr Alexander Blanchard from the Alan Turing Institute.  As Professor Taddeo is a member of the MoD’s AI Ethics Advisory Panel, former Defence Secretary Lord Browne took the opportunity to ask her to share her experiences of the panel.

Lord Browne:

“It is the membership of the panel that really interests me. This is a hybrid panel. It has a number of people whose interests are very obvious; it has academics, where the interests are not nearly as clearly obvious, if they have them; and it has some people in industry, who may well have interests.

What are the qualifications to be a member and what is the process you went through to become a member? At any time were you asked about interests? For example, are there academics on this panel who have been funded by the Ministry of Defence or government to do research? That would be of interest to people. Where is the transparency? This panel has met three times by June 2022. I have no idea how often it has met, because I cannot find anything about what was said at it or who said it. I am less interested in who said it, but it would appear there is no transparency at all about what ethical advice was actually shared.

As an ethicist, are you comfortable about being in a panel of this nature, which is such an important element of the judgment we will have to take as to the tolerance of our society, in light of our values, for the deployment of these weapons systems? Should it be done in this hybrid, complex way, without any transparency as to who is giving the advice, what the advice is and what effect it has had on what comes out in this policy document?”

Lord Browne’s questions neatly capture some of the concerns which Drone Wars shares about the MoD’s approach to AI ethics.  Professor Taddeo set out the benefits of the panel as she saw them in her reply, but clearly shared many of Lord Browne’s concerns.  “These are very good questions, which the MoD should address”, she answered.  She agreed that “there can be improvement in terms of transparency of the processes, notes and records”, and said that “this is mentioned whenever we meet”.  She also raised questions about the effectiveness of the panel, telling the Lords that: “This discussion is one hour and a half, and there are a lot of experts in the room who are all prepared, but we did not even scratch the surface of many issues that we have to address”.  The panel is an advisory panel, and “so far, all we have done is to be provided with a draft of, for example, the principles or the document and to give feedback”.

If the only role the MoD’s AI Ethics Advisory Panel has played was to advise on principles for inclusion in the Defence Artificial Intelligence Strategy, then an obvious question is what is needed instead to ensure that MoD develops and uses AI in a safe and responsible way.  Professor Taddeo felt that the current panel “is a good effort in the right direction”, but “would hope it is not deemed sufficient to ensure ethical behaviour of defence organisations; more should be done”.

The arms race towards autonomous weapons – industry acknowledge concerns

(L to R) Courtney Bowman, Palantir Technologies UK; Dr Kenneth Payne, Professor of Strategy, King’s College London; James Black, Assistant Director of the Defence and Security Research Group, RAND Europe; Keith Dear, Director of Artificial Intelligence Innovation, Fujitsu

The third evidence session for the House of Lords Select Committee on Artificial Intelligence (AI) in weapon systems heard views on the development and impact of autonomous weapons from the perspective of the military technology sector.

Witnesses giving evidence at the session were former RAF officer and Ministry of Defence (MoD) advisor Dr Keith Dear, now at Fujitsu Defence and Security; James Black of RAND Europe; Kenneth Payne of King’s College London and the MoD’s Defence Academy at Shrivenham; and Courtney Bowman of US tech company Palantir Technologies.  Palantir specialises in the development of AI technologies for surveillance and military purposes and has been described as a “pro-military arm of Silicon Valley”.  The company boasts that its software is “responsible for most of the targeting in Ukraine”, supporting the Ukrainian military in identifying tanks, artillery, and other targets in the war against Russia, and its Chief Technology Officer recently told the US Senate’s Armed Services Committee that: “If we want to effectively deter those that threaten US interests, we must spend at least 5% of our budget on capabilities that will terrify our adversaries”.

Not surprisingly, the witnesses tended to take a pro-industry view towards the development of AI and autonomous weapon systems, arguing that incentives, not regulation, were required to encourage technology companies to engage with concerns over ethics and impacts, and taking the fatalistic view that there is no way of stopping the AI juggernaut.  Nevertheless, towards the end of the session an interesting discussion on the hazards of arms racing took place, with the witnesses suggesting some positive steps which could help to reduce such a risk.

Arms racing and the undermining of global peace and security become a risk when qualitatively new technologies promising clear military advantages seem close at hand.  China, Russia, and the United States of America are already investing heavily in robotic and artificial intelligence technologies with the aim of exploiting their military potential.  Secrecy over military technology, and uncertainty and suspicion over the capabilities that a rival may have, further accelerate arms races.

Competition between these rivals to gain an advantage over each other in autonomous technology and its military capabilities already meets the definition of an arms race – ‘the participation of two or more nation-states in apparently competitive or interactive increases in quantity or quality of war material and/or persons under arms’ – and has the potential to escalate.  This competition has no absolute end goal: merely the relative goal of staying ahead of other competitors. Should one of these states, or another technologically advanced state, develop and deploy autonomous weapon systems in the field, it is very likely that others would follow suit. The ensuing race can be expected to be highly destabilising and dangerous.

UK deployed Reaper drone to Sudan

Image of Port Sudan from UK Reaper video feed

The UK deployed an MQ-9 Reaper drone to Sudan as part of the military operation to support the evacuation of UK personnel, the RAF has reported.

This is the first public acknowledgment of the deployment of UK Reaper drones outside of operations in Afghanistan, Iraq and Syria.  Drone Wars UK has been fighting a long FoI battle with the Ministry of Defence (MoD) about a possible further unacknowledged deployment. The drone provided intelligence gathering and surveillance cover for UK military operations during the evacuation, but it is not known if it was armed during the operation.

In a brief 3 May 2023 press release, the Royal Air Force (RAF) stated:

“The RAF’s Reaper, an uncrewed aircraft that is designed for Intelligence, Surveillance, Target Acquisition and Reconnaissance (ISTAR) missions has been supporting by providing up to date imagery of the port, airfield and ground environment. This includes highlighting unsafe and potentially dangerous areas to troops on the ground. It has also allowed for identification of buildings which would be suitable for temporary shelter, medical facilities, or locations to process passengers.

The Reaper was operated by XIII Squadron based at RAF Waddington. The Squadron Executive Officer said: “For XIII Squadron to operate the RAF Reaper over two separate continents on two different missions, having eyes on the ground in Africa and the Middle East simultaneously shows the flexibility of the aircraft and our people, a remarkable effort from all the Squadron”.”

The MoD reported that the military operation to safely evacuate UK personnel from Sudan had concluded on 3 May with the last flight departing from Port Sudan airport.  It reported that almost 2,500 personnel, including 1,200 non-UK nationals from 20 different countries, had left via flights from Wadi Saeedna airfield and Port Sudan airfield.  The UK Reaper drone had been tasked in particular with monitoring the UK’s processing station at the Cora Hotel, Port Sudan, where HMS Lancaster was based, as well as the two airfields from where evacuation flights took place.