Outdragon revealed: UK secretly using US signal intelligence pod on drone operations

US MQ-9 Reaper drone carrying surveillance pod flying over a Polish base.  Credit: The Aviationist

Drone Wars UK can reveal that British armed Reaper drones have secretly been equipped with a US intelligence gathering capability called ‘Outdragon’ since around 2019.

Signals intelligence (SIGINT) pods on US Reaper and Predator drones have been used to geolocate, track and kill individuals via signals from mobile phones, wireless routers and other communication devices, using a variety of systems developed by intelligence agencies with codenames such as Airhandler and Gilgamesh.

In response to our FoI requests on the capability, the Ministry of Defence is refusing to confirm or deny any information other than the existence of a 2019 contract to integrate it with UK Reaper drones.

The existence of Outdragon and its use by the UK was confirmed by the (possibly mistaken) publication online of a series of MoD maintenance forms relating to the UK’s new MQ-9 ‘Protector’ drone.

Image from: Flying Log and Fatigue Data Sheet – MOD Form 725(Protector RG-1)(AV)

Documents released by Edward Snowden show that UK AIRHANDLER missions are developed and controlled from the UK’s Joint Service Signals Unit (JSSU) at RAF Digby, which is the nearest military base to the home of UK drone warfare, RAF Waddington.  A 2017 Intercept article, based on documents from Snowden, showed that US and British intelligence officials worked “side by side” at the base using AIRHANDLER with UK Reaper drones to gather data and develop near real-time intelligence for military and intelligence operations.

Proceed with caution: Lords warn over development of military AI and killer robots


The use of artificial intelligence (AI) for the purposes of warfare through the development of AI-powered autonomous weapon systems – ‘killer robots’ – “is one of the most controversial uses of AI today”, according to a new report by an influential House of Lords Committee.

The committee, which spent ten months investigating the application of AI to weapon systems and probing the UK government’s plans to develop military AI systems, concluded that the risks from autonomous weapons are such that the government “must ensure that human control is consistently embedded at all stages of a system’s lifecycle, from design to deployment”.

Echoing concerns which Drone Wars UK has repeatedly raised, the Lords found that the stated aspiration of the Ministry of Defence (MoD) to be “ambitious, safe, responsible” in its use of AI “has not lived up to reality”, and that although MoD has claimed that transparency and challenge are central to its approach, “we have not found this yet to be the case”.

The cross-party House of Lords Committee on AI in Weapon Systems was set up in January 2023 at the suggestion of Liberal Democrat peer Lord Clement-Jones, and started taking evidence in March.  The committee heard oral evidence from 35 witnesses and received nearly 70 written evidence submissions, including evidence from Drone Wars UK.

The committee’s report is entitled ‘Proceed with Caution: Artificial Intelligence in Weapon Systems’, and ‘proceed with caution’ is a fair summary of its recommendations.  The panel was drawn entirely from the core of the UK’s political and military establishment, and at times some members appeared to have difficulty grasping the technical concepts underpinning autonomous weapons.  Under the circumstances the committee was never remotely likely to recommend that the government refrain from developing new weapon systems based on advanced technology, and in many respects its report provides a road map setting out the committee’s views on how the MoD should go about integrating AI into weapon systems and building public support for doing so.

Nevertheless, the committee has taken a sceptical view of the advantages claimed for autonomous weapon systems; has recognised the very real risks that they pose; and has proposed safeguards to mitigate the worst of those risks, alongside a robust call for the government to “lead by example in international engagement on regulation of AWS [autonomous weapon systems]”.  Despite hearing from witnesses who argued that autonomous weapons “could be faster, more accurate and more resilient than existing weapon systems, could limit the casualties of war, and could protect ‘our people from harm by automating “dirty and dangerous” tasks’”, the committee was apparently unconvinced, concluding that “although a balance sheet of benefits and risks can be drawn, determining the net effect of AWS is difficult” – and that “this was acknowledged by the Ministry of Defence”.

Perhaps the most important recommendation in the committee’s report relates to human control over autonomous weapons.  The committee found that:

The Government should ensure human control at all stages of an AWS’s lifecycle. Much of the concern about AWS is focused on systems in which the autonomy is enabled by AI technologies, with an AI system undertaking analysis on information obtained from sensors. But it is essential to have human control over the deployment of the system both to ensure human moral agency and legal compliance. This must be buttressed by our absolute national commitment to the requirements of international humanitarian law.


Cabinet Minister says UK flying surveillance missions over Gaza

*Important Update (4 December):  Despite stating on camera to two different news programmes that drones had been deployed, the Ministry of Defence has now told journalists that the Minister was incorrect – the UK has not deployed drones but has deployed other surveillance aircraft.   We have amended this blog post.

Late on Saturday 2 December, the Ministry of Defence (MoD) issued a short online statement saying that the UK “will conduct surveillance flights over the eastern Mediterranean, including operating in air space over Israel and Gaza.”

On Sunday 3 December, Health Secretary Victoria Atkins, appearing as government spokesperson on Sky News, was asked about the flights and said “The Ministry of Defence has announced that it has sent some unmanned and unarmed, surveillance drones into the region to help look for hostages.”  Subsequently appearing on the BBC’s Laura Kuenssberg show, Atkins repeated that “unarmed and unmanned drones” were being sent to the region to help look for hostages.

Reporting of the MoD’s original statement by BBC and others included a line which stated that aircraft undertaking the missions “will include Shadow R1s, which the Royal Air Force use for intelligence gathering” but this now appears to have been removed from the online statement. The MoD’s statement did not mention drones.

While the MoD says that “only information relating to hostage rescue will be passed to the relevant authorities”, it is likely that electronic, signals and video intelligence of Gaza gathered by the aircraft will end up in the hands of the Israeli Defence Force. If so, many would consider the UK a participant in this horrific conflict, which has killed thousands of innocent civilians and seen repeated violations of international law.

While some news organisations reported this as the first UK deployment of aircraft in the conflict, in fact as far back as 13 October the Prime Minister announced that UK surveillance aircraft were to be deployed “to support Israel.”

Since the beginning of 2023, the MoD has increased the level of secrecy surrounding the use of drones, refusing to provide details of UK Reaper operations and arguing that it needs “ambiguity” about such deployments.

This latest episode – where a Cabinet Minister states on camera that UK drones have been deployed yet the Ministry of Defence refuses to acknowledge the deployment – is another ridiculous example of the secrecy surrounding UK drones.  [Note – MoD said after this was published that the Minister was incorrect and that other aircraft – not drones – are being deployed].

Rumour and misinformation about these UK operations are now bound to be rife and could well turn out to be damaging.  While the government will argue that it is undertaking these operations to assist with hostage rescue, it is easy to see how UK aircraft undertaking surveillance operations over Gaza could get further drawn into ‘supporting Israel’ in this horrific conflict.

The reality is that rather than ambiguity and confusion, we need proper parliamentary and public oversight to ensure we do not get drawn further into this conflict.  Rather than deploying more UK military assets, we should be working flat out for a ceasefire.

Online meeting 29th November, 7pm: ‘Cyborg Dawn? The military use of human augmentation’

 

Soldiers who see in the dark, communicate telepathically, or fly a drone by thought alone all sound like characters from a science fiction film. Yet research projects investigating all these possibilities are underway in laboratories and research centres around the globe as part of an upsurge of interest in the possibilities of human enhancement enabled largely by expanding knowledge in the field of neuroscience: the study of the human brain and nervous system.

Drone Wars UK and Scientists for Global Responsibility (SGR) are holding this online event to mark the publication of ‘Cyborg Dawn?’, a new study investigating the military use of human augmentation, in order to increase understanding of the possibilities and hazards posed by human enhancement technology.

Speakers:

  • Peter Burt: Peter, a long-time researcher and campaigner on peace and human rights issues, authored the ‘Cyborg Dawn’ report. At Drone Wars UK he primarily works on issues relating to artificial intelligence and autonomy and their role in the future development of drones. Peter is also a Trustee of the Nuclear Education Trust.
  • Ben Taylor-Green: Ben was awarded his DPhil from the University of Oxford in early 2023. His doctoral thesis, ‘Empathic Predators: On the Affects and Optics of Brain-Computer Interface Unmanned Aerial Vehicle Research’, is a pioneering philosophical-anthropological inquiry concerning the dual-use problem in international brain-computer interface (BCI) research.
  • Helen Close (Chair): Helen, a member of the Drone Wars UK Steering Committee, is a Research Associate at the Omega Research Foundation, an NGO that researches the manufacture of, trade in, and use of conventional arms and law enforcement equipment. She has worked at Omega since 2009 on a number of issues, including researching the manufacture of specific weapons of concern. Helen is a trustee of the Trust for Research and Education on the Arms Trade.

 

To attend this online event, register here.


 

MoD AI projects list shows UK is developing technology that allows autonomous drones to kill

Omniscient graphic: ‘High Level Decision Making Module’ which integrates sensor information using deep probabilistic algorithms to detect, classify, and identify targets, threats, and their behaviours. Source: Roke

Artificial intelligence (AI) projects that could help to unleash new lethal weapons systems requiring little or no human control are being undertaken by the Ministry of Defence (MoD), according to information released to Drone Wars UK through a Freedom of Information Act request.

The development of lethal autonomous military systems – sometimes described as ‘killer robots’ – is deeply contentious and raises major ethical and human rights issues.  Last year the MoD published its Defence Artificial Intelligence Strategy setting out how it intends to adopt AI technology in its activities.

Drone Wars UK asked the MoD to provide it with the list of “over 200 AI-related R&D programmes” which the Strategy document stated the MoD was working on.  Details of these programmes were not given in the Strategy itself, and the MoD has evaded questions from parliamentarians who asked for more details of its AI activities.

Although the Defence Artificial Intelligence Strategy claimed that over 200 programmes are underway, only 73 are shown on the list provided to Drone Wars.  Release of the names of some projects was refused on defence, security and/or national security grounds.

However, MoD conceded that a list of “over 200” projects was never held when the strategy document was prepared in 2022, explaining that “our assessment of AI-related projects and programmes drew on a data collection exercise that was undertaken in 2019 that identified approximately 140 activities underway across the Front-Line Commands, Defence Science and Technology Laboratory (Dstl), Defence Equipment and Support (DE&S) and other organisations”.  The assessment that there were at least 200 programmes in total “was based on our understanding of the totality of activity underway across the department at the time”.

The list released includes programmes for all three armed forces, including a number of projects related to intelligence analysis systems and to drone swarms, as well as more mundane ‘back office’ projects.  It covers major multi-billion pound projects stretching over several decades, such as the Future Combat Air System (which includes the proposed new Tempest aircraft), new spy satellites, uncrewed submarines, and applications for using AI in everyday tasks such as predictive equipment maintenance, a repository of research reports, and a ‘virtual agent’ for administration.

However, the core of the list is a scheme to advance the development of AI-powered autonomous systems for use on the battlefield.  Many of these are based around the use of drones as a platform – usually aerial systems, but also maritime drones and autonomous ground vehicles.  A number of the projects on the list relate to the computerised identification of military targets by analysis of data from video feeds, satellite imagery, radar, and other sources.  Using artificial intelligence and machine learning for target identification is an important step towards the development of autonomous weapon systems – ‘killer robots’ – which are able to operate without human control.  Even when they are under nominal human control, computer-directed weapons pose a high risk of civilian casualties for a number of reasons, including the rapid speed at which they operate and the difficulty of understanding the often opaque ways in which they make decisions.

The government claims it “does not possess fully autonomous weapons and has no intention of developing them”. However, the UK has consistently declined to support proposals put forward at the United Nations to ban them.

Among the initiatives on the list are the following projects.  All of them are focused on developing technologies that have potential for use in autonomous weapon systems.

Tribunal upholds MoD refusal to disclose details of UK Reaper drone missions outside of Op Shader


Fifteen months after hearing our appeal, an Information Tribunal handed down its decision this week rejecting our arguments that basic details about the deployment of armed Reaper drones outside of Operation Shader (Iraq/Syria) by the UK needed to be released to enable public and parliamentary oversight over such deployments.

Both Clive Lewis MP and Baroness Vivienne Stern, Vice-Chair of the All Party Parliamentary Group (APPG) on Drones and Modern Conflict, had submitted statements to the Tribunal supporting our appeal.  Clive Lewis argued that the refusal to answer these questions about the deployment of Reaper is “a serious backward step in terms of transparency and accountability.”  Baroness Stern stated:

“Despite repeated attempts by myself and colleagues to attain even the most basic information about the UK’s drone deployments, policy, and commitments, Parliament has not been provided with the accurate and timely information needed to meaningfully carry out its constitutional scrutiny role. Whilst certain details must be kept secret in order to ensure operational and national security, the current trend of withholding information about the use of drones purely because it is seen as an “intelligence” asset, as well as withholding vital information on the UK’s growing military capabilities and commitments is deeply concerning and unjustified.”

While insisting that it was neither confirming nor denying the deployment, the MoD argued against the release of the information on three broad grounds.  As the Decision Notice states:

“the MOD’s key concern about the release of the requested information was that it could lead an adversary to infer the absence or presence of UK personnel. In his [The MoD’s witness’] opinion were the locations to be released or inferred from a combination of requested data and already published material (the “mosaic effect”), there would be an elevated risk to any potential personnel in that location and an increased risk of hostile acts against them.”

A second concern was

“there would be an increased risk to any nation hosting the Reaper operations as an adversary may target a hostile act at the host nation rather than the UK which may be a more difficult target. Thereby undermining the UK’s relationship with that nation and undermining military operations conducted from that location.”

Finally, and most concerning from a scrutiny and oversight point of view, the MoD argued (again quoting the Decision Notice):

“The effectiveness of operations conducted using Reaper outside Operation Shader in future depend, in part, on a greater degree of ambiguity as to the employment of Reaper in order to be successful. It is important to retain a degree of ambiguity regarding the full extent of Reaper operations now in order to maintain this flexibility in the future.”

Drone Wars argued strongly that the information requested – a single figure for the number of sorties undertaken outside of Operation Shader and their broad geographic location (i.e. ‘the Middle East’) – was not capable of causing the prejudice alleged.  We also pointed out to the Tribunal that the MoD has previously released the number of sorties undertaken outside of Operation Shader (in response to our questions about the targeted killing of Naweed Hussain in 2018) without any of the suggested prejudice or harm, but this seems to have been ignored by the Tribunal.