Artificial intelligence (AI) projects that could help to unleash new lethal weapons systems requiring little or no human control are being undertaken by the Ministry of Defence (MoD), according to information released to Drone Wars UK through a Freedom of Information Act request.
The development of lethal autonomous military systems – sometimes described as ‘killer robots’ – is deeply contentious and raises major ethical and human rights issues. Last year the MoD published its Defence Artificial Intelligence Strategy setting out how it intends to adopt AI technology in its activities.
Drone Wars UK asked the MoD to provide the list of “over 200 AI-related R&D programmes” which the Strategy document stated the MoD was working on. Details of these programmes were not given in the Strategy itself, and the MoD has evaded questions from Parliamentarians who asked for more details of its AI activities.
Although the Defence Artificial Intelligence Strategy claimed that over 200 programmes were underway, only 73 appear on the list provided to Drone Wars UK. The names of some projects were withheld on defence and/or national security grounds.
However, the MoD conceded that a list of “over 200” projects was never held when the strategy document was prepared in 2022, explaining that “our assessment of AI-related projects and programmes drew on a data collection exercise that was undertaken in 2019 that identified approximately 140 activities underway across the Front-Line Commands, Defence Science and Technology Laboratory (Dstl), Defence Equipment and Support (DE&S) and other organisations”. The assessment that there were at least 200 programmes in total “was based on our understanding of the totality of activity underway across the department at the time”.
The list released includes programmes for all three armed forces, including a number of projects related to intelligence analysis systems and to drone swarms, as well as more mundane ‘back office’ projects. It covers major multi-billion pound projects stretching over several decades, such as the Future Combat Air System (which includes the proposed new Tempest aircraft), new spy satellites, uncrewed submarines, and applications for using AI in everyday tasks such as predictive equipment maintenance, a repository of research reports, and a ‘virtual agent’ for administration.
However, at the core of the list are schemes to advance the development of AI-powered autonomous systems for use on the battlefield. Many of these are based around the use of drones as a platform – usually aerial systems, but also maritime drones and autonomous ground vehicles. A number of the projects on the list relate to the computerised identification of military targets by analysis of data from video feeds, satellite imagery, radar, and other sources. Using artificial intelligence / machine learning for target identification is an important step towards the development of autonomous weapon systems – ‘killer robots’ – which are able to operate without human control. Even when they are under nominal human control, computer-directed weapons pose a high risk of civilian casualties for a number of reasons, including the rapid speed at which they operate and the difficulty of understanding the often opaque ways in which they make decisions.
The government claims it “does not possess fully autonomous weapons and has no intention of developing them”. However, the UK has consistently declined to support proposals put forward at the United Nations to ban them.
Among the initiatives on the list are the following projects. All of them are focused on developing technologies that have potential for use in autonomous weapon systems.
Project Omniscient, originally undertaken on behalf of the MoD by Cubica Technology (now part of Roke, a subsidiary of Chemring Group), emerged from a 2017 Defence and Security Accelerator innovation competition with the theme ‘Revolutionise the human-information relationship for Defence’. The project aimed to develop systems which will allow drone surveillance operations to take place autonomously, with minimal – or even no – human oversight.
Under the initial phase of the project, Cubica developed and demonstrated an integrated, persistent, beyond visual line of sight (BVLOS) multi-task drone capability able to undertake autonomous surveillance activities using data fusion and sensor management. The system is based around the Ministry of Defence’s SAPIENT software (see below), which allows autonomous sensor systems to use artificial intelligence to make low-level decisions and detect targets on their own to minimise the workload of a human operator.
Omniscient supported the development of Intelligence, Surveillance, Target Acquisition and Reconnaissance (ISTAR) systems to use machine learning and automation to speed up the target detection process. The software is reportedly capable of fully autonomous target detection and multi-task planning. Although the system is intended to detect objects of interest and notify the drone operator should it detect anything that needs further investigation, project documentation also states that it is able to operate on autonomous drones (those without a human controller).
Roke is now promoting Omniscient as a ‘High Level Decision Making Module’ which fuses information “from multiple sensors in real-time to identify threats and is capable of controlling steerable sensors and teams of UxVs (UAVs and UGVs) at full Level 5 autonomy.” In plain language, that is autonomously controlling multiple air and ground drones.
Geollect is a Bristol-based subsidiary of Roke, part of the international Chemring group – a defence contractor specialising in sensors, communications, cyber security, and artificial intelligence. Geollect has been “mentored” by GCHQ and has a number of contracts with the Ministry of Defence and other UK security agencies.
Geollect undertakes geospatial analysis, using mapping and location technologies as well as data analysis to generate intelligence information. The company’s software can undertake pattern-of-life analysis to statistically describe ‘normal’ operations in an area, and corroborate this with social media analysis to identify unusual behaviours that may represent risks.
Building on this, Geollect’s ‘Infosight’ information environment assessment platform integrates data from multiple sources and presents it to the operator visually, to enable better understanding and to help decision-making. According to a report shared by the company it is being used by the Royal Navy for information warfare campaigns and in preparing and implementing psychological operations, or ‘psyops’. Geollect’s products are also used by the Royal Marines to process intelligence data.
Hydra (Robust Autonomy at Scale)
Hydra (Robust Autonomy at Scale) is a project led by the Defence Science and Technology Laboratory (Dstl) through the laboratory’s autonomy programme. The project has a number of partners, including contractors Blue Bear, Fraser Nash Consulting, IQHQ and Rowden Technologies, and international military partners from the US and Australia, including the US Air Force Research Laboratory and the Australian Defence Science and Technology Group.
The Hydra project has resulted in the development of the ‘UK/US artificial intelligence toolbox’, which is intended to detect and recognise enemy vehicles from surveillance video and other data collected from uncrewed ground vehicles (UGVs) and uncrewed aerial vehicles (UAVs – conventional drones). Data can be labelled by algorithms from the toolbox and the system can be rapidly trained and updated while the swarm is in flight to adapt to changing mission situations.
The software has been deployed in a number of field trials over the past year, both in the UK and the US, aimed at investigating the practicalities of how artificial intelligence could be used to support swarms of drones on the battlefield, away from centralised computer networks. A trial undertaken in May 2023 under the auspices of the trilateral AUKUS agreement involved units from the UK, US, and Australia in detecting, identifying and tracking military targets with a swarm comprised of drones from all three nations.
NEXUS is an AI-based cloud computing system for sharing information between different combat units, which is under development by the Royal Air Force’s Rapid Capabilities Office. Together with the RAVEN virtual communications node it is intended to provide forces with a ‘common operational picture’ by fusing data from multiple sources – satellites, reconnaissance aircraft, ships at sea or a vehicle or backpack on land – to provide real time intelligence.
Different platforms, networks, sensors and applications can request and feed in data so that applications and platforms alike can see who is doing what and where. The RCO conducted flight trials of NEXUS and RAVEN in 2021 on a Voyager aircraft and demonstrated the ability to connect the aircraft with a satellite communications feed to establish a common operating picture and act as an airborne communications node, whilst at the same time conducting air-to-air refuelling tasks.
Conventional surveillance systems such as cameras mounted on CCTV systems or drones simply collect data from their sensors and feed it to a human operator to assess. Monitoring and interpreting large volumes of data requires high communications bandwidth and places a high cognitive burden on the operators. The MoD’s SAPIENT (Sensing for Asset Protection with Integrated Electronic Networked Technology) software reduces these requirements by enabling individual sensors to use AI to detect anomalies and make operating decisions.
Higher-level objectives are managed by a module which controls the overall system and makes some decisions normally made by human operators. Only certain information is sent to human operators, reducing their task of monitoring sensor data. MoD has adopted SAPIENT software as a standard for counter-drone technology and the software was used in the Contested Urban Environment (CUE) military exercise held in Portsmouth in 2021.
Project SPOTTER was presented as a case study to highlight MoD’s innovation in the field of AI in the 2022 Defence Artificial Intelligence Strategy. SPOTTER uses AI to assist imagery analysts in detecting and identifying objects from satellite imagery. It is based around a machine-learning system which has been trained to identify objects of interest.
The system can identify a range of object types in various environments and is able to automatically monitor specified locations continuously in real time for changes in activity.
A bespoke software architecture and user interface, scalable and modular and matched to the human analyst’s requirements, were developed for SPOTTER. A related project, SQUINTER, is based around output from synthetic aperture radar satellite sensors instead of electro-optical sensors.
Chris Cole of Drone Wars UK said:
“It’s clear that the Ministry of Defence (MoD) is crossing a line here. The projects in this list represent the building blocks needed to produce killer robots in the near future. The information revealed in this list raises significant questions about the government’s stated commitment not to develop autonomous weapon systems.
The MoD’s own Artificial Intelligence Strategy accepts that transparency will be essential in gaining acceptance for AI and similar new technologies. It is therefore very disappointing that the list of AI schemes had to be prised out of MoD following an FOI battle, and not released proactively at the time the Strategy document was published. Even the list of projects that has been released falls far short of the full set.
The government has argued that it wishes to see artificial intelligence technologies used for ethical and responsible purposes, and it should therefore use the AI summit planned for later this year to help kickstart a major international initiative to ban killer robots.”
AI on land, at sea, and in the air – and in space
Some of the other projects included on the MoD’s list of AI development programmes are:
700X is a Royal Navy air squadron based at Culdrose in Cornwall which has been established to develop and test fixed wing and rotary wing drones and establish how they might be used in future Carrier Strike and Littoral Strike operations, supporting the role of commando forces.
Enhanced C2 Spearhead (EC2SPHD)
Enhanced Command and Control (C2) Spearhead (EC2SPHD) is an army project investigating how AI can be applied in land warfare to reduce the cognitive burden on commanders and speed up decision-making.
T27 / T31e Offboard UXVs
Development of uncrewed surface and sub-surface autonomous maritime drones which will be able to undertake a range of independent missions in support of larger host ships, in a similar manner to the ‘loyal wingman’ concept which would see autonomous drones paired with a crewed aircraft.
Project MINERVA (Space Game Changer)
Minerva is the name of a £127 million concept demonstrator project intended to pave the way towards development of ISTARI, a £1 billion network of spy satellites intended to allow global surveillance and intelligence gathering. Minerva will use AI to autonomously collect, process and share data from UK and allied satellites.
Urban Canyon Sixth Sense (UC6S)
Urban Canyon Sixth Sense is a £2.3 million project which aims to investigate how AI can assist military vehicle crew with information management and decision making in an urban environment.
The XL UUV (extra large uncrewed underwater vehicle) project, also known as Project Cetus, will deliver a large autonomous submarine to the Royal Navy. It will be able to work side-by-side with crewed submarines or on independent missions and dive to greater depths than crewed submarines.