Proceed in Harmony: The Government replies to the Lords on AI in Weapon Systems

Last December a select committee of the House of Lords published ‘Proceed with Caution’: a report setting out the findings of a year-long investigation into the use of artificial intelligence (AI) in weapon systems.

Members of the Lords committee were drawn entirely from the core of the UK’s political and security establishment, and their report was hardly radical in its conclusions. Nevertheless, it made a number of useful recommendations and concluded that the risks from autonomous weapons are such that the government “must ensure that human control is consistently embedded at all stages of a system’s lifecycle, from design to deployment”. The Lords found that the Ministry of Defence’s (MoD) claims to be “ambitious, safe, responsible” in its use of AI had “not lived up to reality”.

The government subsequently pledged to reply to the Lords report, and on 21 February published its formal response. Perhaps the best way of summarising its tone is to quote the concluding paragraph: “‘Proceed with caution’, the overall message of this [Lords] report, mirrors the MoD’s approach to AI adoption.” There is little new in the response, and nothing in it will surprise observers and analysts of UK government policy on AI and autonomous technologies. It merely outlines how the government intends to follow the course of action it had already planned to take, reiterating the substance of past policy statements such as the Defence Artificial Intelligence Strategy and puffing up recent MoD activity and achievements in the military AI field.

As might be imagined, the response takes a supportive approach to recommendations from the Lords which align with its own agenda, such as developing high-quality data sets, improving MoD’s AI procurement arrangements, and undertaking research into potential future AI capabilities. On the positive side, it is encouraging to see that in some areas concerns over the risks and limitations of AI technologies are highlighted, for example in the need for review and rigorous testing of new systems. MoD acknowledges that rigorous testing would be required before an operator could be confident in an AI system’s use and effect, that current procedures, including the Article 36 weapons review process, will need to be adapted and updated, and that changes in the operational environment may require weapon systems to be retested.

The response also reveals that the government is working on a Joint Service Publication covering all the armed forces to give more concrete directions and guidance on implementing MoD’s AI ethical principles.  The document, ‘Dependable AI in Defence’, will set out the governance, accountabilities, processes and reporting mechanisms needed to translate ethical policies into tangible actions and procedures.  Drone Wars UK and other civil society organisations have long called for MoD to formulate such guidance as a priority.

In some areas the MoD has relatively little power to meet the committee’s recommendations, such as in adjusting government pay scales to match market rates and attract qualified staff to work on MoD AI projects.  Here the rejoinder is little more than flannel, mentioning that “a range of steps” are being taken “to make Defence AI an attractive and aspirational choice.”

In other respects the Lords have challenged MoD’s approach more substantially, and in such cases these challenges are rejected in the government response. This is so in relation to the Lords’ recommendation that the government should adopt a definition of autonomous weapon systems (AWS). The section of the response dealing with this point lays bare the fact that the government’s priority “is to maximise our military capability in the face of growing threats”. A rather unconvincing assertion that “the irresponsible and unethical behaviours and outcomes about which the Committee is rightly concerned are already prohibited under existing legal mechanisms” is followed by the real reason for the government’s opposition: “there is a strong tendency in the ongoing debate about autonomous weapons to assert that any official AWS definition should serve as the starting point for a new legal instrument prohibiting certain types of systems”. Any international treaty which would outlaw autonomous weapon systems “represents a threat to UK Defence interests”, the government argues. The argument ends with a side-swipe at Russia and an attempt to shut down further discussion by claiming that the debate is taking place “at the worst possible time, given Russia’s action in Ukraine and a general increase in bellicosity from potential adversaries.” This seems to be saying that by adopting a definition of autonomous weapon systems the UK would make itself more vulnerable to Russian military action. Really? Read more

Developments on both sides of the Atlantic signal a push to develop AI attack drones

Artist’s impression of crewed aircraft operating with autonomous drones. Credit: Lockheed Martin

Recent government and industry announcements signal clear intent by both the US and the UK to press ahead with a new generation of AI attack drones, despite serious concerns about the development of autonomous weapons. While most details are being kept secret, it is clear from official statements, industry manoeuvring and budget commitments that these new drones are expected to be operational by the end of the decade.

The current focus is the development of drones previously labelled ‘loyal wingman’ but now described either as ‘Collaborative Combat Aircraft’ (CCA) or ‘Autonomous Collaborative Platforms’ (ACP). As always, the nomenclature around ‘drones’ is a battlefield in itself. The concept is for one or more of these drones to fly alongside, or in the vicinity of, a piloted military aircraft, with the drones carrying out specifically designated tasks such as surveillance, electronic warfare, guiding weapons onto targets, or carrying out air-to-air or air-to-ground strikes. Rather than being directly controlled by an individual on the ground, as current armed drones like the Reaper or Bayraktar are, these drones will fly autonomously. According to DARPA officials (using the beloved sports metaphor), these drones will allow pilots to direct squads of unmanned aircraft “like a football coach who chooses team members and then positions them on the field to run plays.”

Next Generation

In May, the US Air Force issued a formal request for US defence companies to bid to build a new piloted aircraft to replace the F-22. However, equally important for the ‘Next Generation Air Dominance’ (NGAD) program is the development of new autonomous drones and a ‘combat cloud’ communication network. While the development of the drones is a covert programme, US Air Force Secretary Frank Kendall said they will be built “in parallel” with the piloted aircraft. Kendall has publicly stated that the competition to develop CCA was expected to begin in Fiscal Year 2024 (which runs from October 2023 to September 2024).

While the USAF plans to build around 200 of the new crewed aircraft, Kendall told reporters that it expects to build around 1,000 of the drones. “This figure was derived from an assumed two CCAs per 200 NGAD platforms and an additional two for each of 300 F-35s for a total of 1,000,” Kendall explained. Others expect even more of these drones to be built. While the NGAD fighter aircraft itself is not expected to be operational until the 2030s, CCAs are expected to be deployed by the end of the 2020s.
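To unpack Kendall’s slightly compressed phrasing, the sum behind the 1,000 figure appears to be two CCAs for each crewed aircraft across both fleets:

$$\underbrace{2 \times 200}_{\text{NGAD}} + \underbrace{2 \times 300}_{\text{F-35}} = 400 + 600 = 1{,}000$$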

It’s important to be aware that there will not be one type of drone built under this programme, but a range with different capabilities able to carry out different tasks. Some will be ‘expendable’, that is, designed for just one mission, something like the ‘one-way attack’ drones that we have seen increasingly used in Ukraine and elsewhere; some will be ‘attritable’, designed so that losing them in combat would not be severely damaging; while others, described as ‘exquisite’, will be more capable and specifically designed not to be lost during combat. A number of companies have set out their proposals, with some even building prototypes and undertaking test flights. Read more

Small drones, big problem. Two new reports examine the rise and rise of armed ‘one-way attack’ drones 

From top: Israeli Harop, Iranian Shahed 136, Polish Warmate.

Over the past few years, and particularly in the ongoing war in Ukraine, we have seen a rise in the use of what have become known as ‘kamikaze’ or ‘suicide’ drones. Two excellent new reports have just been released which examine these systems: ‘One-Way Attack Drones: Loitering Munitions of the Past and Present’, written by Dan Gettinger, formerly of the Bard Drone Center, and ‘Loitering Munitions and Unpredictability’ by Ingvild Bode & Tom Watts. Between them they examine the history, current use, and growing concern about the increasing autonomy of such systems.

A drone by any other name…?

Firstly, to address the elephant in the room: are these systems ‘drones’?

Naming has always been a keenly fought aspect of the debate about drones, with sometimes bitter conflict over whether such platforms should be called ‘unmanned aerial vehicles’ (UAVs), ‘remotely piloted air systems’ (RPAS) or ‘drones’. ‘Drones’ is the term that has stuck, particularly in the mainstream media, but it is regularly used interchangeably with UAV (with ‘unmanned’ being replaced in recent years by ‘uncrewed’ for obvious reasons). While many in the military now accept the term ‘drone’ and are happy to use it depending on the audience, some continue to insist that it belittles both the capabilities of such systems and those who operate them.

Whichever term is used, a further aspect of the naming debate is that an increasing number and variety of military aerial systems are being labelled as ‘drones’. While all these systems have significant characteristics in common (they are aerial systems, unoccupied, and used for surveillance/intelligence gathering and/or attack), they can be very different in terms of size and range; can carry out very different missions; have different effects; and raise different legal and ethical issues.

One such system is the so-called ‘suicide’ or ‘kamikaze’ drone, perhaps better labelled the ‘one-way attack’ drone. There are several different categories of this type of drone, and while they are used to carry out remote lethal attacks and therefore have significant aspects in common with the much larger Reaper or Bayraktar drones, they differ significantly in that they are not designed to be re-used: they are expendable, as the warhead is part of the structure of the system and is destroyed in use. Importantly, while loitering munitions are a sub-set of ‘one-way attack’ drones, not all one-way attack drones are loitering munitions.

Dan Gettinger’s report ‘One-Way Attack Drones: Loitering Munitions of the Past and Present’ helpfully sets out a history of the development of these systems and identifies three sub-categories: anti-radar systems, portable or ‘backpackable’ systems, and Iranian systems. He has compiled a dataset of over 200 such systems (although not all are currently in operation). All of these, he argues, can be traced back to “the transition from the era of jet-powered target drones to that of remotely piloted vehicles.” Read more

The UK, accountability for civilian harm, and autonomous weapon systems

Second evidence session.

The second public session of the House of Lords inquiry into artificial intelligence (AI) in weapon systems took place at the end of March. The session examined how the development and deployment of autonomous weapons might affect the UK’s foreign policy and its position on the global stage, and heard evidence from Yasmin Afina, Research Associate at Chatham House; Vincent Boulanin, Director of Governance of Artificial Intelligence at the Stockholm International Peace Research Institute; and Charles Ovink, Political Affairs Officer at the United Nations Office for Disarmament Affairs.

Among the wide range of issues covered in the two-hour session was the question of who could be held accountable if human rights abuses were committed by a weapon system acting autonomously.  A revealing exchange took place between Lord Houghton, a former Chief of Defence Staff (the most senior officer of the UK’s armed forces), and Charles Ovink.  Houghton asked whether it might be possible for an autonomous weapon system to comply with the laws of war under certain circumstances (at 11.11 in the video of the session):

“If that fully autonomous system has been tested and approved in such a way that it doesn’t rely on a black box technology, that constant evaluation has proved that the risk of it non-complying with the parameters of international humanitarian law are accepted, that then there is a delegation effectively from a human to a machine, why is that not then compliant, or why would you say that that should be prohibited?”

This is, of course, a highly loaded question: it assumes that a variety of improbable circumstances would apply, and then presents a best-case scenario as the norm. Ovink carefully pointed out that whether such a system should be prohibited would be for United Nations member states to decide, but that the question posed ‘a big if’: it was not clear what kind of test environment could mimic a real-life warzone with civilians present and guarantee that the laws of war would be followed. Even if it could, there would still need to be a human accountable for any civilian deaths that might occur. Read more

Fine words, few assurances: Assessing new MoD policy on the military use of Artificial Intelligence

Drone Wars UK is today publishing a short paper analysing the UK’s approach to the ethical issues raised by the use of artificial intelligence (AI) for military purposes in two recent policy documents. The first part of the paper reviews and critiques the Ministry of Defence’s (MoD’s) Defence Artificial Intelligence Strategy, published in June 2022, while the second part considers the UK’s commitment to ‘responsible’ military artificial intelligence capabilities, presented in ‘Ambitious, Safe, Responsible’, published alongside the strategy document.

Once the realm of science fiction, the technology needed to build autonomous weapon systems is currently under development in a number of nations, including the United Kingdom. Given recent advances in unmanned aircraft technology, it is likely that the first autonomous weapons will be drone-based systems.

Drone Wars UK believes that the development and deployment of AI-enabled autonomous weapons would give rise to a number of grave risks, primarily the loss of human values on the battlefield.  Giving machines the ability to take life crosses a key ethical and legal Rubicon.  Lethal autonomous drones would simply lack human judgment and other qualities that are necessary to make complex ethical choices on a dynamic battlefield, to distinguish adequately between soldiers and civilians, and to evaluate the proportionality of an attack.

In the short term it is likely that the military applications of autonomous technology will be in low-risk areas such as logistics and the supply chain, where, proponents argue, there are cost advantages and minimal implications for combat situations. These systems are likely to be closely supervised by human operators. In the longer term, as the technology advances and AI becomes more sophisticated, autonomous technology is increasingly likely to become weaponised and the degree of human supervision can be expected to drop.

The real issue perhaps is not the development of autonomy itself but the way in which this milestone in technological development is controlled and used by humans.  Autonomy raises a wide range of ethical, legal, moral and political issues relating to human judgement, intentions, and responsibilities.   These questions remain largely unresolved and there should therefore be deep disquiet about the rapid advance towards developing autonomous weapons systems.  Read more

Loitering munitions, the Ukraine war, and the drift towards ‘killer robots’.

Switchblade loitering munition flies towards target area. The operator views the video feed and designates which target the munition should strike.

Loitering munitions are now hitting the headlines as a result of their use in the Ukraine war. Vivid descriptions of ‘kamikaze drones’ and ‘suicide drones’ outline the way in which these weapons operate: they are able to find targets and fly towards them before crashing into them and exploding. Both Russia and Ukraine are deploying loitering munitions, which allow soldiers to fire on targets such as tanks and heavy armour without the predictability of a mortar or artillery round fired on a set trajectory. Under some circumstances these ‘fire and forget’ weapons may be able to operate with a high degree of autonomy. For example, they can be programmed to fly around autonomously in a defined search area and highlight possible targets, such as tanks, to the operator. In these circumstances they can be independent of human control. This trend towards increasing autonomy in weapon systems raises questions about how they might shape the future of warfare and the morality of their use.

Loitering munitions such as these have previously been used to military effect in Syria and the 2020 Nagorno-Karabakh war. Although they are often described as drones, they are in many ways more like a smart missile than an uncrewed aircraft. Loitering munitions were first developed in the 1980s and can be thought of as a ‘halfway house’ between drones and cruise missiles. They differ from drones in that they are expendable and, unlike cruise missiles, they have the ability to loiter passively in the target area and search for a target. Potential targets are identified using radar, thermal imaging, or visual sensor data and, to date, a human operator selects the target and executes the command to destroy it. They are disposable, one-time-use weapons intended to hunt for a target and then destroy it, hence their tag as ‘kamikaze’ weapons. Dominic Cummings, former chief adviser to the Prime Minister, describes a loitering munition as a “drone version of the AK-47: a cheap anonymous suicide drone that flies to the target and blows itself up – it’s so cheap you don’t care”. Read more