Proceed in Harmony: The Government replies to the Lords on AI in Weapon Systems


Last December a select committee of the House of Lords published ‘Proceed with Caution’: a report setting out the findings of a year-long investigation into the use of artificial intelligence (AI) in weapon systems.

Members of the Lords committee were drawn entirely from the core of the UK’s political and security establishment, and their report was hardly radical in its conclusions.  Nevertheless, it made a number of useful recommendations and concluded that the risks from autonomous weapons are such that the government “must ensure that human control is consistently embedded at all stages of a system’s lifecycle, from design to deployment”.  The Lords found that the Ministry of Defence’s (MoD) claim to be “ambitious, safe, responsible” in its use of AI had “not lived up to reality”.

The government subsequently pledged to reply to the Lords report, and on 21 February published its formal response.  Perhaps the best way of summarising the tone of the response is to quote its concluding paragraph: “‘Proceed with caution’, the overall message of this [Lords] report, mirrors the MoD’s approach to AI adoption.”  There is little new in the government response, and nothing in it will surprise observers and analysts of UK government policy on AI and autonomous technologies.  The response merely outlines how the government intends to follow the course of action it had already planned to take, reiterating the substance of past policy statements such as the Defence Artificial Intelligence Strategy and puffing up recent MoD activity and achievements in the military AI field.

As might be imagined, the response takes a supportive approach to recommendations from the Lords which are aligned with its own agenda, such as developing high-quality data sets, improving MoD’s AI procurement arrangements, and undertaking research into potential future AI capabilities.  On the positive side, it is encouraging to see that in some areas concerns over the risks and limitations of AI technologies are highlighted, for example in the need for review and rigorous testing of new systems.  MoD acknowledges that rigorous testing would be required before an operator could be confident in an AI system’s use and effect, that current procedures, including the Article 36 weapons review process, will need to be adapted and updated, and that changes in the operational environment may require weapon systems to be retested.

The response also reveals that the government is working on a Joint Service Publication covering all the armed forces to give more concrete directions and guidance on implementing MoD’s AI ethical principles.  The document, ‘Dependable AI in Defence’, will set out the governance, accountabilities, processes and reporting mechanisms needed to translate ethical policies into tangible actions and procedures.  Drone Wars UK and other civil society organisations have long called for MoD to formulate such guidance as a priority.

In some areas the MoD has relatively little power to meet the committee’s recommendations, such as in adjusting government pay scales to match market rates and attract qualified staff to work on MoD AI projects.  Here the rejoinder is little more than flannel, mentioning that “a range of steps” are being taken “to make Defence AI an attractive and aspirational choice.”

In other respects the Lords challenged MoD’s approach more substantially, and in these cases the government response rejects their recommendations.  This is so in relation to the Lords’ recommendation that the government should adopt a definition of autonomous weapon systems (AWS).  The section of the response dealing with this point lays bare the fact that the government’s priority “is to maximise our military capability in the face of growing threats”.  A rather unconvincing assertion that “the irresponsible and unethical behaviours and outcomes about which the Committee is rightly concerned are already prohibited under existing legal mechanisms” is followed by the real reason for the government’s opposition: “there is a strong tendency in the ongoing debate about autonomous weapons to assert that any official AWS definition should serve as the starting point for a new legal instrument prohibiting certain types of systems”.  Any international treaty which would outlaw autonomous weapon systems “represents a threat to UK Defence interests”, the government argues.  The argument ends with a side-swipe at Russia and an attempt to shut down further debate by claiming that it is taking place “at the worst possible time, given Russia’s action in Ukraine and a general increase in bellicosity from potential adversaries.”  This seems to be saying that by adopting a definition of autonomous weapon systems the UK would make itself more vulnerable to Russian military action.  Really?

Proceed with caution: Lords warn over development of military AI and killer robots


The use of artificial intelligence (AI) in warfare through the development of AI-powered autonomous weapon systems – ‘killer robots’ – “is one of the most controversial uses of AI today”, according to a new report by an influential House of Lords committee.

The committee, which spent ten months investigating the application of AI to weapon systems and probing the UK government’s plans to develop military AI systems, concluded that the risks from autonomous weapons are such that the government “must ensure that human control is consistently embedded at all stages of a system’s lifecycle, from design to deployment”.

Echoing concerns which Drone Wars UK has repeatedly raised, the Lords found that the stated aspiration of the Ministry of Defence (MoD) to be “ambitious, safe, responsible” in its use of AI “has not lived up to reality”, and that although MoD has claimed that transparency and challenge are central to its approach, “we have not found this yet to be the case”.

The cross-party House of Lords Committee on AI in Weapon Systems was set up in January 2023 at the suggestion of Liberal Democrat peer Lord Clement-Jones, and started taking evidence in March.  The committee heard oral evidence from 35 witnesses and received nearly 70 written evidence submissions, including evidence from Drone Wars UK.

The committee’s report is entitled ‘Proceed with Caution: Artificial Intelligence in Weapon Systems’, and ‘proceed with caution’ gives a fair summary of its recommendations.  The panel was drawn entirely from the core of the UK’s political and military establishment, and at times some members appeared to have difficulty in grasping the technical concepts underpinning autonomous weapons.  Under the circumstances the committee was never remotely likely to recommend that the government should not commit to developing new weapon systems based on advanced technology, and in many respects its report provides a road-map setting out the committee’s views on how the MoD should go about integrating AI into weapon systems and building public support for doing so.

Nevertheless, the committee has taken a sceptical view of the advantages claimed for autonomous weapon systems; has recognised the very real risks that they pose; and has proposed safeguards to mitigate the worst of the risks, alongside a robust call for the government to “lead by example in international engagement on regulation of AWS [autonomous weapon systems]”.  Despite hearing from witnesses who argued that autonomous weapons “could be faster, more accurate and more resilient than existing weapon systems, could limit the casualties of war, and could protect ‘our people from harm by automating “dirty and dangerous” tasks’”, the committee was apparently unconvinced, concluding that “although a balance sheet of benefits and risks can be drawn, determining the net effect of AWS is difficult” – and that “this was acknowledged by the Ministry of Defence”.

Perhaps the most important recommendation in the committee’s report relates to human control over autonomous weapons.  The committee recommended that:

The Government should ensure human control at all stages of an AWS’s lifecycle. Much of the concern about AWS is focused on systems in which the autonomy is enabled by AI technologies, with an AI system undertaking analysis on information obtained from sensors. But it is essential to have human control over the deployment of the system both to ensure human moral agency and legal compliance. This must be buttressed by our absolute national commitment to the requirements of international humanitarian law.


Tribunal upholds MoD refusal to disclose details of UK Reaper drone missions outside of Op Shader


Fifteen months after hearing our appeal, an Information Tribunal handed down its decision this week, rejecting our arguments that basic details about the UK’s deployment of armed Reaper drones outside of Operation Shader (Iraq/Syria) needed to be released to enable public and parliamentary oversight of such deployments.

Both Clive Lewis MP and Baroness Vivienne Stern, Vice-Chair of the All Party Parliamentary Group (APPG) on Drones and Modern Conflict, had submitted statements to the Tribunal supporting our appeal.  Clive Lewis argued that the refusal to answer these questions about the deployment of Reaper is “a serious backward step in terms of transparency and accountability.”  Baroness Stern stated:

“Despite repeated attempts by myself and colleagues to attain even the most basic information about the UK’s drone deployments, policy, and commitments, Parliament has not been provided with the accurate and timely information needed to meaningfully carry out its constitutional scrutiny role. Whilst certain details must be kept secret in order to ensure operational and national security, the current trend of withholding information about the use of drones purely because it is seen as an “intelligence” asset, as well as withholding vital information on the UK’s growing military capabilities and commitments is deeply concerning and unjustified.”

While insisting that it was neither confirming nor denying the deployment, the MoD argued against the release of the information on three broad grounds.  As the Decision Notice states:

“the MOD’s key concern about the release of the requested information was that it could lead an adversary to infer the absence or presence of UK personnel. In his [The MoD’s witness’] opinion were the locations to be released or inferred from a combination of requested data and already published material (the “mosaic effect”), there would be an elevated risk to any potential personnel in that location and an increased risk of hostile acts against them.”

A second concern was

“there would be an increased risk to any nation hosting the Reaper operations as an adversary may target a hostile act at the host nation rather than the UK which may be a more difficult target. Thereby undermining the UK’s relationship with that nation and undermining military operations conducted from that location.”

Finally, and most concerning from a scrutiny and oversight point of view, the MoD argued (again quoting the Decision Notice):

“The effectiveness of operations conducted using Reaper outside Operation Shader in future depend, in part, on a greater degree of ambiguity as to the employment of Reaper in order to be successful. It is important to retain a degree of ambiguity regarding the full extent of Reaper operations now in order to maintain this flexibility in the future. “

Drone Wars argued strongly that the information requested – a single figure for the number of sorties undertaken outside of Operation Shader and their broad geographic location (i.e. ‘the Middle East’) – was not capable of causing the prejudice alleged.  We also pointed out to the Tribunal that the MoD has previously released the number of sorties undertaken outside of Operation Shader (in response to our questions about the targeted killing of Naweed Hussain in 2018) without any of the prejudice or harm suggested, but that seems to have been ignored by the Tribunal.

Technology and the future of UK Foreign Policy – Our submission to the Foreign Affairs Committee Inquiry


In a timely and welcome move, the House of Commons Foreign Affairs Select Committee has recently launched an inquiry into ‘Tech and the future of UK foreign policy’.  Recognising that new and emerging technologies are fundamentally altering the nature of international relations, and that private technology companies wield rapidly growing influence, the Committee intends to focus on how the government, and particularly the Foreign, Commonwealth and Development Office (FCDO), should respond to the opportunities and challenges presented by new technologies.

A broad selection of stakeholders have already provided written evidence to the Committee, ranging from big technology companies such as Microsoft, Oracle, and BAE Systems, to academics and industry groups with specialist interests in the field.  Non-government organisations, including ourselves, as well as the International Committee of the Red Cross, Amnesty International UK, and the UK Campaign to Stop Killer Robots have also provided evidence.

Not surprisingly, submissions from industry urge the government to support and push ahead with the development of new technologies, with Microsoft insisting that the UK “must move more quickly to advance broad-based technology innovation”, which will require “an even closer partnership between the government and the tech sector”.  BAE Systems calls for “a united front [which] can be presented in promoting the UK’s overseas interests across both the public and private sectors”.  Both BAE and Microsoft see roles for new technology in the military: BAE points out that “technology is also reshaping national security”, while Microsoft calls for “cooperation with the private sector in the context of NATO”.

Five years on from UK’s first drone targeted killing, increasing secrecy needs serious challenge

Secret British drone operations getting little scrutiny

The long delay to the release of the Intelligence and Security Committee’s Russia report showed all too clearly just how much control the government can wield over Parliament’s weak powers of scrutiny.  While the ramifications of this latest setback to Parliament’s role of holding the executive to account are still being worked out, the consequences of a similar failure five years ago – when MPs attempted to investigate the use of drones by British forces for targeted killing – are now apparent.  This should act as a salutary reminder of the need for MPs to constantly push to strengthen their oversight powers.

Five years ago today (21 August 2015), an RAF Reaper drone operating over Syria launched a missile at a vehicle travelling along a dusty road in Raqqa, killing its three occupants including the target of the strike, 21-year-old Cardiff-born Reyaad Khan.  The targeted killing caused a storm of controversy when then PM David Cameron reported it to Parliament three weeks later.  The government had not only launched, for the first time, a lethal strike in a country in which it was not at war, but had also defied a parliamentary resolution that supported the use of force in Iraq while specifically ruling it out in Syria.  The government insisted that the operation was necessary as Khan was instigating and encouraging terror attacks in the UK.

Fallon to face questions on drone targeted killing – but will there be answers?

Secretary of State for Defence, Michael Fallon

UK Defence Secretary Michael Fallon will face questions from the Joint Committee on Human Rights on Wednesday (16 Dec) over the targeted killing of two British men in a UK drone strike.  21-year-old Reyaad Khan from Cardiff was killed in the strike in Syria on 21 August 2015, alongside 26-year-old Ruhul Amin from Aberdeen and an unknown third man.

As the Prime Minister acknowledged in his statement to the House of Commons, the air strike was a significant departure from previous military operations.