
The use of artificial intelligence (AI) for the purposes of warfare through the development of AI-powered autonomous weapon systems – ‘killer robots’ – “is one of the most controversial uses of AI today”, according to a new report by an influential House of Lords Committee.
The committee, which spent ten months investigating the application of AI to weapon systems and probing the UK government’s plans to develop military AI systems, concluded that the risks from autonomous weapons are such that the government “must ensure that human control is consistently embedded at all stages of a system’s lifecycle, from design to deployment”.
Echoing concerns which Drone Wars UK has repeatedly raised, the Lords found that the stated aspiration of the Ministry of Defence (MoD) to be “ambitious, safe, responsible” in its use of AI “has not lived up to reality”, and that although MoD has claimed that transparency and challenge are central to its approach, “we have not found this yet to be the case”.
The cross-party House of Lords Committee on AI in Weapon Systems was set up in January 2023 at the suggestion of Liberal Democrat peer Lord Clement-Jones, and started taking evidence in March. The committee heard oral evidence from 35 witnesses and received nearly 70 written evidence submissions, including evidence from Drone Wars UK.
The committee’s report is entitled ‘Proceed with Caution: Artificial Intelligence in Weapon Systems’, and ‘proceed with caution’ is a fair summary of its recommendations. The panel was drawn entirely from the core of the UK’s political and military establishment, and at times some members appeared to have difficulty grasping the technical concepts underpinning autonomous weapons. Under the circumstances the committee was never remotely likely to recommend that the government should not commit to developing new weapon systems based on advanced technology, and in many respects its report provides a road-map setting out the committee’s views on how the MoD should proceed with integrating AI into weapon systems and build public support for doing so.
Nevertheless, the committee has taken a sceptical view of the advantages claimed for autonomous weapon systems; has recognised the very real risks that they pose; and has proposed safeguards to mitigate the worst of the risks, alongside a robust call for the government to “lead by example in international engagement on regulation of AWS [autonomous weapon systems]”. Despite hearing from witnesses who argued that autonomous weapons “could be faster, more accurate and more resilient than existing weapon systems, could limit the casualties of war, and could protect ‘our people from harm by automating “dirty and dangerous” tasks’”, the committee was apparently unconvinced, concluding that “although a balance sheet of benefits and risks can be drawn, determining the net effect of AWS is difficult” – and that “this was acknowledged by the Ministry of Defence”.
Perhaps the most important recommendation in the committee’s report relates to human control over autonomous weapons. The committee found that:
The Government should ensure human control at all stages of an AWS’s lifecycle. Much of the concern about AWS is focused on systems in which the autonomy is enabled by AI technologies, with an AI system undertaking analysis on information obtained from sensors. But it is essential to have human control over the deployment of the system both to ensure human moral agency and legal compliance. This must be buttressed by our absolute national commitment to the requirements of international humanitarian law.
The government should proceed with caution in development and use of AI in autonomous weapons.
Today we publish our new report looking at use of AI-enabled autonomous weapons systems.
— House of Lords AI in Weapon Systems Committee (@HlAIWeapons) December 1, 2023
This position will be welcomed by those campaigning for international controls over autonomous weapon systems, such as the Campaign to Stop Killer Robots, of which Drone Wars UK is a member. Meaningful human control over autonomous weapon systems and a ban on systems which specifically target humans, backed up by reaffirmation of existing commitments under international law, are key demands of the campaign.
Alongside this is a powerful appeal for the UK to do more in the international arena to support arms control measures relating to autonomous weapons. The government’s current position – that discussion of controls on autonomous weapons should be confined to a group of governmental experts under the auspices of the United Nations Convention on Certain Conventional Weapons – is described as “surprising” by the committee, given that discussion in the group of experts over the past 12 months “has led to no substantive results”. The committee argues that the government “should step up and show leadership in these future discussions, consistent with their ambitions in the 2021 Integrated Review to be more active in shaping the international order”, and that it is not in UK interests “to leave international law and norm setting to others”.
The committee took an unequivocal position in support of an international treaty on autonomous weapons:
We call for a swift agreement of an effective international instrument on lethal AWS. It is crucial to develop an international consensus on what criteria should be met for a system to be compliant with IHL. Central to this is the retention of human moral agency. Noncompliant systems should be prohibited. Consistent with its ambitions to promote the safe and responsible development of AI around the world, the Government should be a leader in this effort.
Pressure from the committee in this respect may well have been an influence on the UK government’s decision to support a resolution stressing the “urgent need for the international community to address the challenges and concerns raised by autonomous weapons systems” at the United Nations General Assembly in November 2023.
The report raises particular concerns over the use of AI in command, control, and communications relating to nuclear weapons, highlighting its “potential to spur arms races or increase the likelihood of states escalating to nuclear use – either intentionally or accidentally – during a crisis” and warning that “the compressed time for decision-making when using AI may lead to increased tensions, miscommunication, and misunderstanding”. The Lords called for the government to lead international efforts to achieve a prohibition on the use of AI in nuclear command, control and communications.
The MoD’s approach to transparency and ethics in its use of AI – set out in the Defence Artificial Intelligence Strategy and the accompanying paper ‘Ambitious, Safe, Responsible’ – was the subject of particular criticism from the Lords committee. The Lords felt it was important for the government to “seek, establish and retain public confidence and democratic endorsement in the development and use of AI generally, and especially in respect of AWS”, but found that “there is no sign of a move towards democratisation of this debate”. The AI Ethics Advisory Panel, set up to provide independent advice to the MoD, was found wanting, and the committee identified a need to improve “the transparency of advice provided by the AI Ethics Advisory Panel by publishing its Terms of Reference, membership, agendas, and minutes, as well as an annual transparency report”. The MoD should “immediately expand the remit of the AI Ethics Advisory Panel to review the practical application of ethical principles in armed conflict and to cover ethics in relation to the development and use of AI in AWS”. These are measures that Drone Wars UK has long been calling on the MoD to adopt.
Looking at transparency at higher levels in government, the Committee expressed concerns about the lack of oversight for the government’s AI programme, particularly at the Parliamentary level:
Parliament is at the centre of decision-making on the development and use of AWS. Parliament’s capacity for oversight depends on the availability of information, on its ability to anticipate issues rather than reacting after the event, and on its ability to hold ministers to account. The Government must allow sufficient space in the Parliamentary timetable and provide enough information for Parliament, including its select committees, to scrutinise its policy on AI effectively. We naturally understand that elements of policy development may be highly sensitive, but there are established ways of dealing with such information. Arguments of secrecy must not be used to sidestep accountability.
The lack of publicly available detail on UK spending on military AI was flagged up, and the committee called for the government to publish figures for annual spending on AI by the MoD.
The Lords Committee has conducted a thorough examination of the issues surrounding autonomous weapons and a detailed critique of government policy. Given the credibility that committee members have within the UK’s political and military establishment, it will be difficult for the government to side-step its recommendations, and the report provides useful resources for campaigners working to establish international legal protections over the use of autonomous weapons and increase accountability over their development. Perhaps the final word should be left to the Under-Secretary for Multilateral Affairs and International Economic Relations for the Philippines, Carlos J. Sorreta, whom the Lords quote in their report:
“AI in the military domain is ultimately about speed in waging war. Speed might be good for waging war, but perhaps not so much for peace. Delays in armed conflict are critical breathing spaces for diplomacy to work, for peace to be given a chance.”