Fine words, few assurances: Assessing new MoD policy on the military use of Artificial Intelligence

Drone Wars UK is today publishing a short paper analysing the UK’s approach to the ethical issues raised by the use of artificial intelligence (AI) for military purposes in two recent policy documents.  The first part of the paper reviews and critiques the Ministry of Defence’s (MoD’s) Defence Artificial Intelligence Strategy, published in June 2022, while the second part considers the UK’s commitment to ‘responsible’ military artificial intelligence capabilities, presented in the document ‘Ambitious, Safe, Responsible’, published alongside the strategy document.

Once the realm of science fiction, the technology needed to build autonomous weapon systems is currently under development in a number of nations, including the United Kingdom.  Due to recent advances in unmanned aircraft technology, it is likely that the first autonomous weapons will be a drone-based system.

Drone Wars UK believes that the development and deployment of AI-enabled autonomous weapons would give rise to a number of grave risks, primarily the loss of human values on the battlefield.  Giving machines the ability to take life crosses a key ethical and legal Rubicon.  Lethal autonomous drones would simply lack human judgment and other qualities that are necessary to make complex ethical choices on a dynamic battlefield, to distinguish adequately between soldiers and civilians, and to evaluate the proportionality of an attack.

In the short term it is likely that the military applications of autonomous technology will be in low risk areas, such as logistics and the supply chain, where, proponents argue, there are cost advantages and minimal implications for combat situations.  These systems are likely to be closely supervised by human operators.  In the longer term, as technology advances and AI becomes more sophisticated, autonomous technology is increasingly likely to become weaponised and the degree of human supervision can be expected to drop.

The real issue perhaps is not the development of autonomy itself but the way in which this milestone in technological development is controlled and used by humans.  Autonomy raises a wide range of ethical, legal, moral and political issues relating to human judgement, intentions, and responsibilities.   These questions remain largely unresolved and there should therefore be deep disquiet about the rapid advance towards developing autonomous weapons systems.  Read more

Loitering munitions, the Ukraine war, and the drift towards ‘killer robots’

Switchblade loitering munition flies towards target area. The operator views the video feed and then designates which target the munition should strike.

Loitering munitions are now hitting the headlines in the media as a result of their use in the Ukraine war.  Vivid descriptions of ‘kamikaze drones’ and ‘suicide drones’ outline the way in which these weapons operate: they are able to find targets and fly towards them before crashing into them and exploding.  Both Russia and Ukraine are deploying loitering munitions, which allow soldiers to fire on targets such as tanks and heavy armour without the predictability of a mortar or artillery round fired on a set trajectory.  Under some circumstances these ‘fire and forget’ weapons may be able to operate with a high degree of autonomy.  For example, they can be programmed to fly around autonomously in a defined search area and highlight possible targets, such as tanks, to the operator.  In these circumstances they can be independent of human control.  This trend towards increasing autonomy in weapons systems raises questions about how they might shape the future of warfare and the morality of their use.

Loitering munitions such as these have previously been used to military effect in Syria and the 2020 Nagorno-Karabakh war.  Although they are often described as drones, they are in many ways more like a smart missile than an uncrewed aircraft.  Loitering munitions were first developed in the 1980s and can be thought of as a ‘halfway house’ between drones and cruise missiles.  They differ from drones in that they are expendable, and unlike cruise missiles, have the ability to loiter passively in the target area and search for a target.  Potential targets are identified using radar, thermal imaging, or visual sensor data and, to date, a human operator selects the target and executes the command to destroy the target.  They are disposable, one-time use weapons intended to hunt for a target and then destroy it, hence their tag as ‘kamikaze’ weapons.  Dominic Cummings, former chief advisor to the Prime Minister, describes a loitering munition as a “drone version of the AK-47: a cheap anonymous suicide drone that flies to the target and blows itself up – it’s so cheap you don’t care”.  Read more

None too clever? Military applications of artificial intelligence

Drone Wars UK’s latest briefing looks at where and how artificial intelligence is currently being applied in the military context and considers the legal, ethical, operational, and strategic risks posed.


Artificial Intelligence (AI), automated decision making, and autonomous technologies have already become common in everyday life and offer immense opportunities to dramatically improve society.  Smartphones, internet search engines, AI personal assistants, and self-driving cars are among the many products and services that rely on AI to function.  However, like all technologies, AI also poses risks if it is poorly understood, unregulated, or used in inappropriate or dangerous ways.

In current AI applications, machines perform a specific task for a specific purpose.  The umbrella term ‘computational methods’ may be a better way of describing such systems, which fall far short of human intelligence but have wider problem-solving capabilities than conventional software.  Hypothetically, AI may eventually be able to perform a range of cognitive functions, respond to a wide variety of input data, and understand and solve any problem that a human brain can.  Although this is a goal of some AI research programmes, it remains a distant prospect.

AI does not operate in isolation, but functions as a ‘backbone’ in a broader system to help the system achieve its purpose.  Users do not ‘buy’ the AI itself; they buy products and services that use AI or upgrade a legacy system with new AI technology.  Autonomous systems, which are machines able to execute a task without human input, rely on artificial intelligence computing systems to interpret information from sensors and then signal actuators, such as motors, pumps, or weapons, to cause an impact on the environment around the machine.  Read more
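The sense–interpret–act loop described above can be illustrated with a minimal Python sketch; the object and method names here are purely hypothetical, chosen to mirror the description rather than any real system:

```python
# Illustrative sketch of an autonomous system's control cycle:
# a sensor is read, an AI component interprets the data, and an
# actuator is signalled to affect the machine's environment.
# All names are hypothetical; no real platform or API is implied.

def control_loop(sensor, model, actuator):
    """Run one cycle of the sense -> interpret -> act pattern."""
    reading = sensor.read()              # raw data, e.g. a camera frame
    decision = model.interpret(reading)  # AI 'backbone' turns data into a command
    actuator.signal(decision)            # e.g. a motor or pump responds
    return decision
```

The point of the sketch is that the AI component is only one link in the chain: the quality of the sensors and the consequences of the actuator's action are equally part of the overall system.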

The iWars Survey: Mapping the IT sector’s involvement in developing autonomous weapons

A new survey by Drone Wars has begun the process of mapping the involvement of information technology corporations in military artificial intelligence (AI) and robotics programmes, an area of rapidly increasing focus for the military.  ‘Global Britain in a Competitive Age’, the recently published integrated review of security, defence, development, and foreign policy, highlighted the key roles that new military technologies will play in the government’s vision for the future of the armed forces and aspirations for the UK to become a “science superpower”.

Although the integrated review promised large amounts of public funding and support for research in these areas, co-operation from the technology sector will be essential in delivering ‘ready to use’ equipment and systems to the military.  Senior military figures are aware that ‘Silicon Valley’ is taking the lead in the development of autonomous systems for both civil and military use. Speaking at a NATO-organised conference aimed at fostering links between the armed forces and the private sector, General Sir Chris Deverell, the former Commander of Joint Forces Command, explained:

“The days of the military leading scientific and technological research and development have gone. The private sector is innovating at a blistering pace and it is important that we can look at developing trends and determine how they can be applied to defence and security.”

The Ministry of Defence is actively cultivating technology sector partners to work on its behalf through schemes like the Defence and Security Accelerator (DASA). However, views within the commercial technology sector on co-operation with the military are mixed. Over the past couple of years there have been regular reports of opposition by tech workers to their employers’ military contracts, including those at Microsoft and Google.  Read more

Online Event – 25 March: Meaning-less human control: Lessons from air defence systems for LAWS

Together with the Center for War Studies at the University of Southern Denmark, we are hosting an online event on Thursday 25 March at 2pm (GMT) to discuss our co-published report, Meaning-less human control: Lessons from air defence systems for Lethal Autonomous Weapon Systems (LAWS).

In recent years, autonomous weapons systems have increasingly come to the attention of the international community. Debates on these weapon systems centre on whether they reduce meaningful human control over the use of force.  This event will discuss our latest report with an expert panel:

  • Dr Ingvild Bode (Associate Professor of International Relations: Center for War Studies, University of Southern Denmark)
  • Maaike Verbruggen (TBC) (Doctoral Researcher: International Security, Institute for European Studies)
  • Richard Moyes (Managing Director: Article 36)
  • Dr Peter Burt, Chair (Researcher: Drone Wars UK)

Click here to register for the event and for further details.

UK Campaign to Stop Killer Robots writes to Defence Secretary on the UK’s approach to LAWS

Guardian report of Gen Sir Nick Carter’s comments on UK’s increasing use of autonomous and remotely controlled machines.

As members of the UK Campaign to Stop Killer Robots, Drone Wars and a number of other UK civil society groups have written to Secretary of State Ben Wallace on the UK’s position on the development of Lethal Autonomous Weapon Systems partly in response to recent comments by the Chief of the Defence Staff.

Dear Secretary of State,

We are writing on behalf of the UK Campaign to Stop Killer Robots, in advance of the next meeting of the Group of Governmental Experts (GGE) on ‘Lethal Autonomous Weapons Systems’ (LAWS) at the Convention on Certain Conventional Weapons (CCW), as well as the CCW’s meeting of High Contracting Parties. We welcome the UK government’s recognition in the CCW that discussing human control is central to successful international work to address increasing ‘autonomy’ in weapons systems, and that this is an area in which meaningful progress can be made.[1]  Read more