Fine words, few assurances: Assessing new MoD policy on the military use of Artificial Intelligence

Drone Wars UK is today publishing a short paper analysing the UK’s approach to the ethical issues raised by the use of artificial intelligence (AI) for military purposes in two recent policy documents.  The first part of the paper reviews and critiques the Ministry of Defence’s (MoD’s) Defence Artificial Intelligence Strategy, published in June 2022, while the second part considers the UK’s commitment to ‘responsible’ military artificial intelligence capabilities, presented in the document ‘Ambitious, Safe, Responsible’, published alongside the strategy document.

Once the realm of science fiction, the technology needed to build autonomous weapon systems is currently under development in a number of nations, including the United Kingdom.  Due to recent advances in unmanned aircraft technology, it is likely that the first autonomous weapons will be a drone-based system.

Drone Wars UK believes that the development and deployment of AI-enabled autonomous weapons would give rise to a number of grave risks, primarily the loss of human values on the battlefield.  Giving machines the ability to take life crosses a key ethical and legal Rubicon.  Lethal autonomous drones would simply lack human judgment and other qualities that are necessary to make complex ethical choices on a dynamic battlefield, to distinguish adequately between soldiers and civilians, and to evaluate the proportionality of an attack.

In the short term it is likely that the military applications of autonomous technology will be in low-risk areas, such as logistics and the supply chain, where, proponents argue, there are cost advantages and minimal implications for combat situations.  These systems are likely to be closely supervised by human operators.  In the longer term, as technology advances and AI becomes more sophisticated, autonomous technology is increasingly likely to become weaponised and the degree of human supervision can be expected to drop.

The real issue perhaps is not the development of autonomy itself but the way in which this milestone in technological development is controlled and used by humans.  Autonomy raises a wide range of ethical, legal, moral and political issues relating to human judgement, intentions, and responsibilities.  These questions remain largely unresolved and there should therefore be deep disquiet about the rapid advance towards developing autonomous weapons systems.

Despite the seeming inevitability of autonomous weapon systems, there is a range of measures which could be used to prevent their development, such as establishing international treaties and norms, developing confidence-building measures, introducing international legal instruments, and adopting unilateral control measures.  Drone Wars UK takes the view that the UK should be fully involved in developing these measures on the international stage.

However, at this point in time, the government seemingly wishes to keep its options open, often arguing that it does not want to create barriers which might hinder underlying research into AI and robotics.  Nonetheless, plenty of controlled technologies, such as encryption, or those in the nuclear, biological and chemical sciences, can be used for civil or military purposes and are controlled without stifling underlying research.

While recognising the ethical and practical hazards of AI, the MoD’s approach to tackling these hazards is conservative, unambitious, and lacking in commitment.  The MoD seems to think it has now ‘ticked a box’ on its path towards implementing AI technologies: it can now say that it has an ethical approach to AI, and use this as a get-out-of-jail-free card when it is challenged on its applications of AI.  However, close scrutiny of the MoD’s AI strategy documents raises serious questions about its approach to the governance of AI and digital technologies.  It also points to a deep conflict between the government’s stated democratic values rooted in human rights on the one hand, and a technocratic impulse to race forward with AI at all costs on the other.