None too clever? Military applications of artificial intelligence

Drone Wars UK’s latest briefing looks at where and how artificial intelligence is currently being applied in the military context and considers the legal, ethical, operational, and strategic risks posed.

Artificial Intelligence (AI), automated decision making, and autonomous technologies have already become common in everyday life and offer immense opportunities to dramatically improve society.  Smartphones, internet search engines, AI personal assistants, and self-driving cars are among the many products and services that rely on AI to function.  However, like all technologies, AI also poses risks if it is poorly understood, unregulated, or used in inappropriate or dangerous ways.

In current AI applications, machines perform a specific task for a specific purpose.  The umbrella term ‘computational methods’ may be a better way of describing such systems, which fall far short of human intelligence but have wider problem-solving capabilities than conventional software.  Hypothetically, AI may eventually be able to perform a range of cognitive functions, respond to a wide variety of input data, and understand and solve any problem that a human brain can.  Although this is a goal of some AI research programmes, it remains a distant prospect.

AI does not operate in isolation, but functions as a ‘backbone’ in a broader system to help the system achieve its purpose.  Users do not ‘buy’ the AI itself; they buy products and services that use AI or upgrade a legacy system with new AI technology.  Autonomous systems, which are machines able to execute a task without human input, rely on artificial intelligence computing systems to interpret information from sensors and then signal actuators, such as motors, pumps, or weapons, to cause an impact on the environment around the machine.  Read more

The iWars Survey: Mapping the IT sector’s involvement in developing autonomous weapons

A new survey by Drone Wars has begun the process of mapping the involvement of information technology corporations in military artificial intelligence (AI) and robotics programmes, an area of rapidly increasing focus for the military.  ‘Global Britain in a Competitive Age’, the recently published integrated review of security, defence, development, and foreign policy, highlighted the key roles that new military technologies will play in the government’s vision for the future of the armed forces and aspirations for the UK to become a “science superpower”.

Although the integrated review promised large amounts of public funding and support for research in these areas, co-operation from the technology sector will be essential in delivering ‘ready to use’ equipment and systems to the military.  Senior military figures are aware that ‘Silicon Valley’ is taking the lead in the development of autonomous systems for both civil and military use. Speaking at a NATO-organised conference aimed at fostering links between the armed forces and the private sector, General Sir Chris Deverell, the former Commander of Joint Forces Command, explained:

“The days of the military leading scientific and technological research and development have gone. The private sector is innovating at a blistering pace and it is important that we can look at developing trends and determine how they can be applied to defence and security.”

The Ministry of Defence is actively cultivating technology sector partners to work on its behalf through schemes like the Defence and Security Accelerator (DASA). However, views on co-operation with the military within the commercial technology sector are mixed. Over the past couple of years there have been regular reports of opposition by tech workers to their employers’ military contracts, including those at Microsoft and Google. Read more

Online Event – 25 March: Meaning-less human control: Lessons from air defence systems for LAWS

Together with Center for War Studies of University of Southern Denmark, we are hosting an online event on Thursday 25 March at 2pm (GMT) to discuss our co-published report, Meaning-less human control: Lessons from air defence systems for Lethal Autonomous Weapon Systems (LAWS).

In recent years, autonomous weapons systems have increasingly come to the attention of the international community. Debates on these weapon systems centre on whether they reduce meaningful human control over the use of force.  This event will discuss our latest report with an expert panel:

  • Dr Ingvild Bode (Associate Professor of International Relations: Centre for War Studies, University of Southern Denmark)
  • Maaike Verbruggen TBC (Doctoral Researcher: International Security, Institute for European Studies)
  • Richard Moyes (Managing Director: Article 36)
  • Dr Peter Burt, Chair (Researcher: Drone Wars UK)

Click here to register for the event and for further details.

UK Campaign to Stop Killer Robots writes to Defence Secretary on the UK’s approach to LAWS

Guardian report of Gen Sir Nick Carter’s comments on the UK’s increasing use of autonomous and remotely controlled machines.

As members of the UK Campaign to Stop Killer Robots, Drone Wars and a number of other UK civil society groups have written to Secretary of State Ben Wallace on the UK’s position on the development of Lethal Autonomous Weapon Systems partly in response to recent comments by the Chief of the Defence Staff.

Dear Secretary of State,

We are writing on behalf of the UK Campaign to Stop Killer Robots, in advance of the next meeting of the Group of Governmental Experts (GGE) on ‘Lethal Autonomous Weapons Systems’ (LAWS) at the Convention on Certain Conventional Weapons (CCW), as well as the CCW’s meeting of High Contracting Parties. We welcome the UK government’s recognition in the CCW that discussing human control is central to successful international work to address increasing ‘autonomy’ in weapons systems, and that this is an area in which meaningful progress can be made.[1]  Read more

XLUUVs, Swarms, and STARTLE: New developments in the UK’s military autonomous systems

Behind the scenes, the UK is developing a range of military autonomous systems. Image: Crown Copyright

In November 2018 Drone Wars UK published ‘Off The Leash’, an in-depth research report outlining how the Ministry of Defence (MoD) was actively supporting research into technology to support the development of armed autonomous drones despite the government’s public claims that it “does not possess fully autonomous weapons and has no intention of developing them”.  This article provides an update on developments which have taken place in this field since our report was published, looking both at specific technology projects as well as developments on the UK’s policy position on Lethal Autonomous Weapons Systems (LAWS). Read more

Off the Leash: How the UK is developing the technology to build armed autonomous drones

A new report published by Drone Wars UK reveals that, despite a UK government statement that it “does not possess fully autonomous weapons and has no intention of developing them”, the Ministry of Defence (MoD) is actively funding research into technology supporting the development of armed autonomous drones.

Our study, Off the Leash: The Development of Autonomous Military Drones in the UK, identifies the key technologies influencing the development of future armed drones and looks at current initiatives which are under way in the UK to marry developments in autonomy – the ability of a machine to operate with limited, or even no, human control – with military drone technology. The report maps out the agencies, laboratories, and contractors undertaking research into drones and autonomous weapon technology in support of the Ministry of Defence, examines the risks arising from the weaponisation of such technologies, and assesses government policy in this area. Read more