Online Event – 25 March: Meaning-less human control: Lessons from air defence systems for LAWS

Together with the Centre for War Studies at the University of Southern Denmark, we are hosting an online event on Thursday 25 March at 2pm (GMT) to discuss our co-published report, Meaning-less human control: Lessons from air defence systems for Lethal Autonomous Weapon Systems (LAWS).

In recent years, autonomous weapons systems have increasingly come to the attention of the international community. Debates on these weapons systems centre on whether they reduce meaningful human control over the use of force. This event will discuss our latest report with an expert panel:

  • Dr Ingvild Bode (Associate Professor of International Relations: Centre for War Studies, University of Southern Denmark)
  • Maaike Verbruggen TBC (Doctoral Researcher: International Security, Institute for European Studies)
  • Richard Moyes (Managing Director: Article 36)
  • Dr Peter Burt, Chair (Researcher: Drone Wars UK)

Click here to register for the event and for further details.

Meaning-less human control: Lessons from air defence systems for lethal autonomous weapons

Click to open report

A new report co-published today by Drone Wars UK and the Centre for War Studies, University of Southern Denmark, examines the lessons to be learned from the diminishing human control of air defence systems for the debate about lethal autonomous weapons systems (LAWS) – ‘Killer Robots’ as they are colloquially called.

In an autonomous weapons system, autonomous capabilities are integrated into critical functions that relate to the selection and engagement of targets without direct human intervention. Subject expert Professor Noel Sharkey suggests that a Lethal Autonomous Weapon System can be defined as “systems that, once activated, can track, identify and attack targets with violent force without further human intervention”. Examples of such systems include BAE Systems’ Taranis drone, stationary sentries such as the Samsung Techwin SGR-A1, and ground vehicles such as the Kalashnikov Concern Uran-9.

Air defence systems are an important area of study in relation to the development of LAWS as they are already in operation and, while not completely autonomous due to having a human operator in control, they have automated and increasingly autonomous features. Vincent Boulanin and Maaike Verbruggen’s study for the Stockholm International Peace Research Institute (SIPRI) estimates that 89 states operate air defence systems. These include global military powers such as the US, the UK, France, Russia, and China, but also regional powers such as Brazil, India, and Japan. Read more

Humans First: A Manifesto for the Age of Robotics. A review of Frank Pasquale’s ‘New Laws of Robotics’

In 2018, the hashtag #ThankGodIGraduatedAlready began trending on China’s Weibo social media platform. The tag reflected concerns among Chinese students that schools had begun to install the ‘Class Care System’, developed by the Chinese technology company Hanwang. Cameras monitor pupils’ facial expressions, with deep learning algorithms identifying each student and then classifying their behaviour into various categories – “focused”, “listening”, “writing”, “answering questions”, “distracted”, or “sleeping”. Even in a country where mass surveillance is common, students reacted with outrage.

There are many technological, legal, and ethical barriers to overcome before machine learning can be widely deployed in such ways, but China, in its push to overtake the US as the world’s leader in artificial intelligence (AI), is racing ahead to introduce such technology before addressing these concerns. And China is not the only culprit.

Frank Pasquale’s book ‘The New Laws of Robotics: Defending Human Expertise in the Age of AI’ investigates the rapidly advancing use of AI and intelligent machines in an era of automation, and uses a wide range of examples – among which the ‘Class Care System’ is far from the most sinister – to highlight the threats that the rush to robotics poses for human societies. In a world dominated by corporations and governments with a disposition for centralising control, the adoption of AI is being driven by the dictates of neoliberal capitalism, with the twin aims of increasing profit for the private sector and cutting costs in the public sector. Read more

UK Campaign to Stop Killer Robots writes to Defence Secretary on the UK’s approach to LAWS

Guardian report of Gen Sir Nick Carter’s comments on UK’s increasing use of autonomous and remotely controlled machines.

As members of the UK Campaign to Stop Killer Robots, Drone Wars and a number of other UK civil society groups have written to the Secretary of State, Ben Wallace, about the UK’s position on the development of Lethal Autonomous Weapon Systems, partly in response to recent comments by the Chief of the Defence Staff.

Dear Secretary of State,

We are writing on behalf of the UK Campaign to Stop Killer Robots, in advance of the next meeting of the Group of Governmental Experts (GGE) on ‘Lethal Autonomous Weapons Systems’ (LAWS) at the Convention on Certain Conventional Weapons (CCW), as well as the CCW’s meeting of High Contracting Parties. We welcome the UK government’s recognition in the CCW that discussing human control is central to successful international work to address increasing ‘autonomy’ in weapons systems, and that this is an area in which meaningful progress can be made.[1] Read more

XLUUVs, Swarms, and STARTLE: New developments in the UK’s military autonomous systems

Behind the scenes, the UK is developing a range of military autonomous systems. Image: Crown Copyright

In November 2018 Drone Wars UK published ‘Off The Leash’, an in-depth research report outlining how the Ministry of Defence (MoD) was actively supporting research into technology to support the development of armed autonomous drones, despite the government’s public claims that it “does not possess fully autonomous weapons and has no intention of developing them”. This article provides an update on developments which have taken place in this field since our report was published, looking both at specific technology projects and at developments in the UK’s policy position on Lethal Autonomous Weapons Systems (LAWS). Read more

US Reaper drones test Agile Condor: Another step closer to ‘Killer Robots’

General Atomics Aeronautical Systems, manufacturer of the Reaper drone, has recently been awarded a US Air Force contract to demonstrate the ‘Agile Condor’ artificial intelligence system with the MQ-9 Reaper drone. According to General Atomics President David R. Alexander:

“The Agile Condor project will further enhance RPA [remotely piloted aircraft] effectiveness by specifically allowing a MQ-9 to surveil a large area of operations, autonomously identify pre-defined targets of interest and transmit their locations.”

This type of capability represents a tangible step towards the development of autonomous weaponised drones able to operate without human input – flying killer robots, in other words. The step from identifying targets without a human decision to destroying those targets is a very small one, and could be achieved with existing technology. Read more