Loitering munitions are now hitting the headlines as a result of their use in the Ukraine war. Vivid descriptions of ‘kamikaze drones’ and ‘suicide drones’ outline the way in which these weapons operate: they find targets and fly towards them before crashing into them and exploding. Both Russia and Ukraine are deploying loitering munitions, which allow soldiers to fire on targets such as tanks and heavy armour without the predictability of a mortar or artillery round fired on a set trajectory. Under some circumstances these ‘fire and forget’ weapons may be able to operate with a high degree of autonomy. For example, they can be programmed to fly around autonomously in a defined search area and highlight possible targets, such as tanks, to the operator. In these circumstances they can act independently of human control. This trend towards increasing autonomy in weapon systems raises questions about how such weapons might shape the future of warfare and the morality of their use.
Loitering munitions such as these have previously been used to military effect in Syria and the 2020 Nagorno-Karabakh war. Although they are often described as drones, they are in many ways more like a smart missile than an uncrewed aircraft. Loitering munitions were first developed in the 1980s and can be thought of as a ‘halfway house’ between drones and cruise missiles. They differ from drones in that they are expendable, and unlike cruise missiles, they have the ability to loiter passively in the target area and search for a target. Potential targets are identified using radar, thermal imaging, or visual sensor data and, to date, a human operator selects the target and executes the command to destroy it. They are disposable, one-time use weapons intended to hunt for a target and then destroy it, hence their tag as ‘kamikaze’ weapons. Dominic Cummings, former chief advisor to the UK Prime Minister, describes a loitering munition as a “drone version of the AK-47: a cheap anonymous suicide drone that flies to the target and blows itself up – it’s so cheap you don’t care”.
There has been speculation among open-source analysts and in the media that loitering munitions operating in autonomous modes may have been used as lethal autonomous weapons on the battlefield in Ukraine. For example, based on an image originally posted on Telegram, an article in the Bulletin of the Atomic Scientists has claimed that “Russia may have used a killer robot in Ukraine”, and Wired magazine has published a story headlined “Russia’s Killer Drone in Ukraine Raises Fears About AI in Warfare”. The Telegram image apparently shows a damaged Russian-made KYB-BLA loitering munition, manufactured by ZALA Aero – a subsidiary of Kalashnikov Group – which is said to have been photographed in the Podil neighbourhood of Kyiv and uploaded to Telegram on 12 March. KYB-BLA loitering munitions may also have been used in conflict in Idlib in north-west Syria in 2019. According to ZALA Aero the drone has an artificial intelligence (AI)-driven object detection and recognition system which is able to perform “real-time recognition and classification of detected objects”.
However, this does not necessarily mean that Russian forces have used ‘killer robots’ in combat. Public domain knowledge about the autonomous capabilities of weapons such as the KYB-BLA is vague and based on manufacturer’s claims which are almost certainly exaggerated. The weapon can be operated in autonomous mode or under human control, and as the Bulletin of the Atomic Scientists accepts, there is as yet no evidence to indicate whether the crashed airframe photographed in Kyiv was operated in an autonomous mode. Writing for the Center for Strategic and International Studies, Gregory C. Allen has pointed out that ZALA Aero’s artificial intelligence visual identification system has been developed for use in the industrial and agricultural sectors rather than for military applications, and that there are formidable challenges in ‘training’ it to identify military targets. ZALA Aero’s vague claim that target coordinates can be acquired from sensor payload targeting imagery merely restates how many other precision-guided munitions and loitering munitions work, including ones that do not use advanced AI capabilities.
There have also been claims that loitering munitions, operating autonomously, may have been used in combat in Libya. These are based on a United Nations Expert Panel on Libya report to the UN Security Council which stated that a Turkish-manufactured Kargu-2 drone had “hunted down and remotely engaged” Libyan National Army logistics convoys and retreating forces during conflict in 2020. The report states the munitions “were programmed to attack targets without requiring data connectivity between the operator and the munition.” Again, this appears to be based on manufacturer’s claims about the capability of their systems rather than clear evidence that the munition is able to operate autonomously. The Turkish government has denied that the Kargu-2 was used autonomously, although it has stated that the weapon does have the capability to operate autonomously.
While these two examples show that media reports of lethal autonomous systems being used in conflict need to be treated with caution, it is also clear that we are beginning to see the deployment of a new generation of weapon systems with decreasing levels of human control. Artificial intelligence, combined with the use of sophisticated sensors, is allowing increasing levels of autonomy to be assigned to complex weapon systems. At present a human operator approves an attack using these weapons, but the requirement for human approval could easily be removed with minor technical upgrades to the system.
While Russian ‘killer robots’ have seized the headlines, both warring parties in the Russia–Ukraine conflict are using loitering munitions. Ukraine appears to be using the Polish-manufactured Warmate drone, which can operate conventionally as a surveillance drone or be equipped with explosives to become a loitering munition.
Manufacturer video showing how its WARMATE loitering munition works
The US has also sent Ukraine more than 700 AeroVironment Switchblade drones – tube-launched weapons weighing around 3 kg that can be carried by a single soldier and can loiter for up to 15 minutes before hitting a target up to 10 kilometres away. The US has also promised to send Puma reconnaissance drones, counter-drone systems, and naval drones to Ukraine, as well as the ‘Phoenix Ghost Tactical Unmanned Aerial System’. Information about the Phoenix Ghost drone has not been released to the public, but it is reported to be a loitering munition which requires minimal training to operate effectively. The drone was developed by California-based Aevex Aerospace “for a set of requirements that very closely match what Ukrainians need right now in the Donbas,” according to Pentagon spokesperson John Kirby. A total of 121 Phoenix Ghost drones are to be provided to Ukraine; they reportedly have similar capabilities to the Switchblade, with a longer loitering capability and infra-red sensors.
Military aid provided by the UK government to Ukraine also includes loitering munitions, although at the time of writing no information had been published about the types of munitions or their capabilities.
Loitering munitions and other military systems based around artificial intelligence are becoming more widely used in conflict. Stanford University’s annual AI Index report for 2022 highlights some of the trends which are facilitating the development of increasingly autonomous weapon systems. Firstly, research into AI and robotics is accelerating and becoming more accessible. The number of AI patents filed has soared and is now more than 30 times higher globally than it was in 2015. At the same time AI’s adoption in society, industry, government and business is also accelerating rapidly. AI is also becoming more affordable and higher performing. The cost of training an image classification system has decreased by 63.6% and training times have improved by 94.4% since 2018.
What can we conclude from the use of loitering munitions and AI-based systems in the Ukraine war and other recent conflicts? Importantly, their use does not mean that autonomous weapon systems outside human control are now routinely deployed in warfare, nor that the use of such systems in conflict is inevitable. Neither does it mean that the Russians are taking the lead in developing AI-based weapons, or that the West needs to develop equivalent weapons of its own to ‘keep up with’ the Russians, Chinese, or any other military rivals.
However, it does mean that modern weapon systems are becoming increasingly autonomous in their capabilities, and this poses risks. It means that action is urgently needed to introduce arms control measures on autonomous weapons systems, including a ban on systems which use target profiles that represent people or cannot be meaningfully controlled by humans.
Although we are unlikely to see nightmare visions of swarms of ‘killer robots’ realized in the Ukraine war, weapons with some degree of autonomy are being deployed by both parties to the conflict. This is creating pressure towards normalising the development of autonomous military systems and AI-based weapons and bringing the development of weapon systems that operate outside human control a step closer. There is only a limited amount of time available to call a halt to the slide towards the development of ‘killer robots’, and action is needed now.