Future Wars monitors and scrutinises the development and use of novel military technologies, particularly by the UK government. We want to ensure that the development and use of such technology always complies with international humanitarian law and international human rights law, and with clear ethical principles that respect human life.
New technology is often a spur for social change, offering tremendous possibilities. Over recent decades we have seen, for example, how information technology has vastly increased the spread of knowledge and understanding. However, new technologies often find military applications which, while they could in principle increase precaution in the application of force, more often provide more powerful capabilities for harm and destruction.
Current innovations in artificial intelligence, robotics, autonomous systems, and biotechnology are expected to bring social transformations on an unprecedented scale. However, these technologies are also being used in the military and security realms in ways which are not yet fully understood by the public. The capabilities they provide will directly and indirectly affect global peace and security, the nature of armed conflicts, and how insecurity is managed. Scrutinising these developments and explaining them in an accessible way to decision makers and the general public is crucial to preventing increased humanitarian harm during armed conflict.
‘Future Wars’ will critically examine the development of new military technologies such as hypersonic weapons, directed energy weapons, human enhancement and autonomous weapons. Our research investigates the humanitarian impact of these emerging military technologies as well as their likely impact on peace and security.
Future Wars Briefings
1. Speed Kills: The Growing Threat from Hypersonic Weapons

Behind the scenes, arms companies and military powers are quietly developing a new class of weapon system that uses speed to project deadly force. By travelling at extreme speeds, hypersonic weapons can strike targets anywhere in the world within a very short period of time.
While these weapons are mostly at the development stage, once deployed they could introduce great instability and threaten global peace and security, particularly at times of crisis. A nation under attack would be unable to tell where a hypersonic missile was heading, or whether it carried a nuclear warhead, creating a significant risk of misunderstanding and escalation. The speed of hypersonic weapons would dangerously narrow the time available for determining the nature of an attack and making a reasoned decision on how to respond, and would create ‘use it or lose it’ pressure on nations to strike first.
This briefing, the first in a series published by Drone Wars UK as part of our ‘Future Wars’ project, examines the development of hypersonic weapons, the UK’s involvement, and the risks they pose to peace and security.
2. None Too Clever? Military Applications of Artificial Intelligence

Artificial Intelligence (AI), automated decision making, and autonomous technologies have already become common in everyday life and offer immense opportunities to dramatically improve society. Smartphones, internet search engines, AI personal assistants, and self-driving cars are among the many products and services that rely on AI to function. However, like all technologies, AI also poses risks if it is poorly understood, unregulated, or used in inappropriate or dangerous ways. As well as transforming homes and businesses, AI is seen by the world’s military powers as a way to revolutionise warfare and gain an advantage over enemies. Military applications of AI have entered everyday use over the past couple of decades, and new systems with worrying characteristics are rapidly being rolled out.
This briefing is part of our ‘Future Wars’ project. It is an abridged version of a longer report; both examine the military applications of AI and describe how the UK’s military is beginning to adopt AI technologies, before going on to outline the various risks associated with them. The full version is available here.
3. For Heaven’s Sake: Examining the Militarisation of Space

This new abridged briefing, published as part of our Future Wars series, looks at the UK’s emerging military space programme and considers the governance, environmental, and ethical issues involved.
Space-based operations affect many aspects of modern life and commerce. The global economy relies heavily on satellites in orbit to provide communication services for a wide range of applications including mobile phones, the internet, television, and financial trading systems. Global positioning system (GPS) satellites play a key role in transport networks, while earth observation satellites provide information for weather forecasting, climate monitoring, and crop observation. Space is also, unfortunately, a key domain for military operations. Modern military engagements rely heavily on space-based assets. Space systems are used for global command and control; surveillance, intelligence, and reconnaissance; missile warning; and in support of forces deployed overseas. Satellites also provide secure communications links for military and security forces, including the communications needed to fly armed drones remotely. Many precision-guided munitions use information provided by space-based assets to correct their positioning in order to hit a target.
4. Fine Words, Few Assurances: Assessing New MoD Policy on the Military Use of AI
Our latest publication, Fine Words, Few Assurances, analyses the UK’s approach to the ethical issues raised by the use of artificial intelligence (AI) for military purposes in two recent policy documents. The first part of the paper reviews and critiques the Ministry of Defence’s (MoD’s) Defence Artificial Intelligence Strategy, published in June 2022, while the second part considers the UK’s commitment to ‘responsible’ military artificial intelligence capabilities, presented in the document ‘Ambitious, Safe, Responsible’, published alongside the strategy document.
Once the realm of science fiction, the technology needed to build autonomous weapon systems is currently under development in a number of nations, including the United Kingdom. Due to recent advances in unmanned aircraft technology, it is likely that the first autonomous weapons will be drone-based systems.
Drone Wars UK believes that the development and deployment of AI-enabled autonomous weapons would give rise to a number of grave risks, primarily the loss of human values on the battlefield. Giving machines the ability to take life crosses a key ethical and legal Rubicon. Lethal autonomous drones would simply lack human judgment and other qualities that are necessary to make complex ethical choices on a dynamic battlefield, to distinguish adequately between soldiers and civilians, and to evaluate the proportionality of an attack.
The real issue perhaps is not the development of autonomy itself but the way in which this milestone in technological development is controlled and used by humans. Autonomy raises a wide range of ethical, legal, moral and political issues relating to human judgement, intentions, and responsibilities. These questions remain largely unresolved and there should therefore be deep disquiet about the rapid advance towards developing autonomous weapons systems.
5. Cyborg Dawn? The Military Use of Human Augmentation for War Fighting
Drone Wars UK’s latest publication, Cyborg Dawn?, investigates the military use of human augmentation for war fighting.
Human enhancement – a medical or biological intervention to the body designed to improve performance, appearance, or capability beyond what is necessary to achieve, sustain or restore health – may lead to fundamentally new concepts of warfare and can be expected to play a role in enabling the increased use of remotely operated and uncrewed systems in war.
Although military planners are eager to create ‘super soldiers’, the idea of artificially modifying humans to give them capabilities beyond their natural abilities presents significant moral, legal, and health risks. The field of human augmentation is fraught with danger, and without stringent regulation, neurotechnologies and genetic modification will lead us to an increasingly dangerous future where technology encourages and accelerates warfare. The difficulties are compounded by the dual use nature of human augmentation, where applications with legitimate medical uses could equally be used to further the use of remote lethal military force. There is currently considerable discussion about the dangers of ‘killer robot’ autonomous weapon systems, but it is also time to start discussing how to control human enhancement and cyborg technologies which military planners intend to develop.