Future Wars monitors and scrutinises the development and use of novel military technologies, particularly by the UK government. We want to ensure that the development and use of such technology always complies with international humanitarian law and international human rights law, and with clear ethical principles that respect human life.
New technology is often a spur for social change, offering tremendous possibilities. Over recent decades we have seen, for example, how information technology has vastly increased the spread of knowledge and understanding. However, new technologies often find military applications which, although they could in principle enable greater precaution in the application of force, more often provide more powerful means of harm and destruction.
Current innovations in artificial intelligence, robotics, autonomous systems, and biotechnology are expected to bring social transformations on an unprecedented scale. However, these technologies are also being used in the military and security realms in ways that are not yet fully understood by the public. The capabilities they provide will directly and indirectly affect global peace and security, the nature of armed conflict, and how insecurity is managed. Scrutinising these developments and explaining them in an accessible way to decision makers and the general public is crucial to preventing increased humanitarian harm during armed conflict.
‘Future Wars’ will critically examine the development of new military technologies such as hypersonic weapons, directed energy weapons, human enhancement and autonomous weapons. Our research investigates the humanitarian impact of these emerging military technologies as well as their likely impact on peace and security.
Future Wars Briefings
1. Speed Kills: The Growing Threat from Hypersonic Weapons
Behind the scenes, arms companies and military powers are quietly developing a new class of weapon system that uses speed to project deadly force. By travelling at extreme speeds, hypersonic weapons can strike targets anywhere in the world within a very short period of time.
While these weapons are mostly at the development stage, once deployed they could introduce great instability and threaten global peace and security, particularly at times of crisis. A nation under attack would be unable to tell where a hypersonic missile is going, or whether it carries a nuclear warhead, creating a significant risk of misunderstanding and escalation. The speed of hypersonic weapons would dangerously narrow the time available for working out the nature of an attack and making a reasoned decision on how to respond, and would create ‘use it or lose it’ pressure on nations to strike first.
This briefing, the first in a series published by Drone Wars UK as part of our ‘Future Wars’ project, examines the development of hypersonic weapons, the UK’s involvement, and the risks they pose to peace and security.
2. None Too Clever? Military Applications of Artificial Intelligence
Artificial Intelligence (AI), automated decision making, and autonomous technologies have already become common in everyday life and offer immense opportunities to dramatically improve society. Smartphones, internet search engines, AI personal assistants, and self-driving cars are among the many products and services that rely on AI to function. However, like all technologies, AI also poses risks if it is poorly understood, unregulated, or used in inappropriate or dangerous ways. As well as transforming homes and businesses, AI is seen by the world’s military powers as a way to revolutionise warfare and gain an advantage over enemies. Military applications of AI have entered everyday use over the past couple of decades, and new systems with worrying characteristics are rapidly being rolled out.
This briefing is part of our ‘Future Wars’ project. It is an abridged version of a longer report; both examine the military applications of AI and describe how the UK’s military is beginning to adopt AI technologies, before going on to outline the various risks associated with them. The full version is available here.