Soldiers who see in the dark, communicate telepathically, or fly a drone by thought alone all sound like characters from a science fiction film. Yet research projects investigating all of these possibilities are under way in laboratories and research centres around the globe, part of an upsurge of interest in human enhancement enabled largely by expanding knowledge in the field of neuroscience: the study of the human brain and nervous system.
To help readers understand the possibilities and hazards posed by human enhancement technology, Drone Wars UK is publishing ‘Cyborg Dawn?’, a new study investigating the military use of human augmentation.
Human enhancement – a medical or biological intervention to the body designed to improve performance, appearance, or capability beyond what is necessary to achieve, sustain, or restore health – may lead to fundamentally new concepts of warfare, and can be expected to play a role in enabling the increased use of remotely operated and uncrewed systems in war.
Although military planners are eager to create ‘super soldiers’, the idea of artificially modifying humans to give them capabilities beyond their natural abilities presents significant moral, legal, and health risks. The difficulties are compounded by the dual-use nature of human augmentation, where applications with legitimate medical uses could equally be used to further the use of remote lethal military force. There is currently considerable discussion about the dangers of ‘killer robot’ autonomous weapon systems, but it is also time to start discussing how to control the human enhancement and cyborg technologies which military planners intend to develop.
Although much of the published literature about human enhancement programmes relates to work undertaken in the US, the UK government is also taking an interest in the field. Human enhancement is identified as a key technology area in the Defence Technology Framework, which lists mechanical aids (such as wearable or implant-assisted exoskeletons and robotic prostheses), synthetic biology and gene editing, human-machine interfaces, and transcranial stimulation as areas of interest. The Defence Science and Technology Laboratory has undertaken research projects to investigate military applications of customised nutrition, modification of the gut biome, and machine-vision-enabled multi-sensory augmented reality. In 2021 the Ministry of Defence’s Development, Concepts and Doctrine Centre (DCDC), along with the German Bundeswehr Office for Defence Planning, published ‘Human Augmentation – The Dawn of a New Paradigm’ – a guidance note intended to “set the foundation for more detailed research and development on human augmentation”, which urges greater investigation and investment in military applications in this field.
Over recent years the US military has had an active programme of research into human augmentation technologies, led by the Defense Advanced Research Projects Agency (DARPA). One area of interest is wearable exoskeletons – machines worn over the human body to provide structural support and increase strength and endurance. The US Air Force has recently tested an exoskeleton designed to help porters load cargo onto aircraft and handle heavier loads with less risk of injury and fatigue, and the US Army has issued a request for information from military contractors to identify potential suppliers of a wearable exoskeleton to enhance soldier performance, with the aim of eventually building a full ‘warrior suit’.
Pharmacological enhancement – using natural or synthetic chemicals to improve cognitive function or to promote relaxation – is another area of research. Potential innovations include the development of ‘smart drugs’ targeted to improve specific aspects of physical and cognitive performance. Researchers are also investigating the potential for using physical, rather than chemical, methods to influence the brain and central nervous system, for instance by using electromagnetic radiation. DARPA is keen to develop non-invasive brain-machine interfaces: the ultimate goal is a brain-computer interface that can be put on and taken off like a helmet or headset, with no surgery required. Military scientists hope that the speed of human-machine interaction can eventually be improved by merging the human and the machine, using machinery controlled directly by the mind through a brain-computer interface to accelerate performance. DARPA research programmes into military applications of brain-computer implants have allowed people to control prosthetic limbs and direct basic flight manoeuvres in aircraft.
A ‘Cyborg Soldier 2050’ study prepared by the US Army identified four specific uses of cyborg human augmentation technology which the authors consider will be technically feasible by 2050 or earlier. These are ocular enhancements to imaging, sight, and situational awareness; restoration and programmed muscular control through an optogenetic bodysuit sensor web; auditory enhancement for communication and ear protection; and direct neural enhancement of the human brain for two-way data transfer. The study concluded that the development of such technologies would “fundamentally alter the battlefield by the year 2050”.
Despite the active research programmes in this field, the ethical implications of human enhancement are profound, raising far-reaching questions about where the boundaries of the acceptable lie, about the relationship between humans and machines, and about the very nature of what it is to be human. Regulation is complicated by the wide range of technologies involved in human augmentation, but a good place to start is controlling uses of neurotechnologies, which probably pose the greatest security risks. The simplest option would be to extend the terms of the Biological Weapons Convention to cover neurotechnologies; another possibility would be to commence negotiations on a new neurosciences treaty. Scientists and academic institutions also have professional responsibilities to ensure that their work is not misused for hostile purposes, and an approach for improved governance in the neurosciences, based around the principles of responsibility, precaution, and engagement, has been advocated by the Royal Society.
The justifications for taking military human augmentation forward are as familiar as they are tired – ‘if we don’t do it, others will’; ‘technological change is inevitable’; ‘we need to maintain our competitive advantage’. Advocates of the technology argue that we are already using a human augmentation technique when vaccinating against an infectious disease, and that research in the field will lead to medical and humanitarian advances. However, medical applications can easily be weaponised – by rogue actors as well as states – and it is not clear how this can be prevented from happening. The field of human augmentation is fraught with danger, and without stringent regulation neurotechnologies and genetic modification look set to lead us to a dystopian future where technology reflects and encourages the worst aspects of human nature through warfighting.