Cyborg Dawn?  Human-machine fusion and the future of warfighting


Soldiers who see in the dark, communicate telepathically, or fly a drone by thought alone all sound like characters from a science fiction film.  Yet research projects investigating all these possibilities are under way in laboratories and research centres around the globe, part of an upsurge of interest in the possibilities of human enhancement enabled largely by expanding knowledge in the field of neuroscience: the study of the human brain and nervous system.

To help readers understand the possibilities and hazards posed by human enhancement technology, Drone Wars UK is publishing ‘Cyborg Dawn?’, a new study investigating the military use of human augmentation.

Human enhancement –  a medical or biological intervention to the body designed to improve performance, appearance, or capability beyond what is necessary to achieve, sustain or restore health – may lead to fundamentally new concepts of warfare and can be expected to play a role in enabling the increased use of remotely operated and uncrewed systems in war.

Although military planners are eager to create ‘super soldiers’, the idea of artificially modifying humans to give them capabilities beyond their natural abilities presents significant moral, legal, and health risks.  The field of human augmentation is fraught with danger, and without stringent regulation, neurotechnologies and genetic modification will lead us to an increasingly dangerous future where technology encourages and accelerates warfare.  The difficulties are compounded by the dual use nature of human augmentation, where applications with legitimate medical uses could equally be used to further the use of remote lethal military force.  There is currently considerable discussion about the dangers of ‘killer robot’ autonomous weapon systems, but it is also time to start discussing how to control human enhancement and cyborg technologies which military planners intend to develop.  Read more

New trials of AI-controlled drones show push towards ‘killer robots’ as Lords announces special inquiry

General Atomics Avenger controlled by AI in trial

Two recently announced trials of AI-controlled drones dramatically demonstrate the urgent need to develop international controls over the development and use of lethal autonomous weapon systems, known as ‘killer robots’.

In early January, the UK Ministry of Defence (MoD) announced that a joint UK-US AI taskforce had undertaken a trial of its ‘AI toolbox’ during an exercise on Salisbury Plain in December 2022.  The trial saw a number of Blue Bear’s Ghost drones controlled by AI which was updated during flight.  The experiments, said the MoD, “demonstrated that UK-US developed algorithms from the AI Toolbox could be deployed onto a swarm of UK UAVs and retrained by the joint AI Taskforce at the ground station and the model updated in flight, a first for the UK.”  The trials were undertaken as part of the ongoing US-UK Autonomy and Artificial Intelligence Collaboration (AAIC) Partnership Agreement.  The MoD has refused to give MPs sight of the agreement.

Two weeks later, US drone manufacturer General Atomics announced that it had conducted flight trials on 14 December 2022 where an AI had controlled one of its large Avenger drones from the company’s own flight operations facility in El Mirage, California.

Blue Bear Ghost drones in AI trial on Salisbury Plain

General Atomics said in its press release that the AI “successfully navigated the live plane while dynamically avoiding threats to accomplish its mission.”  Subsequently, AI was used to control both the drone and a ‘virtual’ drone at the same time in order to “collaboratively chase a target while avoiding threats,” said the company.  In the final trial, the AI “used sensor information to select courses of action based on its understanding of the world state.”  According to the company, this demonstrated “the AI pilot’s ability to successfully process and act on live real-time information independently of a human operator to make mission-critical decisions at the speed of relevance.”

Drone Wars UK has long warned that, despite denials from governments on the development of killer robots, behind the scenes corporations and militaries are pressing ahead with the testing, trialling and development of technology to create such systems.  As we forecast in our 2018 report ‘Off the Leash’, armed drones are the gateway to the development of lethal autonomous systems.  While these particular trials will not lead directly to the deployment of lethal autonomous systems, byte-by-byte the building blocks are being put in place.

House of Lords Special Committee

Due to continuing developments in this area we were pleased to learn that the House of Lords voted to accept Lord Clement-Jones’ proposal for a year-long inquiry by a special committee to investigate the use of artificial intelligence in weapon systems.  We will monitor the work of the Committee throughout the year but for now here is the accepted proposal in full:  Read more

Future War: The Shape of Things to Come

A day conference of workshops, discussion and debate on the impact new technologies
will have on future conflicts – and the challenges facing peace activists.

While terrible wars currently rage in Ukraine, Yemen, Ethiopia and elsewhere, preparations for future wars using new technologies are also under way.

New technology can be a spur for great social change, offering tremendous possibilities.  However, innovations in artificial intelligence, robotics, autonomous systems and biotechnology are also being used in the military and security realms in ways which will directly and indirectly affect global peace and security.  Scrutinising these developments, and building peaceful ways to resolve political conflicts that do not threaten people or the environment, is crucial.

This open public conference, organised by Drone Wars and CND, will bring together expert speakers and campaigners to discuss these developments and debate how we can work together to challenge wars today and in the future.

Book your free tickets here 

Supported by Scientists for Global Responsibility, UK Campaign to Stop Killer Robots, Peace News and others.  Read more

Webinar: ‘For Heaven’s Sake: Examining the UK’s Militarisation of Space’


Tuesday 23rd August 2022, 7pm.

Drone Wars UK and CND are co-hosting a webinar to examine the UK’s militarisation of space.  The webinar builds on the briefing the organisations co-published in June.

Speakers

Dr Jill Stuart is an academic based at the London School of Economics and Political Science. She is an expert in the politics, ethics and law of outer space exploration and exploitation. She is a frequent presence in the global media on the issue and regularly gives lectures around the world.

Dave Webb is former Chair of CND and a long-time peace campaigner. He has played a leading role in CND’s work on missile defence. He is a member of the Drone Wars Steering Committee and co-author of the new report ‘For Heaven’s Sake: Examining the UK’s Militarisation of Space’.

Bruce Gagnon is founder and Coordinator of the Global Network Against Weapons & Nuclear Power in Space. He is author of numerous articles on the issue as well as a regular speaker at conferences and meetings. He is an active member of Veterans for Peace.

Chair

Dr Kate Hudson is General Secretary of CND. She has held that post since September 2010, having previously been Chair of the campaign since 2003. She is a leading anti-nuclear and anti-war campaigner and author of CND at 60: Britain’s Most Enduring Mass Movement.


Although the UK’s space programme began in 1952, until recently it has had very limited impact. However, as the commercial space sector has expanded and the cost of launches has decreased, the UK government is now treating space as an area of serious interest. Over the past two years we have seen the setting up of UK Space Command, the publication of a Defence Space Strategy outlining how the MoD will “protect the UK’s national interests in space”, and the announcement of a portfolio of programmes for developing space assets and infrastructure. Over the summer of 2022, the MoD plans the first space launch from UK soil.

Concerns include a spiralling space ‘arms race’; the environmental impact both on earth and in space; and the risk of an accident sparking an armed confrontation.

Tickets for the webinar are free and can be booked at the Eventbrite page here.


New briefing: For Heaven’s Sake – Examining the UK’s Militarisation of Space


Drone Wars UK’s new briefing, published in collaboration with the Campaign for Nuclear Disarmament (CND), looks at the UK’s emerging military space programme and considers the governance, environmental, and ethical issues involved.

Space-based operations affect many aspects of modern life and commerce.  The global economy relies heavily on satellites in orbit to provide communication links for a variety of services including mobile phones, the internet, television, and financial trading systems. Global positioning system (GPS) satellites play a key role in transport networks, while earth observation satellites provide information for weather forecasting, climate monitoring, and crop observation.

Space is also, unfortunately, a key domain for military operations. Modern military engagements rely heavily on space-based assets. Space systems are used for command and control globally; surveillance, intelligence and reconnaissance; missile warning; and in support of forces deployed overseas.  Satellites also provide secure communications links for military and security forces, including communications needed to fly armed drones remotely.  Many precision-guided munitions use information provided by space-based assets to correct their positioning in order to hit a target.

The falling cost of launching small satellites is driving a new ‘race for space’, with many commercial and government actors keen to capitalise on the economic and strategic advantages offered by the exploitation of space. However, this is creating the conditions for conflict. Satellite orbits are contested, and space assets are at risk from a variety of natural and artificial hazards and threats, including potential anti-satellite capabilities.  Satellite systems are defenceless and extremely vulnerable, and losing an important satellite could have severe consequences. The loss of a key military or dual-use satellite (such as one used for early warning of missile attack) – through an accident, the impact of debris or a meteorite, technical failure, or a cyber-attack on critical ground-based infrastructure – at a time of international tension could inadvertently lead to a military exchange, with major consequences.  Read more

None too clever? Military applications of artificial intelligence

Drone Wars UK’s latest briefing looks at where and how artificial intelligence is currently being applied in the military context and considers the legal and ethical, operational and strategic risks posed.


Artificial Intelligence (AI), automated decision making, and autonomous technologies have already become common in everyday life and offer immense opportunities to dramatically improve society.  Smartphones, internet search engines, AI personal assistants, and self-driving cars are among the many products and services that rely on AI to function.  However, like all technologies, AI also poses risks if it is poorly understood, unregulated, or used in inappropriate or dangerous ways.

In current AI applications, machines perform a specific task for a specific purpose.  The umbrella term ‘computational methods’ may be a better way of describing such systems, which fall far short of human intelligence but have wider problem-solving capabilities than conventional software.  Hypothetically, AI may eventually be able to perform a range of cognitive functions, respond to a wide variety of input data, and understand and solve any problem that a human brain can.  Although this is a goal of some AI research programmes, it remains a distant prospect.

AI does not operate in isolation, but functions as a ‘backbone’ in a broader system to help the system achieve its purpose.  Users do not ‘buy’ the AI itself; they buy products and services that use AI or upgrade a legacy system with new AI technology.  Autonomous systems, which are machines able to execute a task without human input, rely on artificial intelligence computing systems to interpret information from sensors and then signal actuators, such as motors, pumps, or weapons, to cause an impact on the environment around the machine.  Read more