Online meeting 29th November, 7pm: ‘Cyborg Dawn? The military use of human augmentation’

Soldiers who see in the dark, communicate telepathically, or fly a drone by thought alone all sound like characters from a science fiction film. Yet research projects investigating all these possibilities are underway in laboratories and research centres around the globe as part of an upsurge of interest in the possibilities of human enhancement enabled largely by expanding knowledge in the field of neuroscience: the study of the human brain and nervous system.

Drone Wars UK and Scientists for Global Responsibility (SGR) are holding this online event to mark the publication of ‘Cyborg Dawn?’, a new study investigating the military use of human augmentation, in order to increase understanding of the possibilities and hazards posed by human enhancement technology.

Speakers:

  • Peter Burt: Peter, a long-time researcher and campaigner on peace and human rights issues, authored the ‘Cyborg Dawn?’ report. At Drone Wars UK he primarily works on issues relating to artificial intelligence and autonomy and their role in the future development of drones. Peter is also a Trustee of the Nuclear Education Trust.
  • Ben Taylor-Green: Ben was awarded his DPhil from the University of Oxford in early 2023. His doctoral thesis, Empathic Predators: On the Affects and Optics of Brain-Computer Interface Unmanned Aerial Vehicle Research, is a pioneering philosophical-anthropological inquiry into the dual-use problem in international brain-computer interface (BCI) research.
  • Helen Close (Chair): Helen, a member of Drone Wars UK Steering Committee, is a Research Associate at the Omega Research Foundation, an NGO that researches the manufacture, trade in, and use of conventional arms and law enforcement equipment. She has worked at Omega since 2009 on a number of issues, including researching the manufacture of specific weapons of concern. Helen is a trustee of the Trust for Research and Education on the Arms Trade.

To attend this online event, register here.

Click to view report

The UK, accountability for civilian harm, and autonomous weapon systems

Second evidence session. Click to watch video

The second public session of the House of Lords inquiry into artificial intelligence (AI) in weapon systems took place at the end of March.  The session examined how the development and deployment of autonomous weapons might impact upon the UK’s foreign policy and its position on the global stage, and heard evidence from Yasmin Afina, Research Associate at Chatham House, Vincent Boulanin, Director of Governance of Artificial Intelligence at the Stockholm International Peace Research Institute, and Charles Ovink, Political Affairs Officer at the United Nations Office for Disarmament Affairs.

Among the wide range of issues covered in the two-hour session was the question of who could be held accountable if human rights abuses were committed by a weapon system acting autonomously.  A revealing exchange took place between Lord Houghton, a former Chief of Defence Staff (the most senior officer of the UK’s armed forces), and Charles Ovink.  Houghton asked whether it might be possible for an autonomous weapon system to comply with the laws of war under certain circumstances (at 11.11 in the video of the session):

“If that fully autonomous system has been tested and approved in such a way that it doesn’t rely on a black box technology, that constant evaluation has proved that the risk of it non-complying with the parameters of international humanitarian law are accepted, that then there is a delegation effectively from a human to a machine, why is that not then compliant, or why would you say that that should be prohibited?”

This is, of course, a highly loaded question that assumes that a variety of improbable circumstances would apply, and then presents a best-case scenario as the norm.  Ovink carefully pointed out that any decision on whether such a system should be prohibited would be for United Nations member states to make, but that the question posed ‘a big if’: it was not clear what kind of test environment could mimic a real-life warzone with civilians present and guarantee that the laws of war would be followed.  Even if it could, there would still need to be a human accountable for any civilian deaths that might occur.  Read more

Lords Committee on AI in Weapons Systems: AI harms, humans vs computers, and unethical Russians

First evidence session. Click to watch video

A special investigation set up by the House of Lords is now taking evidence on the development, use and regulation of artificial intelligence (AI) in weapon systems.  Chaired by crossbench peer Lord Lisvane, a former Clerk of the House of Commons, the stand-alone Select Committee is considering the utility and risks arising from military uses of AI.

The committee is seeking written evidence from members of the public and interested parties, and recently conducted the first of its oral evidence sessions.  Three specialists in international law, Noam Lubell of the University of Essex, Georgia Hinds, Legal Advisor at the International Committee of the Red Cross (ICRC), and Daragh Murray of Queen Mary University of London, answered a variety of questions about whether autonomous weapon systems might be able to comply with international law and how they could be controlled at the international level.

One of the more interesting issues raised during the discussion was the point that, regardless of military uses, AI has the potential to wreak a broad range of harms across society, and there is a need to address this concern rather than racing on blindly with the development and roll-out of ever more powerful AI systems.  This is a matter which is beginning to attract wider attention.  Last month the Future of Life Institute published an open letter calling for all AI labs to immediately pause the training of AI systems more powerful than GPT-4 for at least six months.  Over 30,000 researchers and tech sector workers have signed the letter to date, including Stuart Russell, Steve Wozniak, Elon Musk, and Yuval Noah Harari.

Leaving aside whether six months could be long enough to resolve issues around AI safety, there is an important question to be answered here.  There are already numerous examples of cases where existing computerised and AI systems have caused harm, regardless of what the future might hold.  Why, then, are we racing forward in this field?  Has the combination of tech multinationals and unrestrained capitalism become such an unstoppable juggernaut that humanity is literally no longer able to control where the forces we have created are taking us?  If not, then why won’t governments intervene to put the brakes on the development and use of AI, and what interests are they actually working to protect?  This is unlikely to be a line of inquiry the Lords Committee will be pursuing.  Read more

Latest update shows UK drones spreading across air, land and sea

We’ve updated our directory of current UK aerial drones and drone development programmes and wanted to highlight that, while drones have been mainly the preserve of the Air Force, they are now increasingly being acquired and used by the British Army and the Royal Navy.  Meanwhile, although the MoD is keen to point to the imminent arrival of its new armed drone, which it has dubbed ‘The Protector’, problems lie ahead.

Protector problems ahead

The replacement for the UK’s Reaper drone – dubbed ‘the Protector’ by the UK but called SkyGuardian by the manufacturer (and everyone else, really) – is supposed to be in service by mid-2024.  While the first aircraft from the production line has been delivered to the RAF, it remains in the US for ongoing testing and training.  However, two significant problems need to be addressed over the next 18 months before these drones become operational.

Firstly, recruitment and retention of personnel to operate the drones has been an ongoing problem, as Sir Stephen Lovegrove, then MoD permanent secretary, told the Commons public accounts committee in 2020.  This is likely to be even more of an issue now that crews will be based permanently in Lincoln rather than having the option of being deployed to the sunnier climes of Las Vegas, after the UK shut down its US-based drone operations.

General Atomics promotional graphic visualising Protector flying over London

The RAF partly overcame recruitment issues by drafting in Royal Australian Air Force (RAAF) pilots.  As the RAAF was set to purchase SkyGuardian drones, it made sense to send pilots to operate the UK’s armed drones, giving them training and experience with these systems before Australia’s own aircraft arrived.  However, in April 2022 Australia abruptly cancelled its planned purchase of SkyGuardian drones due to budget pressures following the setting up of the AUKUS alliance and the plan to build new nuclear submarines.  Given this, it seems likely the RAAF will not be keen to provide personnel for the UK’s drone programme for much longer. Read more

Public consultation on allowing armed drones to fly from RAF Waddington opened – have your say!

Above us only… drones?

The Civil Aviation Authority (CAA) has formally opened a public consultation on the Ministry of Defence (MoD) proposal to change airspace regulations around RAF Waddington to allow armed Protector drones to operate from the base from 2023. In short, these changes will put in place a ‘danger area’ around Waddington to allow the drones to take off and land.

Currently the UK’s fleet of armed Reaper drones is not permitted to fly within the UK as the aircraft were not built to appropriate standards.  However, the MoD argues that its new drone – called SkyGuardian by the manufacturer but labelled ‘Protector’ by the MoD – has been built to stricter construction standards that should allow it to be certified to fly within UK airspace. Separate from the construction issue is the very significant question of whether large drones (military or otherwise) can fly safely in airspace alongside other aircraft. Drone advocates argue this can be done through using electronic ‘Detect and Avoid’ (DAA) equipment, but this is as yet largely untried and untested.

Map of potentially affected area from CAA website

While this consultation is therefore limited, in that it focuses only on specific airspace changes around Waddington rather than wider questions about the safety of opening UK airspace to large drones, we would urge those concerned about these developments to respond via the dedicated webpage.  All members of the public are invited to respond, and it should only take a few minutes.  The consultation is open until 30 November.  Read more

MoD to hold ‘duel of drones’ to choose new armed unmanned system

Artist’s conception of Loyal Wingman drones

The Ministry of Defence (MoD) will launch a series of competitions this autumn to progress the selection of an armed loyal wingman drone, culminating in a duel between the two finalists – “an operational fly-off”, as Sir Mike Wigston, Chief of the Air Staff, described it.  The initiative comes after the abrupt cancellation earlier this summer of Project Mosquito, which was to develop a loyal wingman technology demonstrator for the RAF but was judged unable to deliver an operational drone within the desired timeframe.  The RAF’s Rapid Capabilities Office (RCO) will run the new process, open to both UK and international industry and aimed at acquiring a “Mosquito type autonomous combat vehicle”.

Loyal Wingman

The concept of loyal wingman drones is for one or more to fly alongside, or in the vicinity of, a piloted military aircraft – currently, for the UK, Typhoon and F-35, but in the future Tempest – with the drones carrying out specific tasks such as surveillance, electronic warfare (e.g. radar jamming), laser-guiding weapons onto targets, or air-to-air and air-to-ground strikes.  Rather than being directly controlled by an individual pilot on the ground, as the UK’s current fleet of Reaper drones is, these drones fly autonomously, sharing data and information with commanders on the ground via the main aircraft.

In addition, loyal wingman drones are supposed to be cheap enough that they can be either entirely expendable or ‘attritable’ (that is, not quite expendable, but cheap enough that it is not a significant event if one is shot down or crashes).  However, Aviation International News, which spoke to an RCO insider, said that the focus would now centre on exploring a drone that fits somewhere between Category 1 (expendable airframes) and Category 2 (attritable airframes). According to the source, there is also a Category 3, survivable, indicating a larger airframe with stealth and other advanced technology, and no doubt a much higher price.

Which drones will take part in the ‘fly-off’, and which will come out on top as the UK’s loyal wingman, is hard to predict, not least because the MoD’s criteria do not yet appear to be fixed.  However, a few likely competitors are already emerging:  Read more