Online meeting 29th November, 7pm: ‘Cyborg Dawn? The military use of human augmentation’

 

Soldiers who see in the dark, communicate telepathically, or fly a drone by thought alone all sound like characters from a science fiction film. Yet research projects investigating all these possibilities are underway in laboratories and research centres around the globe, part of an upsurge of interest in human enhancement enabled largely by expanding knowledge in the field of neuroscience: the study of the human brain and nervous system.

Drone Wars UK and Scientists for Global Responsibility (SGR) are holding this online event to mark the publication of ‘Cyborg Dawn?’, a new study investigating the military use of human augmentation, in order to increase understanding of the possibilities and hazards posed by human enhancement technology.

Speakers:

  • Peter Burt: Peter, a long-time researcher and campaigner on peace and human rights issues, authored the ‘Cyborg Dawn’ report. At Drone Wars UK he primarily works on issues relating to artificial intelligence and autonomy and their role in the future development of drones. Peter is also a Trustee of the Nuclear Education Trust.
  • Ben Taylor-Green: Ben was awarded his DPhil from the University of Oxford in early 2023. His doctoral thesis, Empathic Predators: On the Affects and Optics of Brain-Computer Interface Unmanned Aerial Vehicle Research, is a pioneering philosophical-anthropological inquiry into the dual-use problem in international brain-computer interface (BCI) research.
  • Helen Close (Chair): Helen, a member of the Drone Wars UK Steering Committee, is a Research Associate at the Omega Research Foundation, an NGO that researches the manufacture, trade and use of conventional arms and law enforcement equipment. She has worked at Omega since 2009 on a number of issues, including researching the manufacture of specific weapons of concern. Helen is a trustee of the Trust for Research and Education on the Arms Trade.

To attend this online event, register here.

Click to view report

 

MoD abruptly stops responding to FoI requests on UK drone operations

MoD goes dark on Reaper operations

After more than a decade of responding to our requests for statistical information about the use of armed drones, the MoD abruptly refused our January 2023 request. At first we thought it might be an administrative error, but it soon became clear that it was a policy decision.

The MoD now argues that information it had previously released for more than a decade, without any suggestion of harm, will be refused because the data relates to bodies dealing with security matters (Section 23) and/or national security (Section 24). Last week, the Information Commissioner finally responded to our appeal, upholding the MoD’s decision on the basis of “confidential submissions” from the MoD. The Information Commissioner’s Office wrote:

“Based on submissions provided to him by the MOD during the course of his investigation, the Commissioner is satisfied that the information sought by questions 1b) to 4 of the request either falls within the scope of the exemption provided by section 23(1) of FOIA or falls within the scope of the exemption provided by section 24(1) of FOIA, and that if the exemption engaged is section 24(1) then the public interest favours maintaining the exemption.

“The Commissioner cannot elaborate on his rationale behind this finding without compromising the content of the withheld information itself or by revealing which of these two exemptions is actually engaged. The Commissioner appreciates that this is likely to prove frustrating to the complainant. However, the Commissioner would like to emphasise that he has carefully scrutinised the MOD’s submissions and that in doing so he has taken into account the complainant’s position that such information has been previously disclosed.”

Transparency thwarted
Summary table of data from FoI requests. Click for full details

Our analysis of the statistical data on UK armed drone operations, gathered via these FoI requests, gave some insight into the UK’s use of its armed drones over the past decade or more. Asking the same set of questions every quarter over several years built up a dataset which enabled a degree of transparency.

For example, when the MoD insisted that its armed drones were primarily used for intelligence gathering – strongly implying that drones were rarely being used to launch strikes – the data showed that for periods of time between a third and a half of UK air strikes were being carried out by drones.  The data also enabled greater understanding of what was happening on the ground, for example the extent of the switch of UK air and drone strikes from Iraq to Syria in certain periods of the war.  The FoI data also enabled us to discover when strikes had taken place that had not been disclosed in MoD press releases. The data also allowed us to see the number and type of weapons being used, and therefore to calculate the cost of the munitions fired. Indeed, one response let slip that UK drones were firing thermobaric weapons. Read more

New ‘Protector’ armed drones to begin flying in UK – Join the protest on 13 November

 

The UK’s new armed drones – known as  ‘SkyGuardian’ internationally, but renamed ‘Protector’ by the UK –  will begin test flights in the UK next month after the Civil Aviation Authority (CAA) agreed to new airspace rules around RAF Waddington.  The MoD will undertake “a small number of time-critical proving flights” of the new drone ahead of a longer test and training programme due to begin in late 2023/early 2024.

The first of an initial batch of sixteen MQ-9B SkyGuardian drones was flown into RAF Waddington on board a transport aircraft on 30 September.

According to Jane’s:

“While the Protector fleet will be based at and operated from RAF Waddington, it will spend most of its time overseas in the same manner as the Reaper fleet. A future operational scenario could see the Protector ferry itself from RAF Waddington to a location in the Middle East or Sub-Saharan Africa, arriving in theatre to be met by a team that would arm and prep it for its mission.”

The UK is replacing its fleet of ten Reaper drones with up to 26 of the new ‘Protector’ drones.  The newer drone has greater range and longer endurance, as well as being capable of carrying more weapons.  It is also capable of autonomous take-off and landing.

Why we continue to challenge the use of armed drones

As we have argued over the past decade, while remote-controlled drones are presented as enabling ‘pinpoint’ accurate air strikes which allow us to ‘take out’ bad guys without risk to our own forces, the reality is somewhat different. While the UK continues to claim that only one civilian was killed in the thousands of British air and drone strikes in Iraq and Syria, journalists and casualty recording organisations have reported large numbers of civilian casualties from US and UK air strikes.  In addition, because drones can be deployed with few or no boots on the ground, it is much easier for political leaders to choose to use armed force.

Armed drones have also enabled a huge increase in so-called ‘targeted killing’, including the killing of individuals far from the battle zone.  While some argue that it is the policy of targeted killing that is problematic, it is hard to deny that the practice has hugely increased with the advent of armed drones. While the US is at the forefront of such operations, the UK too has used its drones to carry out a number of such killings, including the killing of a suspected ISIS leader in Syria in December 2022. Read more

Developments on both sides of the Atlantic signal push to develop AI attack drones

Artist impression of crewed aircraft operating with autonomous drones. Credit: Lockheed Martin

Recent government and industry announcements signal clear intent by both the US and the UK to press ahead with the development of a new generation of AI attack drones despite serious concerns about the development of autonomous weapons. While most details are being kept secret, it is clear from official statements, industry manoeuvring and budget commitments that these new drones are expected to be operational by the end of the decade.

The current focus is the development of drones that were previously labelled as ‘loyal wingman’ but are now described either as ‘Collaborative Combat Aircraft’ (CCA) or ‘Autonomous Collaborative Platforms’ (ACP).  As always, the nomenclature around ‘drones’ is a battlefield in itself.  The concept is for one or more of these drones to fly alongside, or in the vicinity of, a piloted military aircraft, carrying out specifically designated tasks such as surveillance, electronic warfare, guiding weapons onto targets, or carrying out air-to-air or air-to-ground strikes.  Rather than being directly controlled by an individual on the ground, as current armed drones like the Reaper or Bayraktar are, these drones will fly autonomously. According to DARPA officials (using the beloved sports metaphor), these drones will allow pilots to direct squads of unmanned aircraft “like a football coach who chooses team members and then positions them on the field to run plays.”

Next Generation

In May, the US Air Force issued a formal request for US defence companies to bid to build a new piloted aircraft to replace the F-22.  However, equally important for the ‘Next Generation Air Dominance’ (NGAD) program is the development of new autonomous drones and a ‘combat cloud’ communication network.  While the development of the drones is a covert programme, US Air Force Secretary Frank Kendall said they will be built “in parallel” with the piloted aircraft. Kendall publicly stated that the competition to develop CCA was expected to begin in Fiscal Year 2024 (which runs from October 2023 to September 2024).

While the USAF is planning to build around 200 of the new crewed aircraft, Kendall told reporters that it expects to build around 1,000 of the drones. “This figure was derived from an assumed two CCAs per 200 NGAD platforms and an additional two for each of 300 F-35s for a total of 1,000,” Kendall explained. Others expect even more of these drones to be built.  While the NGAD fighter aircraft itself is not expected to be operational until the 2030s, CCAs are expected to be deployed by the end of the 2020s.
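For readers who want to check the arithmetic behind Kendall’s 1,000 figure, the short sketch below is ours (not from the original statement) and uses only the fleet sizes quoted above: two CCAs for each of the 200 planned NGAD platforms and each of the 300 F-35s.

```python
# Illustrative check of the fleet arithmetic quoted above (figures from Kendall's statement).
ccas_per_aircraft = 2   # two CCAs assumed per crewed aircraft
ngad_platforms = 200    # planned NGAD crewed aircraft
f35s = 300              # F-35s included in the calculation

total_ccas = ccas_per_aircraft * (ngad_platforms + f35s)
print(total_ccas)       # prints 1000
```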

It’s important to be aware that there will not be one type of drone built under this programme, but a range with different capabilities able to carry out different tasks.  Some will be ‘expendable’ – that is, designed for just one mission – something like the ‘one-way attack’ drones that we have seen increasingly used in Ukraine and elsewhere; some will be ‘attritable’, designed so that their loss in combat would not be severely damaging; while others, described as ‘exquisite’, will be more capable and specifically designed not to be lost during combat.  A number of companies have set out their proposals, with some even building prototypes and undertaking test flights. Read more

Online Webinar: ‘Scotland Says Keep Space for Peace’ Tues 19th September, 7pm


Over the past decade, both the US and UK governments have designated space as a key focus as military operations increasingly rely on space-based assets for command and control, surveillance, targeting, missile warning and secure communications with forces deployed overseas.

In 2019 the Ministry of Defence (MoD) declared that space should be seen as “a war fighting domain” and over the past two years we have seen the setting up of UK Space Command, the publication of a UK Defence Space Strategy outlining how the MoD will “protect the UK’s national interests in space” and the announcement of a portfolio of new military programmes to develop space-based military assets. Incredibly, both the US and UK are also exploring the use of nuclear propulsion for their space systems.

The UK is now in the process of developing a number of UK spaceports – including in Shetland, the Western Isles and Sutherland – from where we will see both commercial and military space launches. We are entering an era of military space expansion by the UK which will inevitably lead to environmental harm and risk of instability and conflict.

Join Scottish CND and Space Watch UK/Drone Wars to examine how new Scottish spaceports are at the heart of UK government plans to militarise space and what we can do to challenge it.

We will follow up this online event with an in-person protest outside Scottish Parliament on Tuesday 3rd

Speakers:

• Dr Jill Stuart is an academic based at the London School of Economics and Political Science. She is an expert in the politics, ethics and law of outer space exploration and exploitation. She is a frequent presence in the global media on the issue and regularly gives lectures around the world.

• Dave Webb is a former Chair of CND, Convenor of the Global Network Against Weapons and Nuclear Power in Space and co-author of the new report ‘Heavens Above: Examining the UK’s Militarisation of Space’.

• George Gunn grew up in the far north of Scotland and now lives in Thurso. He is a poet and has written over fifty productions for stage and radio and has produced several series for BBC Radio Scotland and Radio 4. He has a regular column ‘From the Province of the Cat’ in Bella Caledonia.

• Timothy Parker, a recent graduate of the University of Reading, researched the development of UK spaceports and the implications of AI for the UK’s nuclear weapons programme as part of an internship for Drone Wars UK.

• Tor Justad is Chair of Highlands Against Nuclear Transport (HANT). He believes in the words of Margaret Mead: “Never doubt that a small group of thoughtful, committed citizens can change the world; indeed, it’s the only thing that ever has.”

Chair:

• Lynn Jamieson is an academic and Chair of Scottish CND.

 

Tickets for the webinar are free and can be booked at the Eventbrite page here.


Military AI: MoD’s timid approach to challenging ethical issues will not be enough to prevent harm

Papers released to Drone Wars UK by the Ministry of Defence (MoD) under the Freedom of Information Act reveal that progress in preparing ethical guidance for MoD staff working on military artificial intelligence (AI) projects is proceeding at a snail’s pace.  As a result, the MoD’s much-vaunted AI strategy and ethical principles are at risk of failing as the department races ahead to develop AI as a key military technology.

Minutes of meetings of the MoD’s AI Ethics Advisory Panel show that although officials have repeatedly stressed the need to focus on implementation of AI programmes, the ethical framework and guidelines needed to ensure that AI systems are safe and responsible are still only in draft form, and there is “not yet a distinct sense of a clear direction” as to how they will be developed.

The FOI papers also highlight concerns about the transparency of the panel’s work.  Independent members of the panel have repeatedly stressed the need for the panel to work in an open and transparent manner, yet MoD refuses to publish the terms of membership, meeting minutes, and reports prepared for the panel.  With the aim of remedying this situation, Drone Wars UK is publishing the panel documents released in response to our FOI request as part of this blog article (see pdf files at the end of the article).

The Ministry of Defence AI Ethics Advisory Panel

One of the aims of the Defence Artificial Intelligence Strategy, published in June 2022, was to set out MoD’s “clear commitment to lawful and ethical AI use in line with our core values”.  To help meet this aim MoD published a companion document, entitled ‘Ambitious, safe, responsible’, alongside the strategy to represent “a positive blueprint for effective, innovative and responsible AI adoption”.

‘Ambitious, safe, responsible’ had two main foundations: a set of ethical principles to guide MoD’s use of AI and an Ethics Advisory Panel, described as “an informal advisory board”, to assist with policy relating to the safe and responsible development and use of AI.  The document stated that the panel had assisted in formulating the ethical principles and listed the members of the panel, who are drawn from within the Ministry of Defence and the military, and from industry, universities and civil society.

The terms of reference for the panel were not published in the ‘Ambitious, safe, responsible’ document, but the FOI papers provided to Drone Wars UK show that it is tasked with advising on:

  • “The development, maintenance and application of a set of ethical principles for AI in Defence, which will demonstrate the MOD’s position and guide our approach to responsible AI across the department.
  • “A framework for implementing these principles and related policies / processes across Defence.
  • “Appropriate governance and decision-making processes to assure ethical outcomes in line with the department’s principles and policies”.

The ethical principles were published alongside the Defence AI Strategy, but more than two years after the panel first met – and despite a constant refrain at panel meetings on the need to focus on implementation – it has yet to make substantial progress on the second and third of these objectives.  An implementation framework, associated policies, and governance and decision-making processes have yet to appear.  This appears in no way to be due to shortcomings on the part of the panel, whose members seem to have a keen appetite for their work, but rather is the result of slow progress by MoD.  In the meantime, work on the development of AI systems is proceeding at full speed in the absence of these key ethical tools.

The work of the panel

The first meeting of the panel, held in March 2021, was chaired by Stephen Lovegrove, the then Permanent Secretary at the Ministry of Defence.  The panel discussed the MoD’s work to date on developing an AI Ethics framework and the panel’s role and objectives.  The panel was to be a “permanent and ongoing source of scrutiny” and “should provide expert advice and challenge” to MoD, working through a  regular quarterly meeting cycle.  Read more