Smoke and mirrors: FoI shows little substance to the RAF’s drone swarm squadron

Artist’s impression of aircraft launching swarming drones

14 July 2022 – updated below

A Freedom of Information (FoI) investigation has found that an RAF Squadron set up more than two years ago to take the lead on developing the UK’s swarming drones capability has yet to undertake any testing or trialling of drones and currently has just four personnel assigned to it. The revelation raises serious questions about Ministry of Defence (MoD) statements that it is rapidly progressing towards operational use of swarming drones.

While the MoD initially refused to answer any questions about 216 Squadron’s inventory or its testing and trialling of drones, arguing that commercial confidentiality meant such details could not be released, this was overturned on appeal and the MoD admitted that “216 Squadron has not conducted any UAV tests since it was reactivated on 1 April 2020.”

Background

Early in 2019, as part of his ‘Defence in Global Britain’ speech, then Defence Secretary Gavin Williamson announced plans to develop a new capability of swarming drones: “I have decided to use the Transformation Fund [ring-fenced funds to develop new military technology] to develop swarm squadrons of network enabled drones capable of confusing and overwhelming enemy air defences.” Rather rashly, the Secretary of State went on to declare: “we expect to see these ready to be deployed by the end of the year.”  Read more

Drone Wars Select Committee submission on use of military drones in countering migrant crossings

In Sept 2021 the prototype of the UK’s new armed drone flew from Scotland to undertake a mission involving a search pattern over the Channel.

Boris Johnson announced in mid-January that the armed forces were to take charge of limiting migrant crossings of the English Channel. The announcement was described by The Times as one of a series of populist announcements by the embattled PM to save his premiership.

Soon after, the Defence Select Committee announced that it was to scrutinize the decision and sought submissions from interested parties:

“The Government’s decision that the Royal Navy should take over operations in the Channel has taken Parliament (and it seems the MOD) by surprise.  There are significant strategic and operational implications surrounding this commitment which need to be explored.”

Shockingly, both the Ministry of Defence and the Home Office refused to submit evidence or send ministers to answer questions from the Committee.

Our full submission to the Committee on this issue – looking in particular at how drones are often seen as a ‘solution’ – is available on their website, while here we offer a short summary.

  • Drone Wars argues that the military should not be involved in day-to-day border control operations in the absence of any threat of military invasion. Border control is primarily a policing and enforcement role centred on dealing with civilians, and it should be conducted by civilian agencies.  Military forces are not principally trained or equipped to deal with humanitarian or policing situations.  The UK borders are not a war zone, and civilians attempting to enter and leave the country are not armed combatants.

Read more

None too clever? Military applications of artificial intelligence

Drone Wars UK’s latest briefing looks at where and how artificial intelligence is currently being applied in the military context and considers the legal and ethical, operational and strategic risks posed.

Click to open

Artificial Intelligence (AI), automated decision making, and autonomous technologies have already become common in everyday life and offer immense opportunities to dramatically improve society.  Smartphones, internet search engines, AI personal assistants, and self-driving cars are among the many products and services that rely on AI to function.  However, like all technologies, AI also poses risks if it is poorly understood, unregulated, or used in inappropriate or dangerous ways.

In current AI applications, machines perform a specific task for a specific purpose.  The umbrella term ‘computational methods’ may be a better way of describing such systems, which fall far short of human intelligence but have wider problem-solving capabilities than conventional software.  Hypothetically, AI may eventually be able to perform a range of cognitive functions, respond to a wide variety of input data, and understand and solve any problem that a human brain can.  Although this is a goal of some AI research programmes, it remains a distant  prospect.

AI does not operate in isolation, but functions as a ‘backbone’ in a broader system to help the system achieve its purpose.  Users do not ‘buy’ the AI itself; they buy products and services that use AI or upgrade a legacy system with new AI technology.  Autonomous systems, which are machines able to execute a task without human input, rely on artificial intelligence computing systems to interpret information from sensors and then signal actuators, such as motors, pumps, or weapons, to cause an impact on the environment around the machine.  Read more
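The sense–interpret–act loop described above can be sketched in a few lines of code. This is a purely illustrative, hypothetical example — the function and variable names (`interpret`, `control_step`, the actuator table) are our own and are not drawn from any real autonomy framework; a real system would involve far more complex perception and control software.

```python
# Hypothetical sketch of the autonomy loop described above:
# sensors feed an AI 'backbone', which signals actuators.
# All names here are illustrative, not from any real framework.

def interpret(readings):
    """Stand-in for the AI component: map raw sensor data to a command."""
    # Toy rule: steer away from whichever side has the closer obstacle.
    left, right = readings["left_range"], readings["right_range"]
    return "steer_right" if left < right else "steer_left"

def control_step(readings, actuators):
    """One cycle: sensor readings in, actuator command out."""
    command = interpret(readings)
    actuators[command]()  # cause an effect on the environment
    return command

# Usage: a fake actuator table standing in for motors or pumps.
log = []
actuators = {
    "steer_left": lambda: log.append("steer_left"),
    "steer_right": lambda: log.append("steer_right"),
}
cmd = control_step({"left_range": 2.0, "right_range": 5.0}, actuators)
# cmd == "steer_right"; log == ["steer_right"]
```

The point of the sketch is that the AI element is only one stage in a larger pipeline: replace the toy rule in `interpret` with a trained model and the surrounding loop is unchanged, which is why AI is described as a ‘backbone’ rather than a product in its own right.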

Military applications at centre of Britain’s plans to be AI superpower

The UK government published its National AI Strategy in mid-September, billed as a “ten-year plan to make Britain a global AI superpower”.  Despite the hype, the strategy has so far attracted curiously little comment and interest from the mainstream media.  This is a cause for concern because if the government’s proposals bear fruit, they will dramatically change UK society and the lives of UK citizens.  They will also place military applications of AI at the centre of the UK’s AI sector.

The Strategy sets out the government’s ambitions to bring about a transition to an “AI-enabled economy” and develop the UK’s AI industry, building on a number of previously published documents – the 2017 Industrial Strategy, the 2018 AI Sector Deal, and the ‘AI Roadmap’ published by the AI Council earlier this year.  It sets out a ten-year plan based around three ‘pillars’: investing in the UK’s AI sector, placing AI at the mainstream of the UK’s economy by introducing it across all economic sectors and regions of the UK, and governing the use of AI effectively.

Unsurprisingly, in promoting the Strategy the government makes much of the potential of AI technologies to improve people’s lives and solve global challenges such as climate change and public health crises – although it makes no concrete commitments in this respect.  Equally unsurprisingly it has far less to say up front about the military uses of AI.  However, the small print of the document states that “defence should be a natural partner for the UK AI sector” and reveals that the Ministry of Defence is planning to establish a new Defence AI Centre, which will be “a keystone piece of the modernisation of Defence”, to champion military AI development and use and enable the rapid development of AI projects.  A Defence AI Strategy, expected to be published imminently, will outline how to “galvanise a stronger relationship between industry and defence”.  Read more

CAA opens UK skies to military drones

The Civil Aviation Authority (CAA) has granted permission to US drone company General Atomics to conduct experimental flights of its new SkyGuardian drone in UK airspace. The MoD is buying 16 SkyGuardian drones, renamed ‘Protector’. This is the first time that large military drones will be allowed to fly in the UK outside of segregated airspace, and the decision will be seen as a breakthrough by the drone industry, which will view it as the first step towards opening UK skies to a whole host of drones flying ‘beyond visual line of sight’ (BVLOS).

The news came in an ‘airspace alert’ issued by the CAA following the announcement that temporary airspace rules were to be put in place around the bases from which the drone will operate. The terse, one-sentence paragraph in the alert said:

“The CAA has also completed an in-depth review and issued the authorisation to General Atomics [to] operate within the UK.”

The lack of detail reflects the lack of transparency about the process to allow General Atomics to use its largely untried and untested ‘Detect and Avoid’ (DAA) equipment in the flights.

General Atomics has developed its DAA equipment to supposedly replicate an on-board pilot’s ability to ‘see and avoid’ danger. This is the bedrock upon which all air safety measures are built and – as we reported back in 2018 – regulators at the CAA were deeply sceptical as to whether remote technology can replace an on-board pilot in busy airspace such as UK skies. Test flights of the drone in the US last summer, which were due to fly over San Diego, were routed away from the city after apparent concerns from US safety regulators.  Read more

RAF drone programmes fly into stormy skies

BAE Systems image of Tempest aircraft with accompanying drones

Funding for the ‘Tempest’ Future Combat Air System which is intended to replace the RAF’s Typhoon aircraft is “significantly less than required” and “adds significant overall programme risk” to delivery of the new jet, according to a report on government project management published jointly by HM Treasury and the Cabinet Office.

In its first assessment of the Tempest programme the Infrastructure and Projects Authority (IPA), which reports jointly to the two government departments, reveals that successful delivery of the aircraft is already “in doubt”.  Another high-profile drone project, delivery of the RAF’s new ‘Protector’ aircraft, received a similar rating.

Tempest is under joint development by Italy, Sweden, and the UK as the next generation combat aircraft for the three nations – a high performance, high cost system consisting of a core aircraft, which is likely to be able to fly in both crewed and uncrewed modes, with an associated network of swarming drones, sensors, and data systems.

The IPA, which each year rates the performance of government departments in delivering major projects, has scored the Future Combat Air System programme with an Amber / Red risk rating in its report for the 2020-21 financial year.  This means that “successful delivery of the project is in doubt, with major risks or issues apparent in a number of key areas. Urgent action is needed to address these problems and assess whether resolution is feasible”.  Read more