Lucky Dip: Drone companies await spending bonanza as Defence Investment Plan (DIP) to be revealed.   

Following the government’s commitment to increase military spending and the publication of the Strategic Defence Review (SDR) in early June, the military industry has been keenly awaiting the release of the government’s Defence Investment Plan (DIP), which will lay out military spending plans and other details for the rest of this parliament. Numerous reports have indicated that many planned projects are ‘on hold’ until the plan is finalised and published.

UK Military Spending 2010/11 – 2024/25 – Statista

Defence minister Luke Pollard told MPs in June that the DIP will “cover the full scope of the defence programme, from people and operations to equipment and infrastructure”. Time and again ministers have promised that the plan will be unveiled in the autumn and so this now seems likely to be soon after the Budget of 26 November (although such promises are of course routinely broken).

How much?!

UK military spending was £60.2bn in 2024/25 (around 2.4% of GDP), up from £42.4bn in 2020/21. In February 2025, the Starmer government committed to further increase military spending, raising the budget to 2.5% of GDP by 2027 (estimated at around an extra £6bn per year – roughly the amount cut from the UK’s aid budget), with ‘an ambition’ to reach 3% by the next parliament. At the NATO summit in June 2025, however, Starmer upped the ante, pledging to reach a ‘goal’ of 5% by 2029: 3.5% on ‘core defence’ (estimated to be an extra £30bn per year) plus 1.5% (around £40bn per year) on ‘defence-related areas such as resilience and security’. Subsequently the government said it “expected to reach at least 4.1% of GDP in 2027”.
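As a rough sanity check, the percentages can be converted into pounds with simple arithmetic. The sketch below assumes a UK GDP of roughly £2.7 trillion – a round illustrative figure of ours, not an official forecast – and reproduces the approximate sums quoted above:

```python
# Back-of-envelope conversion of GDP percentages into annual spending.
# Assumption (ours, for illustration only): UK GDP of roughly £2,700bn.
GDP_BN = 2_700  # UK GDP in billions of pounds (assumed round figure)

def spend_at(pct_of_gdp: float) -> float:
    """Annual military spending in £bn at a given share of GDP."""
    return GDP_BN * pct_of_gdp / 100

core_now = spend_at(2.4)                    # roughly the current level
extra_core = spend_at(3.5) - spend_at(2.4)  # the 'core defence' uplift, ~£30bn
defence_related = spend_at(1.5)             # 'defence-related areas', ~£40bn
```

On these assumptions the 3.5% ‘core’ pledge implies roughly £30bn a year more than today’s 2.4%, and the 1.5% ‘defence-related’ tranche comes to around £40bn a year – consistent with the estimates quoted above.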

‘Whole of Society’

Importantly, alongside the increase in military spending, the Strategic Defence Review argued that ‘defence’ is now to be seen as a ‘whole of society’ effort, and this may well be re-emphasised when the DIP is published.

The plan is being billed as enabling the UK to be at ‘warfighting readiness’ and, alongside equipment and weapons programmes, the public is being urged to be “prepared for conflict and ready to volunteer, support the military, and endure challenges”.

Plans already announced to ‘reconnect society with the military’ include the expansion of youth cadet forces, education work in schools to develop understanding among young people of the armed forces, and broader public outreach events to outline the threats and the need for greater military spending despite increased social challenges.

Government keen to ‘reconnect’ young people with the armed forces

And to top this off, the government is deploying the hoary old chestnut that military spending is good for the economy (despite such claims being persistently and thoroughly debunked).

Trailed Plans

While specific spending details remain under wraps, government announcements since the publication of the SDR have indicated some of the broad areas which will receive more funding:

Drones, Drones, Drones. In the Spring Statement, Chancellor Rachel Reeves stated that “a minimum of 10% of the MoD’s equipment budget is to be spent on novel technologies including drones and AI enabled technology.” Defence Minister Alistair Carns indicated in July that there would be around £4bn of spending on uncrewed systems – ‘Drones, drones and drones’, as he put it on Twitter.

To the ever-expanding list of UK drone development programmes, many of which are seeking funding decisions as part of the DIP, we can add Project Nyx, which seeks to pair a new drone with the British Army’s Apache helicopter.

Perhaps most significantly in this area, publication of the Defence Investment Plan may illuminate UK plans for a ‘loyal wingman’ type drone – now described by the MoD as an Autonomous Collaborative Platform (ACP) – to accompany the UK’s planned new fighter aircraft, Tempest. While some funding has already been allocated to develop smaller Tier 1 and 2 ACPs, plans for the more strategic and no doubt costlier Tier 3 drone have been placed on the back burner pending funding decisions. Will the UK go it alone and build a new armed drone (as no doubt BAE Systems hopes), or will it buy Australia’s Ghost Bat or one of the two drones currently competing for the US contract?

Integrated targeting web. Alongside new drones, the UK is developing a ‘digital targeting web’ to link, as MoD-speak puts it, ‘sensors’, ‘deciders’ and ‘effectors’. In other words, commanders supported by AI will be networked with ‘next generation’ drones, satellites and other systems to identify targets to be destroyed by a variety of novel and traditional military systems. The aim is to drastically shorten the time between target identification and attack. As Drone Wars has reported, several elements of this system (such as ASGARD) have already been tested, and it is likely that further funding for this programme will be part of the DIP.
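To make the jargon concrete, the data flow the MoD describes can be sketched in a few lines of code. Everything below is invented for illustration – the names, confidence threshold and tasking logic are ours, and no real MoD system is represented – but it shows the basic ‘sensor → decider → effector’ pipeline, and why automating the middle stage is what compresses the time from identification to attack:

```python
from dataclasses import dataclass

# Conceptual sketch only: 'sensors' feed observations to a 'decider',
# which produces a target list handed to 'effectors'. All names and
# logic are invented for illustration.

@dataclass
class Observation:
    sensor_id: str
    position: tuple
    confidence: float  # how sure the sensor is of its classification

def decide(observations, threshold=0.8):
    """The 'decider' stage: filter sensor reports by confidence.
    In a real system a human would (or should) review every entry
    before anything further happens."""
    return [o for o in observations if o.confidence >= threshold]

def assign_effectors(targets, effectors):
    """Pair each approved target with an available 'effector'."""
    return list(zip(targets, effectors))

obs = [
    Observation("radar-1", (51.5, -0.1), 0.95),
    Observation("uav-7", (51.6, -0.2), 0.40),  # low confidence: filtered out
]
targets = decide(obs)
tasking = assign_effectors(targets, ["effector-A"])
```

The political point is visible even in this toy: once the `decide` step is delegated to software, the pipeline runs at machine speed, and meaningful human oversight has to be deliberately re-inserted.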

New munitions and drone factories. The government is keen to bolster the UK’s munitions stocks after supplying huge quantities to Ukraine. The MoD accidentally released details of 12 potential sites for new munitions factories to The Ferret in a Freedom of Information mix-up. The government plans to open six new factories at a cost of £6bn.

Helsing factory

Alongside this, there is also a desire to persuade some of the newer drone companies to open factories here in the UK. While Tekever has announced it will open a new site in Swindon, Anduril and Helsing seem to be keeping their powder dry, awaiting news that they have secured government contracts before committing to setting up premises. Both companies have, however, set up UK subsidiaries and have launched PR campaigns to persuade ministers and officials of the efficacy of their products.

While drones are key for these companies, a huge increase in UK spending on military AI systems is also in their sights.

An AI ‘Manhattan Project’ endeavour. Despite continued and significant concerns about the military use of AI, particularly in ‘the kill chain’, ministers, officials and commanders seem convinced that rapid integration of AI into all areas of the armed forces is urgent and vital. Just before stepping down as Chief of the Defence Staff in September, Admiral Sir Tony Radakin put his weight behind calls from Helsing co-founder Gundbert Scherf for a “Manhattan-Project for AI defence”. Arguing that such a plan “would not cost the earth” (but putting it at around $90bn!), Scherf suggested four areas to concentrate on: a) masses of AI-enabled defensive drones deployed on NATO’s eastern flank; b) deploying AI-enabled combat drones to dominate airspace; c) large-scale deployment of AI-enabled underwater drones/sensors; and finally, d) replacing Europe’s ageing satellites with (you guessed it) AI-enabled surveillance and targeting satellites.

Anduril is also not shy of lobbying in its own interests. Anduril UK CEO Richard Drake told The House, Parliament’s in-house magazine, that Anduril US was “very much happy with the direction [the SDR is] taking” but went on to publicly push to reduce regulation on the use of drones in UK airspace:

“For UK PLC to get better and better and better in drones and autonomous systems, they have to always look at their regulatory rules as well. Companies like ours and other UK companies can design and build these really cool things, but if we can’t test them well enough in the UK, that’s going to be a problem.”

Winners and Losers

While wholesale adoption of Helsing’s plan seems unlikely, there seems little doubt that the new AI-focused military companies will be among the lucky beneficiaries of the UK’s DIP. Meanwhile, the rest of us seem assured of spending cuts and tax rises.

The Strategic Defence Review and Drone Warfare: Questioning a Dangerous Consensus

While there appears to be a consensus between mainstream political parties, officials and defence commentators that a significant increase in spending on drone and military AI systems would be a positive development, there are serious questions about the basis on which this decision is being made and the likely impact on global security.

New military technology in general, and uncrewed systems in particular, are being presented by politicians and the media as a quick, simple and cost-effective way for the armed forces to increase ‘mass’ and ‘lethality’ without having to procure hugely expensive kit that can take years to produce. Drones are also seen as an alternative to deploying troops in significant numbers at a time when recruitment has become increasingly difficult.

However, far from aiding security, increased spending on drones, autonomous weapons and other emerging military technology will simply lead to a further degrading of UK and global security. Remote and autonomous military systems lower the threshold for the use of armed force, making it much easier for state and non-state groups alike to engage in armed attack. Such systems encourage war as the first rather than the last option.

KEY QUESTIONS

Does the war in Ukraine really demonstrate that ‘drones are the future’?
  • It seems to be taken for granted that the ongoing war in Ukraine has demonstrated the effectiveness of drone and autonomous warfare and that therefore the UK must ‘learn the lesson’ and increase funding for such technology. However, while drones are being used extensively by both Russia and Ukraine – and causing very substantial numbers of casualties – it is far from clear that they are having any strategic impact.
  • Larger drones such as the Turkish Bayraktar TB2 operated by Ukraine – hailed as the saviour of Ukraine at the beginning of the war – and Russia’s Orion MALE armed drone have virtually disappeared from the skies above the battlefield as they are easily shot down. Larger one-way attack (sometimes called ‘suicide’) drones are being fired at each other’s major cities by both sides and are causing considerable harm. While these strikes are mainly for propaganda effect, again it is not clear that they will change the outcome of the war.
  • Short-range surveillance/attack drones are being used very extensively on the battlefield, and the development in particular of First Person View (FPV) drones to carry out attacks on troops and vehicles has been significant. However, countermeasures such as electronic jamming mean that thousands of these drones are simply lost or crash. In many ways, drone warfare in Ukraine has become a long-term ‘cat and mouse’ fight between drones and counter-drone measures, and this is only likely to continue.
Is ‘cutting edge military technology’ a silver bullet for UK Defence?
  • The capabilities of future military systems are frequently overstated and regularly under-delivered. Slick industry videos showcasing new weapons are more often than not the product of graphic designers’ creative imaginings rather than real-world demonstrations of a new capability.

    The hype surrounding trials of so-called ‘swarming drones’ is a good example. There is a world of difference between a ‘drone swarm’ in its true, techno-scientific meaning and a group of drones being deployed at the same time. A true drone swarm sees individual systems flying autonomously, communicating with each other and following a set of rules without a central controller. While manufacturers and militaries regularly claim they are testing or trialling ‘a drone swarm’, in reality they are just operating a group of drones at the same time, controlled by a group of operators.

  • While there have been considerable developments in the field of AI and machine learning over the past decade, the technology is still far from mature. Anyone using a chatbot, for example, will quickly discover that there can be serious mistakes in the generated output. Trusting data generated by AI systems in a military context, without substantial human oversight and checking, is likely to result in very serious errors. And the need for that ongoing human oversight is likely to cancel out much of the financial or staffing savings claimed for using AI in the first place.
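The distinction between a true swarm and a group of simultaneously piloted drones can be made concrete with a minimal sketch of decentralised flocking rules (the classic ‘boids’ approach). The rule weights and geometry below are invented for illustration; the point is simply that each drone updates itself using only its neighbours’ states, with no central controller:

```python
import math

# Illustrative sketch only (no real system is represented): each drone
# adjusts its own velocity from its neighbours' positions and velocities.
# There is no central controller -- that is what makes it a true swarm.

def step_swarm(drones, neighbour_radius=10.0, dt=1.0):
    """Advance every drone (x, y, vx, vy) one timestep using three
    local rules: cohesion (steer toward the local centre of mass),
    alignment (match neighbours' heading) and separation (avoid
    crowding)."""
    new_states = []
    for i, (x, y, vx, vy) in enumerate(drones):
        neighbours = [d for j, d in enumerate(drones)
                      if j != i and math.hypot(d[0] - x, d[1] - y) < neighbour_radius]
        if neighbours:
            n = len(neighbours)
            cx = sum(d[0] for d in neighbours) / n   # local centre of mass
            cy = sum(d[1] for d in neighbours) / n
            avx = sum(d[2] for d in neighbours) / n  # average neighbour velocity
            avy = sum(d[3] for d in neighbours) / n
            vx += 0.05 * (cx - x)       # cohesion
            vy += 0.05 * (cy - y)
            vx += 0.1 * (avx - vx)      # alignment
            vy += 0.1 * (avy - vy)
            for ox, oy, _, _ in neighbours:  # separation
                d = math.hypot(ox - x, oy - y)
                if 0 < d < 2.0:
                    vx += (x - ox) / d
                    vy += (y - oy) / d
        new_states.append((x + vx * dt, y + vy * dt, vx, vy))
    return new_states

# Three drones heading east decide locally; no one is 'in charge'.
swarm = [(0.0, 0.0, 1.0, 0.0), (4.0, 0.0, 1.0, 0.0), (0.0, 4.0, 1.0, 0.0)]
for _ in range(3):
    swarm = step_swarm(swarm)
```

A group of drones each flown by its own operator involves none of this peer-to-peer rule-following, which is why calling such deployments a ‘swarm’ is marketing rather than engineering.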
Will funding new autonomous drones actually keep us safe?
  • Perhaps the key question about plans to invest heavily in future military AI and drone warfare is whether doing so will actually keep the UK safe. Little more than a decade ago, armed drones were the preserve of just three states: the US, the UK and Israel. Today, many states and non-state groups are using armed drones to launch remote attacks, resulting in large numbers of civilian casualties. In essence, because they enable both states and non-state groups to engage in armed attack with little or no risk to themselves, remote and autonomous drones lower the threshold for the use of armed force, making warfare much more likely.
  • Given the global proliferation of such technology, it seems inevitable that any new developments in drone warfare funded by the UK over the next few years will proliferate and be used by other state and non-state groups. In many ways, it seems only a matter of time before drone warfare comes to the UK.
  • Rather than funding the development of new lethal autonomous drones, the UK should be at the forefront of efforts to curb and control the use of these systems, working with other states, NGOs and international experts to put in place globally accepted rules to control their proliferation and use.
Is the development and use of autonomous weapons inevitable?
  • Although confined to the realm of science fiction until relatively recently, lethal autonomous weapon systems are now being developed for deployment by a number of states, including the UK. It is highly likely that the first fully autonomous weapons will be drone-based systems.
  • The real issue here is not the development of AI itself, but the way it is used. Autonomy raises a wide range of ethical, legal, moral and political issues relating to human judgement, intentions, and responsibilities. These questions remain largely unresolved and there should therefore be deep disquiet about the rapid advance towards developing AI weapons systems.
  • While some argue the inevitability of the development of these systems, there are a range of measures which could be used to prevent their development including establishing international treaties and norms, developing confidence-building measures, introducing international legal instruments, and adopting unilateral control measures. Given how much we have seen drone warfare spread and create global insecurity over the past decade, now is the time for the UK to be fully involved in international discussions to control the development of lethal fully autonomous weapon systems.

Read more

Watchkeeper drones scrapped – but will any lessons be learnt?

On Wednesday 20 November, the Defence Secretary John Healey announced in the House of Commons that the UK was scrapping its entire fleet of Watchkeeper drones, bringing to an end the sorry saga of this drone programme.

Anyone who has been following the Drone Wars blog over the last 14 years will be well aware of the failings of Watchkeeper. Based on the Israeli Hermes 450 drone, Watchkeeper was built by a joint venture company (U-TacS) owned 50-50 by the Israeli company Elbit Systems and Thales UK.

While the UK’s armed Reaper drones are operated by the Royal Air Force, the unarmed Watchkeepers were bought for the British Army to undertake short- to medium-range surveillance in order to direct artillery strikes.

Fifty-four of the drones were built under a £1bn contract signed in 2004, which was supposed to see the “world-class” drones operated by the army in Afghanistan by mid-2010. Long delays meant that only four of the drones were deployed to Afghanistan, for four weeks in 2014, conducting around 140 hours of surveillance as British forces withdrew from Camp Bastion.

Since then, apart from one short deployment in the UK, the 50-plus Watchkeeper drones have either flown training flights, mostly in the UK or Cyprus (despite being marketed as an all-weather system, Watchkeeper performs poorly in ‘adverse’ weather), or simply been kept in storage. The UK deployment was to support Border Force operations to curb refugees crossing the Channel. According to responses to our FoI requests at the time, a total of 21 flights were conducted in September and October 2022.

Crashes of Watchkeeper drones

Date | Type | Tail No. | Where | Source | Status
Nov 10, 2022 | Watchkeeper | N/A | United States | Press | Destroyed
May 29, 2022 | Watchkeeper | N/A | Off Cyprus | Press | Destroyed
Oct 14, 2020 | Watchkeeper | WK044 | Cyprus | Press | Unknown
Jun 13, 2018 | Watchkeeper | N/A | Aberporth | Press | Destroyed
Mar 24, 2017 | Watchkeeper | N/A | Irish Sea | Press | Destroyed
Feb 3, 2017 | Watchkeeper | N/A | Irish Sea | Press | Destroyed
Nov 2, 2015 | Watchkeeper | WK006 | Salisbury | Press | Destroyed
Oct 16, 2014 | Watchkeeper | WH031 | Aberporth | FoI | Withdrawn

Watchkeeper crashes: Aberporth 2014; off Cyprus coast 2022

Military Technophilia

While it would be fair to say that there have been particular problems with the Watchkeeper programme, the dominant narrative that sophisticated military technology is the answer to a wide array of political and security issues is also an important element here.  Read more

MoD publish new UK ‘Drone Strategy’ and it’s embarrassing, superficial nonsense.


The Ministry of Defence (MoD) finally published its long-promised strategy on UK plans to be “a world leader in defence uncrewed systems”, and to say it’s underwhelming would be an understatement. The document – stripping out graphics, self-promotional photographs and the glossary – runs to around four pages, much of which is filled with management-speak that would make David Brent wince. Apparently, through “Pan-Defence Excellence” the MoD will be “enshrining the principle of iterative – or spiral – capability development” to create a more “predictable demand signal”.

In a nutshell, the (ahem) ‘strategy’ seems to be: ‘learning from the war in Ukraine we will work even closer with the defence industry’.  The Minister for Defence Procurement James Cartlidge and the Commander of Strategic Command, General Sir Jim Hockenhull announced the strategy at a press event at Malloy Systems, the drone company recently taken over by BAE Systems.

The strategy document contains no details about timescales, programmes, spending or even categories of uncrewed systems that the MoD will be focusing on.  The closest the document comes to any information on future plans is a bullet point that says “the RAF is testing cost-effective expendable Autonomous Collaborative Platforms.”  Another bullet point argues that “the army has a long history of uncrewed systems and development.”  Pretty sure someone should have at least added the word ‘chequered’ in there.

Drone Strategy launched at Malloy Systems. Credit: BAE Systems

Sifting through this thin gruel we can pick out one or two points.

  • In his Introduction, Minister for Defence Procurement James Cartlidge argues “it is in the uncrewed space that we will increasingly drive the mass of our forces…” Drones, in other words, are seen as a way of increasing the size and lethality of UK armed forces as personnel recruitment slumps and spending on big-ticket items eats up the budget.
  • There is a recognition that drone warfare is “not only here to stay but likely to increase as technology expands opportunities for [drone] employment.” This is due to the fact, argues the document, that “inexpensive commercial and military technologies have democratised [drone] employment.”  Drone warfare, it is acknowledged,  is no longer the preserve of larger Western states.
  • The strategy suggests that the “initial priority is the successful delivery of the Ukraine-UK uncrewed systems initiative.” Given that the current use of drones in this conflict is primarily small, first person view (FPV) drones or one-way attack drones, it is likely that funding of  new UK developments will be in this area.   Whether that will be effective for UK security needs is questionable to say the least.
  • The decline in transparency and debate from the government about the development, use, legality and efficacy of drone warfare is likely to continue. While the document pays lip-service to “the importance of public engagement” on these issues and insists it is “committed … to keeping the public informed of our progress and developments”, these lofty aims are caveated with the need to protect “necessary operational sensitivity” and the requirement to “balance transparency with security”.

All in all, it is likely this strategy document will be put on a shelf and quickly forgotten.

Proceed with caution: Lords warn over development of military AI and killer robots


The use of artificial intelligence (AI) for the purposes of warfare through the development of AI-powered autonomous weapon systems – ‘killer robots’ –  “is one of the most controversial uses of AI today”, according to a new report by an influential House of Lords Committee.

The committee, which spent ten months investigating the application of AI to weapon systems and probing the UK government’s plans to develop military AI systems, concluded that the risks from autonomous weapons are such that the government “must ensure that human control is consistently embedded at all stages of a system’s lifecycle, from design to deployment”.

Echoing concerns which Drone Wars UK has repeatedly raised, the Lords found that the stated aspiration of the Ministry of Defence (MoD) to be “ambitious, safe, responsible” in its use of AI “has not lived up to reality”, and that although MoD has claimed that transparency and challenge are central to its approach, “we have not found this yet to be the case”.

The cross-party House of Lords Committee on AI in Weapon Systems was set up in January 2023 at the suggestion of Liberal Democrat peer Lord Clement-Jones, and started taking evidence in March.    The committee heard oral evidence from 35 witnesses and received nearly 70 written evidence submissions, including evidence from Drone Wars UK.

The committee’s report is entitled ‘Proceed with Caution: Artificial Intelligence in Weapon Systems’ and ‘proceed with caution’ gives a fair summary of its recommendations.  The panel was drawn entirely from the core of the UK’s political and military establishment, and at times some members appeared to have difficulty in grasping the technical concepts underpinning the technologies behind autonomous weapons.  Under the circumstances the committee was never remotely likely to recommend that the government should not commit to the development of new weapons systems based on advanced technology, and in many respects its report provides a road-map setting out the committee’s views on how the MoD should go ahead in integrating AI into weapons systems and build public support for doing this.

Nevertheless, the committee has taken a sceptical view of the advantages claimed for autonomous weapons systems; has recognised the very real risks that they pose; and has proposed safeguards to mitigate the worst of the risks alongside a robust call for the government to “lead by example in international engagement on regulation of AWS [autonomous weapon systems]”. Despite hearing from witnesses who argued that autonomous weapons “could be faster, more accurate and more resilient than existing weapon systems, could limit the casualties of war, and could protect our people from harm by automating ‘dirty and dangerous’ tasks”, the committee was apparently unconvinced, concluding that “although a balance sheet of benefits and risks can be drawn, determining the net effect of AWS is difficult” – and that “this was acknowledged by the Ministry of Defence”.

Perhaps the  most important recommendation in the committee’s report relates to human control over autonomous weapons.  The committee found that:

The Government should ensure human control at all stages of an AWS’s lifecycle. Much of the concern about AWS is focused on systems in which the autonomy is enabled by AI technologies, with an AI system undertaking analysis on information obtained from sensors. But it is essential to have human control over the deployment of the system both to ensure human moral agency and legal compliance. This must be buttressed by our absolute national commitment to the requirements of international humanitarian law.

Read more

MoD AI projects list shows UK is developing technology that allows autonomous drones to kill

Omniscient graphic: ‘High Level Decision Making Module’ which integrates sensor information using deep probabilistic algorithms to detect, classify, and identify targets, threats, and their behaviours. Source: Roke

Artificial intelligence (AI) projects that could help to unleash new lethal weapons systems requiring little or no human control are being undertaken by the Ministry of Defence (MoD), according to information released to Drone Wars UK through a Freedom of Information Act request.

The development of lethal autonomous military systems – sometimes described as ‘killer robots’ – is deeply contentious and raises major ethical and human rights issues.  Last year the MoD published its Defence Artificial Intelligence Strategy setting out how it intends to adopt AI technology in its activities.

Drone Wars UK asked the MoD to provide it with the list of “over 200 AI-related R&D programmes” which the Strategy document stated the MoD was working on. Details of these programmes were not given in the Strategy itself, and the MoD evaded questions from parliamentarians who asked for more details of its AI activities.

Although the Defence Artificial Intelligence Strategy claimed that over 200 programmes are underway, only 73 are shown on the list provided to Drone Wars. Release of the names of some projects was refused on defence, security and/or national security grounds.

However, MoD conceded that a list of “over 200” projects was never held when the strategy document was prepared in 2022, explaining that “our assessment of AI-related projects and programmes drew on a data collection exercise that was undertaken in 2019 that identified approximately 140 activities underway across the Front-Line Commands, Defence Science and Technology Laboratory (Dstl), Defence Equipment and Support (DE&S) and other organisations”.  The assessment that there were at least 200 programmes in total “was based on our understanding of the totality of activity underway across the department at the time”.

The list released includes programmes for all three armed forces, including a number of projects related to intelligence analysis systems and to drone swarms, as well as  more mundane  ‘back office’ projects.  It covers major multi-billion pound projects stretching over several decades, such as the Future Combat Air System (which includes the proposed new Tempest aircraft), new spy satellites, uncrewed submarines, and applications for using AI in everyday tasks such as predictive equipment maintenance, a repository of research reports, and a ‘virtual agent’ for administration.

However, the core of the list is a scheme to advance the development of AI-powered autonomous systems for use on the battlefield. Many of these are based around the use of drones as a platform – usually aerial systems, but also maritime drones and autonomous ground vehicles. A number of the projects on the list relate to the computerised identification of military targets by analysis of data from video feeds, satellite imagery, radar and other sources. Using artificial intelligence / machine learning for target identification is an important step towards the development of autonomous weapon systems – ‘killer robots’ – which are able to operate without human control. Even when they are under nominal human control, computer-directed weapons pose a high risk of civilian casualties for a number of reasons, including the rapid speed at which they operate and difficulties in understanding the often opaque ways in which they make decisions.

The government claims it “does not possess fully autonomous weapons and has no intention of developing them”. However, the UK has consistently declined to support proposals put forward at the United Nations to ban them.

Among the initiatives on the list are the following projects.  All of them are focused on developing technologies that have potential for use in autonomous weapon systems.  Read more