Lucky Dip: Drone companies await spending bonanza as Defence Investment Plan (DIP) is set to be revealed.

Following the government’s commitment to increase military spending and the publication of the Strategic Defence Review (SDR) in early June, the military industry has been keenly awaiting the release of the government’s Defence Investment Plan (DIP), which will lay out military spending plans and other details for the rest of this parliament. Numerous reports have indicated that many planned projects are ‘on hold’ until the plan is finalised and published.

UK Military Spending 2010/11 – 2024/25 – Statista

Defence minister Luke Pollard told MPs in June that the DIP will “cover the full scope of the defence programme, from people and operations to equipment and infrastructure”. Time and again ministers have promised that the plan will be unveiled in the autumn and so this now seems likely to be soon after the Budget of 26 November (although such promises are of course routinely broken).

How much?!

UK military spending was £60.2bn in 2024/25 (around 2.4% of GDP), up from £42.4bn in 2020/21. In February 2025, the Starmer government committed to a further increase, raising the budget to 2.5% of GDP by 2027 (estimated at around an extra £6bn per year, roughly the amount cut from the UK’s aid budget), with ‘an ambition’ to reach 3% by the next parliament. At the NATO summit in June 2025, however, Starmer upped the ante with a pledge to reach a ‘goal’ of 5% by 2035: 3.5% on ‘core defence’ (estimated to be an extra £30bn per year) plus 1.5% (around £40bn per year) on ‘defence-related areas such as resilience and security’. Subsequently the government said it “expected to reach at least 4.1% of GDP in 2027”.
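As a rough sense-check of how these percentages translate into annual sums, the arithmetic can be sketched as follows (the GDP figure used here is an assumed, illustrative round number of about £2.7 trillion, not an official forecast):

```python
# Back-of-the-envelope check of the spending figures above.
# GDP_BN is an assumed illustrative figure, not an official forecast.
GDP_BN = 2700  # UK GDP in £bn (assumed, ~£2.7tn)

def spend(pct_of_gdp):
    """Annual spending implied by a given percentage of GDP, in £bn."""
    return GDP_BN * pct_of_gdp / 100

core_at_2_5 = spend(2.5)     # ~£67.5bn 'core defence' at 2.5%
core_at_3_5 = spend(3.5)     # ~£94.5bn 'core defence' at 3.5%
related_at_1_5 = spend(1.5)  # ~£40.5bn, close to the ~£40bn cited for 'defence-related areas'

# Extra annual spending implied by moving from 2.5% to 3.5% 'core defence'
extra_core = core_at_3_5 - core_at_2_5  # ~£27bn, broadly in line with the ~£30bn estimate
print(f"Extra core spending at 3.5% vs 2.5%: £{extra_core:.0f}bn")
```

These are back-of-the-envelope numbers only; the actual figures depend on GDP forecasts and on what is counted as ‘core defence’.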

‘Whole of Society’

Importantly, alongside the increase in military spending, the Strategic Defence Review argued that ‘defence’ is now to be seen as a ‘whole of society’ effort, and this may well be re-emphasised when the DIP is published.

The plan is being billed as enabling the UK to be at ‘warfighting readiness’ and, alongside equipment and weapons programmes, the public is being urged to be “prepared for conflict and ready to volunteer, support the military, and endure challenges”.

Plans already announced to ‘reconnect society with the military’ include the expansion of youth cadet forces, education work in schools to develop understanding among young people of the armed forces, and broader public outreach events to outline the threats and the need for greater military spending despite increased social challenges.

Government keen to ‘reconnect’ young people with the armed forces

And to top this off, the government is deploying the hoary old chestnut that military spending is good for the economy (despite such claims being persistently and thoroughly debunked).

Trailed Plans

While specific spending details remain under wraps, government announcements since the publication of the SDR have indicated some of the broad areas which will receive more funding:

Drones, Drones, Drones. In the Spring Statement, Chancellor Rachel Reeves stated that “a minimum of 10% of the MoD’s equipment budget is to be spent on novel technologies including drones and AI enabled technology.” Defence Minister Alistair Carns indicated in July that there would be around £4bn of spending on uncrewed systems: ‘Drones, drones and drones’, as he put it on Twitter.

To the ever-expanding list of UK drone development programmes, many of which are seeking funding decisions as part of the DIP, we can add Project Nyx, which seeks to pair a new drone with the British Army’s Apache helicopter.

Perhaps most significantly in this area, publication of the Defence Investment Plan may illuminate UK plans for a ‘loyal wingman’ type drone – now described by the MoD as an Autonomous Collaborative Platform (ACP) – to accompany the UK’s planned new fighter aircraft, Tempest. While some funding has already been allocated to develop smaller Tier 1 and Tier 2 ACPs, plans for the more strategic and no doubt costlier Tier 3 drone have been placed on the back burner pending funding decisions. Will the UK go it alone and build a new armed drone (as no doubt BAE Systems hopes), or will it buy Australia’s Ghost Bat or one of the two drones currently competing for the US contract?

Integrated targeting web. Alongside new drones, the UK is developing a ‘digital targeting web’ to link, as MoD-speak puts it, ‘sensors’, ‘deciders’ and ‘effectors’. In other words, commanders supported by AI will be networked with ‘next generation’ drones, satellites and other systems to identify targets to be destroyed by a variety of novel and traditional military systems. The aim is to drastically shorten the time between target identification and attack. As Drone Wars has reported, several elements of this system (such as ASGARD) have already been tested, and further funding for this programme is likely to be part of the DIP.

New munition and drone factories. The government is keen to bolster the UK’s munitions stocks after supplying huge amounts to Ukraine. The MoD accidentally released details of 12 potential sites for new munitions factories to The Ferret in a Freedom of Information mix-up. The government plans to open six new factories at a cost of £6bn.

Helsing factory

Alongside this, there is also a desire to persuade some of the newer drone companies to open factories here in the UK. While Tekever has announced it will open a new site in Swindon, Anduril and Helsing seem to be keeping their powder dry, waiting for news that they have secured government contracts before committing to setting up premises. Both companies have, however, set up UK subsidiaries and have launched PR campaigns to persuade ministers and officials of the efficacy of their products.

While drones are key for these companies, a huge increase in UK spending on military AI systems is also in their sights.

An AI ‘Manhattan Project’ endeavour. Despite continued and significant concerns about the military use of AI, particularly in ‘the kill chain’, ministers, officials and commanders seem convinced that a rapid integration of AI into all areas of the armed forces is urgent and vital. Just before stepping down as Chief of the Defence Staff in September, Admiral Sir Tony Radakin put his weight behind calls from Helsing co-founder Gundbert Scherf for a “Manhattan-Project for AI defence”. Arguing such a plan “would not cost the earth” (but putting it at around $90bn!), Scherf suggested four areas of concentration:

  • masses of AI-enabled defensive drones deployed on NATO’s eastern flank;
  • AI-enabled combat drones deployed to dominate airspace;
  • large-scale deployment of AI-enabled underwater drones and sensors; and
  • replacing Europe’s ageing satellites with (you guessed it) AI-enabled surveillance and targeting satellites.

Anduril is also not shy of lobbying in its own interests. Anduril UK CEO Richard Drake told The House, Parliament’s in-house magazine, that Anduril US was “very much happy with the direction [the SDR is] taking” but went on to publicly push to reduce regulation on the use of drones in UK airspace:

“For UK PLC to get better and better and better in drones and autonomous systems, they have to always look at their regulatory rules as well. Companies like ours and other UK companies can design and build these really cool things, but if we can’t test them well enough in the UK, that’s going to be a problem.”

Winners and Losers

While wholesale adoption of Helsing’s plan seems unlikely, there seems little doubt that the new AI-focused military companies will be among the lucky beneficiaries of the UK’s DIP. Meanwhile, the rest of us seem assured of spending cuts and tax rises.

Their drones bad! Our drones good! Defence Secretary announces drones to be shot down

Media reports today (20 October 2025) indicate that the Defence Secretary, John Healey, will announce new powers that will allow military personnel to shoot down drones threatening military bases and possibly other sites.

Over the past year there have been a number of sightings of unidentified drones in the vicinity of military bases, both within the UK and across Europe.

UK troops engaged in counter-drone exercise. Credit: MoD

While it’s perfectly possible that these are drones flown carelessly by hobby pilots as their numbers rapidly increase, some have speculated that the sightings are connected to a co-ordinated campaign by adversaries seeking intelligence or simply testing military and security responses. No evidence for such a claim, however, has been presented.

The sightings, along with a number of cases of drones straying across borders from the war in Ukraine, have been taken up by those arguing that the UK now faces grave security threats from state adversaries rather than terrorist groups, that it needs to rapidly increase military spending, and that it must accept it is in a ‘pre-war situation’. However, calm heads need to prevail.

Campaigners have been arguing for 15 years that the advent of drone technology makes the world a much less safe place.  Remote and autonomous drones enable the use of lethal force with virtual impunity and create real and genuine fear.

While ordinary people living under drones around the world constantly feel threatened and suffer real physical and psychological harm from military drones flying overhead, British politicians have regularly dismissed such fears, arguing that the drones are in fact there to create peace for the people on the ground.

It is ironic then, not to say hypocritical, that fear and apprehension about possible drone incursions within the UK is met with a strong government response, including ordering the military to shoot such drones down.

Next month, the UK will release its Defence Investment Plan which is likely to see further spending on drones and counter-drone technology.  Rather than spending vast sums on new military technology which will simply proliferate and make the world – and ourselves – much less safe, we need to be investing in building global co-operation and common security, accepting that no nation can be truly secure unless all feel secure. 

Rather than squandering billions developing drones and then having to spend more on counter-drone technology, we should be investing much more in diplomacy and conflict prevention structures; we should be investing in our health and social care, in greening the economy, and in focusing our extremely talented engineers and scientists on helping to tackle climate change rather than developing new war technology.

Drone footage shows ‘manifestly unlawful’ US strike on civilians; Trump vows to rip up drone treaty

The US killed 11 people in a reported drone strike on a small boat in the Caribbean Sea on 3 September. Although it has not been confirmed that the strike was carried out by a drone, President Trump shared drone footage of the strike on his social media. In August it was revealed that Trump had secretly signed a directive ordering the Pentagon to begin military operations against drug cartels.

Screen grab from drone video shared by President Trump.

While US officials alleged that the boat targeted was carrying drugs being transported by members of the Tren de Aragua cartel, multiple legal scholars and experts have argued that the strike was “manifestly unlawful.”

Professor Luke Moffett of Queens University Belfast told the BBC that while “force can be used to stop a boat, generally this should be non-lethal measures.” Any use of force must be “reasonable and necessary in self-defence where there is immediate threat of serious injury or loss of life to enforcement officials.”  The US and other states regularly stop boats in international waters as part of law enforcement activity without resorting to the use of lethal force.   

Much more significant, however, is the grave violation of international law that is the deliberate, premeditated targeting of civilians. Claire Finkelstein, professor of national security law at the University of Pennsylvania, said “There’s no authority for this whatsoever under international law. It was not an act of self-defense. It was not in the middle of a war. There was no imminent threat to the United States.” Finkelstein went on to make the clear and obvious connection between the strike and the ongoing, two-decades-long US drone targeted killing programme, which has significantly blurred the lines between law enforcement and armed conflict.

While the US alleges that the occupants of the boat were members of an organised criminal gang, and President Trump and other administration officials have begun to publicly talk about the threat of ‘narco terrorists’, that in no way makes the targets of this strike combatants under the laws of war. While civilians are regularly and persistently victims of drone and air strikes, the deliberate targeting of non-combatants is still shocking.

New York University law professor Ryan Goodman, who previously worked as a lawyer in the Pentagon, told the New York Times that “It’s difficult to imagine how any lawyers inside the Pentagon could have arrived at a conclusion that this was legal rather than the very definition of murder under international law rules that the Defense Department has long accepted.”

In the aftermath of the strike and questioning by the media, administration officials struggled to justify the legality of the strike, resorting to arguing that it was a matter of self-defence. Significantly, senior officials said that further such operations were likely.

Trump and the MTCR

Meanwhile, President Trump is reportedly returning to a plan formulated during his first administration to overturn controls on the export of US armed drones. As we reported, Trump attempted in 2020 to get the other state signatories of the Missile Technology Control Regime (MTCR) to accept that Predator/Reaper-type drones should be moved out of the most strongly controlled group (Category I) into the lesser group (Category II). Other states, however, gave this short shrift, much to Trump’s annoyance.

According to the Reuters report, the new move involves “designating drones as aircraft… rather than missile systems”  which will enable the US to then “sidestep” its treaty obligations. The move will aid US plans to sell hundreds of armed drones to Saudi Arabia, UAE and Qatar.  

Whether this will convince other states is highly doubtful, but it is likely that Trump and his administration will not care. Such a move will of course open the floodgates for other states to unilaterally reinterpret arms control treaties in their favour in the same way, and will also likely spur the proliferation of armed drones, further increasing civilian harm.

Collateral Damage: Economics and ethics are casualties in the militarisation of AI

The current government places a central emphasis on technology and innovation in its evolving national security strategy, and wider approach to governance. Labour proposes reviving a struggling British economy through investment in defence with artificial intelligence (AI) featuring as an important component. Starmer’s premiership seems to align several objectives: economic growth, defence industrial development and technological innovation.

Rachel Reeves and John Healey hold roundtable with military company bosses, in front of Reaper drone at RAF Waddington, Feb 2025. Image: MoD

Taken together, these suggest that the government is positioning AI primarily in the context of war and defence innovation. This not only risks undermining the government’s stated ambitions of stability and economic growth; it also entrenches a strategy that prioritises speed over scrutiny, to the neglect of important ethical concerns.

The private defence industry has been positioned as an important pillar of this strategy. Before the Strategic Defence Review (SDR) was published, Chancellor Rachel Reeves and Defence Secretary John Healey initiated a Defence and Economic Growth Task Force to drive UK growth through defence capabilities and production. Arms companies are no longer presented merely as vital for national security but as engines of future prosperity. AI is central to this and is consistently highlighted in government communications: John Healey has explicitly acknowledged that AI will increasingly power the British military, while Keir Starmer has stated that AI ‘will drive incredible change’.

UK focusing AI on military applications

The AI Action Plan, released in January 2025, explicitly links AI to economic growth. Although this included references to ‘responsible use and leadership’, the government has now shifted its emphasis to military applications at the expense of crucial policy areas. On 4 July, the Science and Technology Secretary Peter Kyle wrote to the Alan Turing Institute – Britain’s premier AI research organisation – to refocus research on military applications of AI. The Institute’s prior research agenda spanned environmental sustainability, health and national security; under this new directive its priorities are being fundamentally narrowed.

BAE Systems Project Odyssey uses AI and VR to make training ‘more realistic’. Image: BAE Systems

Relatedly, the Industrial Strategy released by the government aims to ‘embolden’ the UK’s digital and technologies economy, with £500 million to be delivered through a sovereign AI unit – this, however, will be focused on ‘building capacity in the most strategically important areas’. Given Peter Kyle’s redirection and the overwhelming emphasis the government has placed on AI’s productive capacity in war, it seems clear that, as in the case of the Alan Turing Institute, AI research for defence will come at the cost of socially beneficial research.

Take Britain’s bleak economic outlook: sluggish productivity; post-Brexit stagnation; strained public finances; mounting government debt repayments; surging costs of living and inflating house prices. There is little evidence to suggest that defence-led growth will yield impactful returns on this catalogue of challenges. No credible economist is going to advise, in the face of these challenges, that investing in defence and redirecting AI research in the name of national security will give a better return on investment.

Research conducted in America illustrates that investing in health, education, infrastructure and green industry is more likely to give better returns, both on individual incomes and, more broadly, on the country’s prospects. Similarly, Lord Blunkett (a former minister under Blair) pointed out that without GDP growth, raising defence spending as a share of GDP may not increase the actual funding.

Concerning applications to health outcomes, in August the World Economic Forum reported AI’s striking potential in doubling the accuracy of brain scan examination in stroke patients, detecting fractures often missed in overstretched departments, and predicting diseases with high confidence. This is critical given the NHS’s persistent challenges: long waiting times, underfunding, regional inequality, staff shortages and bureaucratic inertia.

Health and economic growth are closely related: healthier individuals are more productive, children attend school more consistently, and preventative care lowers long-term costs – fundamentally, strong health systems add value to the economy and our lives. Yet health is just one example. We are in the embryonic stages of AI development, and by prioritising research on military applications over civilian ones with public value, the government risks undermining, not fuelling, long-term economic growth.

Crucially, framing arms companies as a major engine of economic growth is wildly misleading and economically unfounded. Arms sales account for 0.004% of the Treasury’s total revenue and the defence industrial base accounts for only 1% of UK economic output. The sector is highly monopolised, so the benefits of ‘growth’ are concentrated among a handful of dominant corporations. Even then, the profit generated will not be reinvested in the UK. The biggest arms company in the UK – BAE Systems – is essentially a joint US-UK company with most of its capital invested in the US and with major shareholders that are US investment companies such as BlackRock.

Prioritising speed over scrutiny

Beyond the economics, this is part of a wider strategy that signals a growing dismissal of ethical concerns, prioritising speed over scrutiny. The SDR acknowledged that technology is outpacing regulatory frameworks, noting that ‘the UK’s competitors are unlikely to adhere to common ethical standards’. In April 2025, Matthew Clifford – AI advisor to the PM – was quoted as saying ‘speed is everything’. While the Ministry of Defence promised in 2022 to take an ‘ambitious, safe and responsible’ approach to the development of military AI, the current emphasis on speed sidelines important ethical concerns in the rush for military-technological superiority.

Militarily, the SDR makes plans to invest in drones, autonomous systems and £1 billion for a ‘digital targeting web’. A key foundational principle of International Humanitarian Law is the protection of civilians and their distinction from military targets. An AI-enabled ‘digital targeting web’ – like the one proposed in the SDR – connects sensors and weapons, enabling targets, including people, to be detected and attacked faster. These networks would be able to identify and suggest targets faster than humans ever could, leaving soldiers, in the best case minutes and in the worst case seconds, to decide whether the drone should kill.

Digital Warfare: US and UK forces at the Combined Air Operations Center (CAOC), Al Udeid Air Base, Qatar.

One notable example is the Maven Smart System, recently procured by NATO. According to the US think tank the Center for Security and Emerging Technology, the system enables small armies to make ‘1000 tactical decisions per hour’. Some legal scholars have pointed out that the prioritisation of speed in AI-powered battleground technology raises questions about the preservation of meaningful human control and restraint in warfare. Israeli use of AI-powered automated targeting systems such as ‘Lavender’ during its assault and occupation of Gaza is illustrative of this point. Such systems have been highlighted as one of the factors behind the shockingly high civilian death toll there.

This problem is compounded by recent research showing that large language models ‘hallucinate’, producing outputs that are erroneous or made up. As these systems become embedded within military decision-making chains, the risk of escalation due to technical failure increases dramatically. A false signal, a misread sensor or a corrupted database could lead to erroneous targeting or unintended conflict escalation.

In sum, the UK’s current approach – predominantly framing AI’s utility through the lens of defence – risks squandering its broader social and economic potential. The redirection of public research institutes, the privileging of AI investment in military applications (or so-called ‘strategic areas’) and the emphasis on speed over scrutiny raise serious concerns. Ethically, the erosion of meaningful human control in battlefield decision-making, the risk of AI-driven conflict escalation and the disregard of international humanitarian principles point to a troubling trajectory. The UK risks drifting towards the ethical standards of Russia and Israel in its use of military AI. A government approach to AI grounded in human security (freedom from fear and want), not war, is not only more ethical but far more likely to generate sustainable economic growth for the United Kingdom.

  • Matthew Croft is a postgraduate student at King’s College London studying Conflict, Security and Development with a particular interest in the ethics of national security and the politics of technology.

UK crossing the line as it implements use of AI for lethal targeting under Project Asgard

Despite grave ethical and legal concerns about the introduction of AI into decision making around the use of lethal force, the UK is rapidly pressing ahead with a number of programmes and projects to do so, with the British Army recently trialling a new AI-enabled targeting system called ASGARD as part of a NATO exercise in Estonia in May 2025.

A mock HQ utilising ASGARD at MoD briefing, July 2025. Crown Copyright 2025.

Last week, the Ministry of Defence (MoD) gave a briefing to selected media and industry ‘partners’ on Project ASGARD – which it describes as the UK’s programme to “double the lethality” of the British Army through the use of AI and other technology. ASGARD is not aimed at producing or procuring a particular piece of equipment but rather at developing a communications and decision-making network that uses AI and other technology to vastly increase the speed of undertaking lethal strikes.

ASGARD is part of a £1 billion ‘Digital Targeting Web’ designed to “connect sensors, shooters, and decision-makers” across the land, sea, air, and space domains. “This is the future of warfare,” Maria Eagle, Minister for Defence Procurement and Industry told the gathering. 

According to one reporter present at the briefing, the prototype network “used AI-powered fire control software, low-latency tactical networks, and semi-autonomous target recommendation tools.” 

Janes reported that through ASGARD, “any sensor”, whether it be an unmanned aircraft system (UAS), radar, or human eye, is enabled by AI to identify and prioritise targets and then suggest weapons for destroying them. “Before Asgard it might take hours or even days. Now it takes seconds or minutes to complete the digital targeting chain,” Sir Roly Walker, Head of the British Army told the gathering.

Drones used in conjunction with ASGARD
DART 250EW one-way attack drone
Helsing HX-2 one-way attack drone

While the system currently has a ‘human in the loop’, officials suggested that this could change in future, with The I Paper reporting ‘the system is technically capable of running without human oversight and insiders did not rule out allowing the AI to operate independently if ethical and legal considerations changed.’

How it works

A British Army report after the media event suggested that “Asgard has introduced three new ways of fighting designed to find, strike and blunt enemy manoeuvre:

  • A dismounted data system for use at company group and below.
  • The introduction of the DART 250 One Way Effector. This enables the targeting of enemy infrastructure three times further away than the current UK land-based deep-fires rockets.
  • A mission support network to accelerate what is called the digital targeting or ‘kill’ chain.”

According to a detailed and useful write-up of the Estonia exercise, ASGARD uses existing equipment currently in service alongside new systems including Lattice command and control software from Anduril which provides a ‘mesh network’ for communications, as well as Altra and Altra Strike software from Helsing used to identify and ‘fingerprint’ targets. The report goes on:

“targets were passed to PRIISM which would conduct further development including legal review, collateral damage estimates, and weapon-to-target matching.”  

Helsing’s HX-2 drone was also used during the exercise, another indication that the UK is likely to acquire these one-way attack drones. DART 250, a UK-manufactured jet-powered one-way attack drone with a range of 250 km that can fly at more than 400 km/h, was also deployed as part of the exercise. The manufacturer says that it can fly accurately even when GPS signals are jammed and that it is fitted with a seeker that enables it to home in on and destroy jamming equipment.

AI: speed eroding oversight and accountability

The grave dangers of introducing AI into warfare, and in particular into decisions on the use of force, are by now well known. While arguments have been made for and against these systems for more than a decade, increasingly we are moving from a theoretical, future possibility to the real world: here, now, today.

While some believe almost irrationally in the powers and benefits of AI, in the real world AI-enabled systems remain error-prone and unreliable. AI is far from infallible and relies on training data whose biases have time and again led to serious mistakes.

While systems like ASGARD may be able to locate tanks on an open plain in a well-controlled training environment (see video above), the real world is very different. Most armed conflicts take place not in remote battlefields but in complex and complicated urban environments. Relying on AI to choose military targets in such a scenario is fraught with danger.

Advocates of ASGARD and similar systems argue that the ‘need’ for speed in targeting decisions means that the use of AI brings enormous benefits.  And it is undoubtedly true that algorithms can process data much faster than humans. But speeding up such targeting decisions significantly erodes human oversight and accountability.  Humans in such circumstances are reduced to merely rubber-stamping the output of the machine.

Meanwhile, the Ministry of Defence confirmed that the next phase of ASGARD’s development has received government funding while at the UN, the UK continues to oppose the negotiation of a new legally binding instrument on autonomous weapons systems.

The UK’s forgotten war: British drone strikes continue against ISIS

Three weeks ago, on 10 June, a British Reaper drone began tracking a motorcycle in north-western Syria near the border with Turkey, ridden by someone described by British intelligence as “a known member” of ISIS. The individual, who had apparently been monitored by the drone “for some time”, was killed by a Hellfire missile fired by the drone a short while later.

Aftermath of UK drone strike in NW Syria, 10 June 2025. Image credit: The White Helmets

Local reports from the ground said the man was killed in the blast, with another person injured and taken to hospital. This was the second British drone strike in north-west Syria this year, and the only reason we know about it is that an MoD spokesperson boasted about it to The Sun this weekend.

A forgotten, fitful war

For most, the US/UK war against ISIS in Iraq and Syria has been virtually forgotten. Other awful conflicts – in Ukraine, Gaza and Sudan – have taken our attention over the past two years, not to mention the more recent unlawful Israeli and US bombing of Iran. And in many ways this is understandable. Russia’s illegal invasion of Ukraine, and Hamas’ attack followed by Israel’s ongoing genocidal war on Gaza, have stunned the world.

Yet it should still matter – particularly to the British public, media and parliamentarians – that British forces continue to engage in a seemingly never-ending, fitful war in Syria and Iraq.

MoD secrecy 

In addition, the war gets little attention because the Ministry of Defence (MoD) has decided it will no longer talk about ongoing UK military operations. After a decade of responding to our Freedom of Information (FoI) requests on the UK’s use of Reaper drones, for example, the MoD abruptly began to refuse them at the beginning of 2023, arguing that the changed global situation meant that oversight and transparency had to be curbed. Other organisations, journalists and parliamentary committees have also seen a decline in transparency from the MoD, both about UK military operations and about UK military developments in general.

While the MoD has argued that the ‘geopolitical situation’ means it has to be much more ‘circumspect’, the significant drop in the ability of the media, parliament and the public to scrutinise the ministry and hold the armed forces to account will no doubt be welcomed by it for a variety of reasons.
