Collateral Damage: Economics and ethics are casualties in the militarisation of AI

The current government places a central emphasis on technology and innovation in its evolving national security strategy and wider approach to governance. Labour proposes reviving a struggling British economy through investment in defence, with artificial intelligence (AI) featuring as an important component. Starmer’s premiership seems to align several objectives: economic growth, defence industrial development and technological innovation.

Rachel Reeves and John Healey hold roundtable with military company bosses, in front of Reaper drone at RAF Waddington, Feb 2025. Image: MoD

Taken together, these suggest that the government is positioning AI primarily in the context of war and defence innovation. This not only risks undermining the government’s stated ambitions of stability and economic growth but also signals a strategy that prioritises speed over scrutiny, to the neglect of important ethical concerns.

The private defence industry has been positioned as an important pillar of this strategy. Before the Strategic Defence Review (SDR) was published, Chancellor Rachel Reeves and Defence Secretary John Healey initiated a Defence and Economic Growth Task Force to drive UK growth through defence capabilities and production. Arms companies are no longer presented merely as vital for national security but as engines of future prosperity. AI is central to this, and is consistently highlighted in government communications: John Healey has explicitly acknowledged that AI will increasingly power the British military, whilst Keir Starmer has stated that AI ‘will drive incredible change’.

UK focusing AI on military applications

The AI Action Plan, released in January 2025, explicitly links AI to economic growth. Although this included references to ‘responsible use and leadership’, the government has since shifted its emphasis to military applications at the expense of crucial policy areas. On 4 July, the Science and Technology Secretary Peter Kyle wrote to the Alan Turing Institute – Britain’s premier AI research organisation – to refocus its research on military applications of AI. The Institute’s prior research agenda spanned environmental sustainability, health and national security; under this new directive, its priorities are being fundamentally narrowed.

BAE Systems Project Odyssey uses AI and VR to make training ‘more realistic’. Image: BAE Systems

Relatedly, the Industrial Strategy released by the government aims to ‘embolden’ the UK’s digital and technologies economy, with £500 million to be delivered through a sovereign AI unit – this, however, will be focused on ‘building capacity in the most strategically important areas’. Given Peter Kyle’s redirection and the overwhelming emphasis the government has placed on AI’s productive capacity in war, it is clear that defence-focused AI research will come at the cost of socially beneficial research of the kind the Alan Turing Institute once pursued.

Take Britain’s bleak economic outlook: sluggish productivity; post-Brexit stagnation; strained public finances; mounting government debt repayments; surging costs of living and inflating house prices. There is little evidence to suggest that defence-led growth will yield impactful returns on this catalogue of challenges. Faced with them, no credible economist would advise that investing in defence and redirecting AI research in the name of national security offers the best return on investment.

Research conducted in America illustrates that investment in health, education, infrastructure and green technology is more likely to yield better returns, both for individual incomes and for the country’s broader prospects. Similarly, Lord Blunkett (a former minister under Blair) pointed out that without GDP growth, raising defence spending as a share of GDP may not increase actual funding.

On health specifically, in August the World Economic Forum reported AI’s striking potential: doubling accuracy in examining the brains of stroke patients, detecting fractures often missed in overstretched departments and predicting diseases with high confidence. This is critical given the NHS’s persistent challenges: long waiting times, underfunding, regional inequality, staff shortages and bureaucratic inertia.

Health and economic growth are closely related: healthier individuals are more productive, children attend school more consistently, and preventative care lowers long-term costs – fundamentally, strong health systems add value to the economy and our lives. Yet health is just one example. We are in the embryonic stages of AI development, and by prioritising research on military applications over civilian ones with public value, the government risks undermining, not fuelling, long-term economic growth.

Crucially, framing arms companies as a major engine of economic growth is wildly misleading and economically unfounded. Arms sales account for 0.004% of the Treasury’s total revenue, and the defence industrial base accounts for only 1% of UK economic output. The sector is highly monopolised, so the benefits of ‘growth’ are concentrated among a handful of dominant corporations. Even then, the profit generated will not be reinvested in the UK. The biggest arms company in the UK – BAE Systems – is essentially a joint US-UK company, with most of its capital invested in the US and its majority shareholders being US investment companies such as BlackRock.

Prioritising speed over scrutiny

Beyond the economics, this is part of a wider strategy that signals a growing dismissal of ethical concerns, prioritising speed over scrutiny. The SDR acknowledged that technology is outpacing regulatory frameworks, noting that ‘the UK’s competitors are unlikely to adhere to common ethical standards’. In April 2025, Matthew Clifford – AI advisor to the PM – was quoted as saying ‘speed is everything’. While the Ministry of Defence promised in 2022 to take an ‘ambitious, safe and responsible’ approach to the development of military AI, the current emphasis on speed sidelines important ethical concerns in the rush for military-technological superiority.

Militarily, the SDR sets out plans to invest in drones and autonomous systems, including £1 billion for a ‘digital targeting web’. A foundational principle of International Humanitarian Law is the protection of civilians and their distinction from military targets. An AI-enabled ‘digital targeting web’ – like the one proposed in the SDR – connects sensors and weapons, enabling faster detection and faster killing. These networks would be able to identify and suggest targets faster than humans ever could, leaving soldiers minutes at best, and seconds at worst, to decide whether a drone should kill.

Digital Warfare: US and UK forces at the Combined Air Operations Center (CAOC), Al Udeid Air Base, Qatar.

One notable example is the Maven Smart System, recently procured by NATO. According to the US think tank the Center for Security and Emerging Technology, the system enables small armies to make ‘1,000 tactical decisions per hour’. Some legal scholars have pointed out that the prioritisation of speed within AI-powered battlefield technology raises questions about the preservation of meaningful human control and restraint in warfare. Israeli use of AI-powered automated targeting systems such as ‘Lavender’ during its assault on and occupation of Gaza is illustrative of this point. Such systems have been highlighted as one of the factors behind the shockingly high civilian death toll there.

This problem is compounded by recent research showing that large language models ‘hallucinate’ – producing erroneous or fabricated outputs. As these systems become embedded within military decision-making chains, the risk of escalation due to technical failure increases dramatically. A false signal, a misread sensor or a corrupted database could lead to erroneous targeting or unintended conflict escalation.

In sum, the UK’s current approach – predominantly framing AI’s utility through the lens of defence – risks squandering its broader social and economic potential. The redirection of public research institutes, the privileging of AI investment in military applications (or so-called ‘strategic areas’) and the emphasis on speed over scrutiny raise serious concerns. Ethically, the erosion of meaningful human control in battlefield decision-making, the risk of AI-driven conflict escalation and the disregard of international humanitarian principles point to a troubling trajectory. The UK risks drifting towards the ethical standards of Russia and Israel in its use of military AI. A government approach to AI grounded in human security (freedom from fear and want), not war, is not only more ethical but far more likely to generate sustainable economic growth for the United Kingdom.

  • Matthew Croft is a postgraduate student at King’s College London studying Conflict, Security and Development, with a particular interest in the ethics of national security and the politics of technology.

RAF’s new armed drone given approval to fly freely over UK  

Protector RG1 flying over RAF Waddington. Crown Copyright.

The UK’s Military Aviation Authority (MAA) has issued ‘Military Type Certification’ to the UK’s new ‘Protector’ armed drone, meaning that it is now free to fly within UK airspace, including over populated areas.

Previously, for safety reasons, Protector and other large uncrewed systems such as its predecessor, the Reaper, were only allowed to fly in segregated airspace, with other aircraft excluded. Although large military drones are spreading rapidly, as Drone Wars has documented, they continue to tumble out of the skies for a whole variety of reasons.

The UK is the first country to certify a large drone to fly freely in unsegregated airspace, and General Atomics, the manufacturer of the drone – which it calls MQ-9B SkyGuardian rather than using the UK designation ‘Protector RG1’ – was delighted, as the decision has huge implications for its sales. The company’s press release called it “a seminal achievement.” A key element of the approval, alongside “rigorous testing”, was apparently the ‘rigid separation’ of mission software from flight-critical software.

Protector flights in the UK

The Protector has been undertaking a short series of test flights around RAF Waddington, the home of UK drone warfare, over the past few weeks. The Aviationist noted two tests in the past week which were the longest so far, including one which saw the drone fly to RAF Marham before returning to Waddington. RAF Marham is the nominated diversion airfield for the drone.

General Atomics reported that 10 of the 16 Protector drones ordered have now been delivered to the UK, but it is not clear if these are all at RAF Waddington, as previous drones that had ‘been delivered’ to the RAF remained in the US for testing and trials. The UK is increasingly secretive about its drone operations, and exact details about when Protector is to come into service have been given only vaguely as ‘by the end of 2025’. Reaper is also expected to exit service by the end of the year.

Protector test and training flights are now likely to expand both in number and in range, including flights to launch weapons at Holbeach Air Weapons Range, near Boston in The Wash. Protector carries the Paveway IV guided bomb and Brimstone 3 missiles.

The Ministry of Defence has always been clear that Protector will also be available to support counter-terrorism operations within the UK and undertake Military Aid to Civilian Authorities (MACA) tasks such as assisting HM Coastguard with search and rescue missions.

Outdragon revealed: UK secretly using US signal intelligence pod on drone operations

US MQ-9 Reaper drone carrying surveillance pod flying over a Polish base.  Credit: The Aviationist

Drone Wars UK can reveal that British armed Reaper drones have secretly been equipped with a US intelligence gathering capability called ‘Outdragon’ since around 2019.

Signal Intelligence (SIGINT) pods on US Reaper and Predator drones have been used to geolocate, track and kill individuals via signals from mobile phones, wireless routers or other communication devices using a variety of systems developed by intelligence agencies with codenames such as Airhandler and Gilgamesh.

In response to our FoI requests on the capability, the Ministry of Defence is refusing to confirm or deny any information other than the existence of a 2019 contract to integrate it with UK Reaper drones.

The existence of Outdragon and its use by the UK was confirmed by the (possibly mistaken) publication online of a series of MoD maintenance forms relating to the UK’s new MQ-9 ‘Protector’ drone.

Image from: Flying Log and Fatigue Data Sheet – MOD Form 725(Protector RG-1)(AV)

Documents released by Edward Snowden show that UK AIRHANDLER missions are developed and controlled from the UK’s Joint Service Signals Unit (JSSU) at RAF Digby, which is the nearest military base to the home of UK drone warfare, RAF Waddington. A 2017 Intercept article, based on documents from Snowden, showed that US and British intelligence officials worked “side by side” at the base using AIRHANDLER with UK Reaper drones to gather data and develop near real-time intelligence for military and intelligence operations.

The Next Wave: the emergence of drones at sea


In recent months maritime drones have hit the news headlines as they are increasingly deployed in conflict hot-spots around the world’s seas.  The war in Ukraine, tensions in the ocean around China, and most recently armed attacks on shipping in the Arabian Gulf and Red Sea have all been characterised by the use of various types of drones – uncrewed aircraft, drone boats, unpowered marine ‘glider’ craft, and underwater vehicles.

Our new study, ‘The Next Wave’, investigates the development and use of maritime drones and the likely future implications of their use in combat. While uncrewed boats have long been used in warfare – with the US Navy first using uncrewed underwater vehicles (UUVs) for mine clearance in the 1990s – today maritime drones are used by an increasing number of states and non-state groups. The study reviews the reasons why, summarises developments by the major military powers and the UK, and examines a set of case studies to identify how drones have been used during different types of conflict at sea.

Why drones at sea?

Maritime drones are a fraction of the cost of a conventional destroyer or submarine and represent a new vision of naval warfare that exchanges small numbers of high-value military assets for large numbers of cheaper, flexible, and simpler platforms which, working together, have a greater overall capability.  In this vision, platforms can be modular, able to carry a number of payloads such as weapons, sensors, or smaller drones depending on the mission, and work as a connected network using artificial intelligence computing methods to stay in touch with other members of the fleet and with human controllers.  An adversary would be overwhelmed with a multitude of small targets instead of a few large warships.

Drones can gather information about the ocean more cheaply than larger crewed vessels, and may also be able to reach areas that would be inaccessible for a larger ship.  They are not bounded by the physiological limitations of human personnel and can undertake assignments that humans find demanding, such as deep diving or an extended submarine mission.  They are also more easily able to loiter undetected than a larger ship, allowing data to be collected over a longer time period, and can also allow potentially dangerous objects to be examined remotely, reducing risks.

Within the world’s vast oceans, certain locations are particularly strategically important for both military and civilian purposes. These include the Arabian Gulf, the Red Sea, areas around disputed islands in the South and East China Seas, the Greenland-Iceland-UK Gap, the Baltic Sea and the English Channel.  These areas often represent choke points and are both crowded with marine traffic and focal points for concentrations of underwater infrastructure.  Drone networks are an attractive option for military planners when undertaking surveillance and reconnaissance operations in such areas.

Military development of maritime drones

The world’s major military powers are all keen to develop drones for use in warfare, recognising the military potential of new technologies, and have all begun research and investment into next-generation weaponry and technologies such as artificial intelligence (AI) and uncrewed and autonomous systems.  China, Russia and the US and its NATO allies have a highly competitive relationship in these fields, and are actively developing such capabilities, including systems for use in the maritime domain, with Russia lagging somewhat behind China and the US.

Like the larger powers, the UK is keen to exploit the potential of uncrewed and autonomous technology for military purposes. The Royal Navy sees maritime autonomous systems as a major component of its future fleet, operating on and under the sea and in the air on both front-line and logistics and support tasks. To date, the UK has used uncrewed technologies to undertake routine tasks such as survey work and dangerous operations such as minesweeping; in the longer term it aims to automate and roboticise many of the roles of its capital ships and to equip them with uncrewed aerial, surface, and undersea vehicles that contribute to a low-cost weaponised sensor network. The systems currently deployed by the UK are still mainly small-scale and/or experimental, and the sums of money involved have been relatively modest.

Cyborg Dawn?  Human-machine fusion and the future of warfighting


Soldiers who see in the dark, communicate telepathically, or fly a drone by thought alone all sound like characters from a science fiction film. Yet research projects investigating all these possibilities are under way in laboratories and research centres around the globe, part of an upsurge of interest in human enhancement enabled largely by expanding knowledge in the field of neuroscience: the study of the human brain and nervous system.

To help in understanding the possibilities and hazards posed by human enhancement technology, Drone Wars UK is publishing ‘Cyborg Dawn?’, a new study investigating the military use of human augmentation.

Human enhancement – a medical or biological intervention to the body designed to improve performance, appearance, or capability beyond what is necessary to achieve, sustain or restore health – may lead to fundamentally new concepts of warfare and can be expected to play a role in enabling the increased use of remotely operated and uncrewed systems in war.

Although military planners are eager to create ‘super soldiers’, the idea of artificially modifying humans to give them capabilities beyond their natural abilities presents significant moral, legal, and health risks. The field of human augmentation is fraught with danger, and without stringent regulation, neurotechnologies and genetic modification will lead us to an increasingly dangerous future where technology encourages and accelerates warfare. The difficulties are compounded by the dual-use nature of human augmentation, where applications with legitimate medical uses could equally be used to further the use of remote lethal military force. There is currently considerable discussion about the dangers of ‘killer robot’ autonomous weapon systems, but it is also time to start discussing how to control human enhancement and cyborg technologies which military planners intend to develop.

The UK and the Ukraine War: Drones vs Diplomacy

Custom-built British ‘suicide drone’ reportedly bound for Ukraine. Pic: QinetiQ

The UK is to supply Ukraine with “hundreds of new long-range attack drones” a government spokesperson told the media on Monday as the Prime Minister Rishi Sunak welcomed President Volodymyr Zelenskiy to Britain for a brief visit.

“Today the prime minister will confirm the further UK provision of hundreds of air defence missiles and further unmanned aerial systems including hundreds of new long-range attack drones with a range of over 200km. These will all be delivered over the coming months as Ukraine prepares to intensify its resistance to the ongoing Russian invasion.”

It is not at all clear what these ‘long-range attack drones’ are, although there have been some reports of the UK funding the development of a ‘suicide drone’ to supply to Ukraine.

This latest news comes on top of the announcement in the last few weeks that the UK is supplying Storm Shadow cruise missiles to Ukraine following the export of UK Challenger 2 tanks.

Some will no doubt welcome the supply of attack drones and cruise missiles to Ukraine as a counter to Russia’s military aggression. It goes without saying that Russia’s invasion of Ukraine and continuing use of lethal force is unlawful and must be resisted. However, there are real questions to be asked about how a strategy of supplying ever more lethal military hardware risks expanding rather than ending this war. It is becoming increasingly easy to see the UK and other NATO countries being drawn more directly into an armed conflict with Russia. Any such escalation would be disastrous for the people of Ukraine and the wider region, as well as seriously risking a catastrophic nuclear event.

Rather than escalating the conflict by supplying ever more lethal arms, the UK should be urging negotiations to end the war as it is inevitable that this will have to happen at some point.  While some western military analysts urge that the war should be prolonged in order to weaken Russia in the long term, Ukraine and its people suffer.

Negotiations are of course a matter for the Ukrainian people, but it should be remembered that a settlement seemingly came very close last March, with a Turkish-backed plan for Russian forces to withdraw to their pre-24 February positions without Ukraine giving up its claim to any of its territory. Unfortunately the moment passed, with suggestions that the then British PM Boris Johnson personally lobbied Zelenskiy to reject the plan (for more on this see Ukraine One Year On: Time to Negotiate Peace).

While it is easy for the current PM to grab a few headlines and play to the crowd by supplying lethal attack drones to Ukraine, the harder but more rewarding long-term work of diplomacy to end this awful war is being neglected.