Drone footage shows ‘manifestly unlawful’ US strike on civilians; Trump vows to rip up drone treaty

The US killed 11 people in a reported drone strike on a small boat in the Caribbean Sea on 3 September. Although it has not been confirmed that the strike was carried out by a drone, President Trump shared drone footage of the strike on his social media. In August it was revealed that Trump had secretly signed a directive ordering the Pentagon to begin military operations against drug cartels.

Screen grab from drone video shared by President Trump.

While US officials alleged that the targeted boat was being used by members of the Tren de Aragua cartel to transport drugs, multiple legal scholars and experts have argued that the strike was “manifestly unlawful.”

Professor Luke Moffett of Queen’s University Belfast told the BBC that while “force can be used to stop a boat, generally this should be non-lethal measures.” Any use of force must be “reasonable and necessary in self-defence where there is immediate threat of serious injury or loss of life to enforcement officials.” The US and other states regularly stop boats in international waters as part of law enforcement activity without resorting to the use of lethal force.

Much more significant, however, is the grave violation of international law that is the deliberate, premeditated targeting of civilians. Claire Finkelstein, professor of national security law at the University of Pennsylvania, said: “There’s no authority for this whatsoever under international law. It was not an act of self-defense. It was not in the middle of a war. There was no imminent threat to the United States.” Finkelstein went on to make the clear and obvious connection between the strike and the ongoing, two-decade-long US drone targeted killing programme, which has significantly blurred the lines between law enforcement and armed conflict.

While the US alleges that the occupants of the boat were members of an organised criminal gang, and President Trump and other administration officials have begun to talk publicly about the threat of ‘narco-terrorists’, that in no way makes the targets of this strike combatants under the laws of war. While civilians are regularly and persistently the victims of drone and air strikes, the deliberate targeting of non-combatants is still shocking.

New York University law professor Ryan Goodman, who previously worked as a lawyer in the Pentagon, told the New York Times that “It’s difficult to imagine how any lawyers inside the Pentagon could have arrived at a conclusion that this was legal rather than the very definition of murder under international law rules that the Defense Department has long accepted.”

In the aftermath of the strike and under questioning by the media, administration officials struggled to justify the legality of the strike, resorting to arguing that it was a matter of self-defence. Significantly, senior officials said that further such operations were likely.

Trump and the MTCR

Meanwhile, President Trump is reportedly returning to a plan formulated during his first administration to overturn controls on the export of US armed drones. As we reported, Trump attempted in 2020 to get the other state signatories of the Missile Technology Control Regime (MTCR) to accept that Predator/Reaper-type drones should be moved out of the most strongly controlled group (Category I) into the lesser group (Category II). Other states, however, gave this short shrift, much to Trump’s annoyance.

According to the Reuters report, the new move involves “designating drones as aircraft… rather than missile systems”, which would enable the US to “sidestep” its treaty obligations. The move will aid US plans to sell hundreds of armed drones to Saudi Arabia, the UAE and Qatar.

Whether this will convince other states is highly doubtful, but Trump and his administration are unlikely to care. Such a move will of course open the floodgates for other states to unilaterally reinterpret arms control treaties in their favour in the same way, and will also likely spur the proliferation of armed drones, which will only further increase civilian harm.

UK crossing the line as it implements use of AI for lethal targeting under Project Asgard

Despite grave ethical and legal concerns about the introduction of AI into decision making around the use of lethal force, the UK is rapidly pressing ahead with a number of programmes and projects to do so, with the British Army recently trialling a new AI-enabled targeting system called ASGARD as part of a NATO exercise in Estonia in May 2025.

A mock HQ utilising ASGARD at MoD briefing, July 2025. Crown Copyright 2025.

Last week, the Ministry of Defence (MoD) gave a briefing to selected media and industry ‘partners’ on Project ASGARD – which it describes as the UK’s programme to “double the lethality” of the British Army through the use of AI and other technology. ASGARD is not aimed at producing or procuring a particular piece of equipment but rather at developing a communications and decision-making network that uses AI and other technology to vastly increase the speed of undertaking lethal strikes.

ASGARD is part of a £1 billion ‘Digital Targeting Web’ designed to “connect sensors, shooters, and decision-makers” across the land, sea, air, and space domains. “This is the future of warfare,” Maria Eagle, Minister for Defence Procurement and Industry, told the gathering.

According to one reporter present at the briefing, the prototype network “used AI-powered fire control software, low-latency tactical networks, and semi-autonomous target recommendation tools.” 

Janes reported that through ASGARD, “any sensor”, whether it be an unmanned aircraft system (UAS), radar, or human eye, is enabled by AI to identify and prioritise targets and then suggest weapons for destroying them. “Before Asgard it might take hours or even days. Now it takes seconds or minutes to complete the digital targeting chain,” Sir Roly Walker, Head of the British Army, told the gathering.

Drones used in conjunction with ASGARD:
  • DART 250EW one-way attack drone
  • Helsing HX-2 one-way attack drone

While the system currently has a ‘human in the loop’, officials suggested that this could change in future, with The i Paper reporting that ‘the system is technically capable of running without human oversight and insiders did not rule out allowing the AI to operate independently if ethical and legal considerations changed.’

How it works

A British Army report after the media event suggested that “Asgard has introduced three new ways of fighting designed to find, strike and blunt enemy manoeuvre:

  • A dismounted data system for use at company group and below.
  • The introduction of the DART 250 One Way Effector. This enables the targeting of enemy infrastructure at three times the range of the UK’s current land-based deep-fires rockets.
  • A mission support network to accelerate what is called the digital targeting or ‘kill’ chain.”

According to a detailed and useful write-up of the Estonia exercise, ASGARD uses existing equipment currently in service alongside new systems, including Lattice command and control software from Anduril, which provides a ‘mesh network’ for communications, as well as Altra and Altra Strike software from Helsing, used to identify and ‘fingerprint’ targets. The report goes on:

“targets were passed to PRIISM which would conduct further development including legal review, collateral damage estimates, and weapon-to-target matching.”  

Helsing’s HX-2 drone was also used during the exercise, another indication that the UK is likely to acquire these one-way attack drones. DART 250, a UK-manufactured jet-powered one-way attack drone with a range of 250 km that can fly at more than 400 km/h, was also deployed as part of the exercise. The manufacturer says that it can fly accurately even when GPS signals are jammed and that it is fitted with a seeker that enables it to home in on and destroy jamming equipment.
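To make the shape of the reported ‘digital targeting chain’ more concrete, the sketch below models its main steps in Python: a classified sensor detection is filtered by confidence, passed through a human authorisation gate (standing in for the legal review and collateral damage estimate said to happen in PRIISM), and then matched to a weapon. It is purely illustrative: every name, threshold and rule here is our own assumption, not ASGARD, Lattice, Altra or PRIISM code.

```python
# A minimal, illustrative model of the kind of "digital targeting chain"
# described in the reports above. Everything here is hypothetical.
from dataclasses import dataclass
from typing import Optional

@dataclass
class Detection:
    sensor: str        # e.g. "UAS", "radar", "human eye"
    signature: str     # what the classifier believes it is seeing
    confidence: float  # model confidence, 0.0 to 1.0
    range_km: float    # distance to the detected object, in km

def match_weapon(d: Detection) -> str:
    """Hypothetical weapon-to-target matching: long-range targets get a
    one-way effector, closer targets conventional deep fires."""
    return "one-way effector" if d.range_km > 80 else "deep fires"

def human_review(d: Detection) -> bool:
    """The 'human in the loop'. In the reported architecture, legal review
    and collateral damage estimation sit at this stage; here it is reduced
    to a single yes/no prompt to show how thin the gate can become."""
    answer = input(f"Authorise strike on '{d.signature}' "
                   f"({d.confidence:.0%} confidence)? [y/N] ")
    return answer.strip().lower() == "y"

def targeting_chain(d: Detection) -> Optional[str]:
    """One pass through the chain: filter by confidence, seek human
    authorisation, then suggest a weapon. Returns None if dropped."""
    if d.confidence < 0.9:      # illustrative confidence threshold
        return None             # discard low-confidence detections
    if not human_review(d):     # the authorisation gate
        return None
    return match_weapon(d)

if __name__ == "__main__":
    detection = Detection(sensor="UAS", signature="armoured vehicle",
                          confidence=0.95, range_km=120)
    print(targeting_chain(detection))
```

Note that in a pipeline of this shape the entire human contribution sits in a single gate between detection and weapon matching; as The i Paper report above suggests, removing that gate would be a small change to such an architecture, which is precisely what concerns critics.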

AI: speed eroding oversight and accountability

The grave dangers of introducing AI into warfare, and in particular into decisions on the use of force, are by now well known. While arguments have been made for and against these systems for more than a decade, increasingly we are moving from a theoretical, future possibility to the real world: here, now, today.

While some believe almost irrationally in the powers and benefits of AI, in the real world AI-enabled systems remain error-prone and unreliable. AI is far from infallible and relies on training data whose biases have time and time again led to serious mistakes.

While systems like ASGARD may be able to locate tanks on an open plain in a well-controlled training exercise environment (see video above), the real world is very different. Most armed conflicts do not take place on remote battlefields but in complex urban environments. Relying on AI to choose military targets in such a scenario is fraught with danger.

Advocates of ASGARD and similar systems argue that the ‘need’ for speed in targeting decisions means that the use of AI brings enormous benefits.  And it is undoubtedly true that algorithms can process data much faster than humans. But speeding up such targeting decisions significantly erodes human oversight and accountability.  Humans in such circumstances are reduced to merely rubber-stamping the output of the machine.

Meanwhile, the Ministry of Defence confirmed that the next phase of ASGARD’s development has received government funding while at the UN, the UK continues to oppose the negotiation of a new legally binding instrument on autonomous weapons systems.

The Strategic Defence Review and Drone Warfare: Questioning a Dangerous Consensus

While there appears to be a consensus between mainstream political parties, officials and defence commentators that a significant increase in spending on drone and military AI systems would be a positive development, there are serious questions about the basis on which this decision is being made and the likely impact on global security.

New military technology in general, and uncrewed systems in particular, are being presented by politicians and the media as a quick and simple, cost-effective way for the armed forces to increase ‘mass’ and ‘lethality’ without having to procure hugely expensive kit that can take years to produce. Drones are also seen as an alternative to deploying troops in significant numbers at a time when recruitment has become increasingly difficult.

However, far from aiding security, increased spending on drones, autonomous weapons and other emerging military technology will simply lead to a further degrading of UK and global security. Remote and autonomous military systems lower the threshold for the use of armed force, making it much easier for state and non-state groups alike to engage in armed attack. Such systems encourage war as the first rather than the last option.

KEY QUESTIONS

Does the war in Ukraine really demonstrate that ‘drones are the future’?
  • It seems to be taken for granted that the ongoing war in Ukraine has demonstrated the effectiveness of drone and autonomous warfare and that therefore the UK must ‘learn the lesson’ and increase funding for such technology. However, while drones are being used extensively by both Russia and Ukraine – and causing very substantial numbers of casualties – it is far from clear that they are having any strategic impact.
  • Larger drones such as the Turkish Bayraktar TB2 operated by Ukraine – hailed as the saviour of Ukraine at the beginning of the war – and Russia’s Orion MALE armed drone have virtually disappeared from the skies above the battlefield as they are easily shot down. Larger one-way attack (sometimes called ‘suicide’) drones are being fired at each other’s major cities by both sides and are causing considerable harm. While these strikes are mainly for propaganda effect, again it is not clear that they will change the outcome of the war.
  • Short-range surveillance/attack drones are being used very extensively on the battlefield, and the emergence in particular of First Person View (FPV) drones carrying out attacks on troops and vehicles has been a significant development. However, countermeasures such as electronic jamming mean that thousands of these drones are simply lost or crash. In many ways, drone warfare in Ukraine has become a long-term ‘cat and mouse’ fight between drones and counter-drone measures, and this is only likely to continue.
Is ‘cutting edge military technology’ a silver bullet for UK Defence?
  • The capabilities of future military systems are frequently overstated and regularly underdelivered. Slick industry videos showcasing new weapons are more often than not the product of graphic designers’ creative imaginings rather than real-world demonstrations of a new capability.

    The hype surrounding trials of so-called ‘swarming drones’ is a good example. There is a world of difference between a ‘drone swarm’ in its true, techno-scientific meaning and a group of drones being deployed at the same time. A true drone swarm sees individual systems flying autonomously, communicating with each other and following a set of rules without a central controller. While manufacturers and militaries regularly claim they are testing or trialling ‘a drone swarm’, in reality they are just operating a group of drones at the same time, controlled by a group of operators.

  • While there have been considerable developments in the field of AI and machine learning over the past decade, the technology is still far from mature. Anyone using a chatbot, for example, will quickly discover that there can be serious mistakes in the generated output. Trusting data generated by AI systems in a military context, without substantial human oversight and checking, is likely to result in very serious errors. The need for ongoing human oversight of AI systems is likely to render any financial or human-resource savings from using AI virtually redundant.
Will funding new autonomous drones actually keep us safe?
  • Perhaps the key question about plans to invest heavily in future military AI and drone warfare is whether it will actually keep the UK safe. Just over a decade ago, armed drones were the preserve of only three states: the US, the UK and Israel. Today, many states and non-state groups are using armed drones to launch remote attacks, resulting in large numbers of civilian casualties. In essence, as they enable both states and non-state groups to engage in armed attack with little or no risk to themselves, remote and autonomous drones lower the threshold for the use of armed force, making warfare much more likely.
  • Given the global proliferation of such technology, it seems inevitable that any new developments in drone warfare funded by the UK over the next few years will proliferate and be used by other state and non-state groups. In many ways, it seems only a matter of time before drone warfare comes to the UK.
  • Rather than funding the development of new lethal autonomous drones, the UK should be at the forefront of efforts to curb and control the use of these systems, working with other states, NGOs and international experts to put in place globally accepted rules to control their proliferation and use.
Is the development and use of autonomous weapons inevitable?
  • Although the realm of science fiction until relatively recently, plans are now being developed by a number of states, including the UK, to develop and deploy lethal autonomous weapon systems. It is highly likely that the first fully autonomous weapons will be a drone-based system.
  • The real issue here is not the development of AI itself, but the way it is used. Autonomy raises a wide range of ethical, legal, moral and political issues relating to human judgement, intentions, and responsibilities. These questions remain largely unresolved and there should therefore be deep disquiet about the rapid advance towards developing AI weapons systems.
  • While some argue that the development of these systems is inevitable, there are a range of measures which could be used to prevent it, including establishing international treaties and norms, developing confidence-building measures, introducing international legal instruments, and adopting unilateral control measures. Given how much we have seen drone warfare spread and create global insecurity over the past decade, now is the time for the UK to be fully involved in international discussions to control the development of lethal fully autonomous weapon systems.

Read more

Companies vying for share of purloined aid budget as UK plans to spend big on drones and military tech

Keir Starmer visits drone factory. Credit: Reuters

While behind-the-scenes wrangling over the final details of the latest Strategic Defence Review continues, the overall message is crystal clear: the UK intends to significantly increase military spending. To enable this, there have already been a number of government decisions designed to make funds available, in particular for new weapons technology and programmes.

In November, the Defence Secretary announced he was cutting a number of ‘outdated’ military programmes (including the infamous Watchkeeper drone) to make funds available for new military technology. The Chief of the Defence Staff, Admiral Tony Radakin argued that “accelerating the disposal of legacy equipment is the logical approach to focus on the transition to new capabilities that better reflect changing technology and tactics.”

In a more ambitious money grab, PM Keir Starmer announced that he was cutting the UK’s aid budget to help increase military spending to 2.5% of GDP, and said he would use the released funds to “accelerate the adoption of cutting-edge capabilities.” Starmer argued that the aid cuts would mean an extra £13.4bn of military spending per year from 2027. Others, however, argued that in real terms the increase would be around £6bn per year. Many noted that, whatever the boost to UK military spending, the cuts would significantly harm the world’s poorest people.

Finally, there has been a concerted effort to ensure that banks, pension funds and other big investors – who have accepted that military companies should be excluded from ethical investment portfolios – get back in line and ensure that military companies have full access to all their funds. The government, it seems, is adamant that private as well as public funds are made available to such companies. Not unrelated to this move, universities are also coming under pressure to crack down on opposition to military company recruitment on campus.

Which drone companies are likely to benefit?

A number of newer and older military companies are likely to benefit from the coming increase in military spending and, in anticipation, we have seen a surge in the stock prices of many of the companies involved. While drones and related technology are only one part of the increase in military spending, a number of companies in this area are likely to benefit.

Helsing

Helsing is a new company set up by three AI experts in Berlin in 2021. Its website states that it was “founded to put ethics at the core of defence technology development” and insists that “artificial intelligence will be the key capability to keep liberal democracies from harm”.

HX-2 one-way attack drones stocked at Helsing factory

One of the company’s first products is the HX-2 attack drone. HX-2 is a metre-long, electrically propelled X-wing, one-way attack drone with a range of up to 100 km. The company says that on-board AI enables it to resist jamming and that multiple HX-2s can be assembled into swarms. The drone has been designed to be mass-producible, and Helsing announced in February 2025 that it had set up the first of what it is calling its ‘resilience factories’ in southern Germany to mass-produce 6,000 of the drones for Ukraine. Janes reported in December 2024 that Helsing was to set up a factory in the UK and it is highly likely that the UK will order the HX-2 drone.

Anduril

Palmer Luckey with Anduril’s Fury drone

Although a little older than Helsing, Anduril too is a relatively new player in the defence industry. Co-founded in 2017 by technology entrepreneur Palmer Luckey, the company (named after a sword in The Lord of the Rings) and its co-founder have been subject to an extraordinary amount of adulatory media coverage.

The UK has already awarded Anduril a number of contracts, including a £30m deal in March 2025 to supply the Altius 600m and Altius 700m drones to Ukraine, and it too announced this week plans to open a drone factory in the UK. Anduril is one of two companies left in the competition to supply the US Air Force with a new category of drone called the Collaborative Combat Aircraft (CCA). The UK too wants to acquire this type of drone to work in conjunction with its F-35 fighter aircraft and future Tempest combat aircraft. Anduril also works closely with another US AI tech company, Palantir, in the development of AI-enabled intelligence and ‘battle-management’ systems similar in vein to Israel’s ‘Lavender’ and ‘Gospel’ systems. This too is an area that the UK is likely to want to fund.

BAE Systems

BAE Systems’ latest concept model for the UK’s ‘Autonomous Collaborative Platform’

The opposite of a newcomer, BAE Systems has a long history of being the main beneficiary of UK military spending. Research by CAAT showed that between 2012 and 2023, the company had more meetings with British prime ministers than any other private company.

With a track record of involvement in the development of drones, including the UK’s experimental Taranis combat drone, BAE Systems is keen to position itself at the forefront of the development of uncrewed autonomous systems. It has showcased its designs for the UK’s Autonomous Collaborative Platforms (ACP) programme – the UK’s equivalent to the US Collaborative Combat Aircraft (CCA) – and it continues to trial its Phasa-35 high-altitude surveillance drone.

Alongside this, BAE has quietly bought up a number of smaller, niche military drone companies – including Prismatic, Malloy Aeronautics and Callen-Lenz – to acquire their designs and expertise, and has signed an agreement with QinetiQ to collaborate on the development of drone technology.  Read more

Drone Wars urges transparency and public oversight of UK military operations at Information Tribunal

Drone Wars team at the Tribunal

Drone Wars appeared before a two-day information tribunal this week seeking to overturn the decision of the Ministry of Defence (MoD) to end the release of statistical information on the use of Reaper drones and other armed aircraft on military operations.  This information has been a crucial way for the public and parliament to have oversight of UK military action and, without the data, it will be much harder to hold the UK to account.

The MoD has been responding to our Freedom of Information Act (FOIA) requests for statistical information since 2010.  In January 2023, the MoD abruptly ended the practice, arguing that the information could not be provided due to exemptions under Section 23 (Security bodies) or Section 24 (National Security), and Section 26 (Defence) of the Act.

Section 23 and Section 24 are used as ‘alternatives’ to disguise which exemption is being relied upon: that is, whether the information comes from or relates to Special Forces, the intelligence services or similar bodies, or whether withholding it is needed to protect ‘national security’.

The hearing took place partly in open session with Drone Wars present, but it also went into closed session, from which we were excluded, in order to hear evidence in secret.

In response to our appeal, the MoD provided a witness statement from Group Captain Redican, Deputy Assistant Chief of Staff for the Joint Air Force Component and someone with direct experience of Operation Shader.  Although the full witness statement contains 50 numbered paragraphs, only 15 were visible to us; the rest were redacted.

Documents also disclosed to us in the run-up to the Tribunal revealed that, at an earlier stage of our appeal, the Information Commissioner had asked the MoD to provide evidence that disclosure of the statistics to Drone Wars had caused harm or prejudice to the UK.  The MoD wrote to the Information Commissioner:

“The information previously released cannot be directly linked to harm to UK forces in its current operating environment (predominantly Operation Shader) however it has revealed capability details for a system that is capable of use on global operations where the threat environment may be significantly different against a more sophisticated adversary.”

In his witness statement, Gp. Capt Redican stated:

“I am aware that the MoD has previously provided responses to similar requests issued by Mr Cole.  The MoD now seeks to withhold information which it was previously content to disclose.  This is due to the changing national and security context, detailed further below.” [note the following four paragraphs were redacted]

Asked to explain what he meant by the ‘changing national and security context’, Redican explained that, following Russia’s invasion of Ukraine, the UK was now preparing for ‘State on State’ warfare rather than the use of armed force against non-state groups.  At the same time, he went on, the situation in the Middle East had changed since the beginning of Operation Shader: Iran, which had previously engaged in the same task of opposing ISIS, “was a major actor in that theatre, and their actions are contrary to British interests.”

Drone Wars strongly argued that the statistical data that we sought was simply not capable of providing insight into ‘techniques, tactics and procedures’ at the level of detail which could cause prejudice to the UK as claimed, but instead gave a broad overview which enabled public oversight.

MoD ‘drawing a line in the sand’

Redican argued that it was not about “the specifics of information” but that “a line in the sand had to be drawn somewhere”.  He went on: “at some point we have to set a new precedent. We are going to have to begin to protect our capabilities more and more.”  Read more

Autonomous Collaborative Platforms: The UK’s New Autonomous Drones 

BAE Systems concept for Tier 2 ACP

Following on from the MoD’s Defence Drone Strategy released in February (see our report here), the RAF has now published its ‘Autonomous Collaborative Platform Strategy’ as it works to develop, produce and deploy this new type of military drone.

The strategy defines Autonomous Collaborative Platforms (ACP) as types of uncrewed systems (drones) “which demonstrate autonomous behaviour and are able to operate in collaborative manner with other assets.”  The strategy argues that Reaper and the (soon-to-enter-service) Protector drones “are vulnerable in warfighting conflicts involving peer or near-peer adversary. Therefore, as a priority the RAF needs to go beyond RPAS [Remotely Piloted Air Systems] to develop ACP capabilities.”

The plan argues that “through increasing use of autonomy, remote mission operators (commanders /supervisors) will be able to command an increasing number of AV [drones] within each ACP system.”

Underpinning the development is the notion that the “geopolitical climate demands that we move beyond the caution of the post-cold war world” and that therefore the RAF must “undertake activity in areas that are demanding, difficult or overtly hostile.”  While the Strategy sets out a variety of tasks for these new drones, it makes clear that a key focus is on “overwhelming an adversary’s air defences.”  ACP are therefore not a defensive system, but are designed from the outset to enable the UK to engage in attack.

Tiers for Fears

The strategy sets out three ‘Tiers’ of ACP based on their ability to survive in “high-risk” (i.e. defended) environments:

  • Tier 1 are disposable drones, with a life-cycle of one or very few missions;
  • Tier 2 are “attritable” (or “risk tolerant”) drones – that is, expected to survive, but losses are acceptable;
  • Tier 3 are drones of high strategic value, whose loss would significantly affect how the RAF will fight.

Diagram from Autonomous Collaborative Platform Strategy

Echoing the words of the Chief of the Air Staff, Sir Richard Knighton, before the Defence Select Committee earlier this year, the document states that a Tier 1 ACP will be operational “by the end of 2024”, while Tier 2 systems will be part of the RAF combat force by 2030.  Read more