Drone footage shows ‘manifestly unlawful’ US strike on civilians; Trump vows to rip up drone treaty

The US killed 11 people in a reported drone strike on a small boat in the Caribbean Sea on 3 September. Although it has not been confirmed that the strike was carried out by a drone, President Trump shared drone footage of the strike on his social media. In August it was revealed that Trump had secretly signed a directive ordering the Pentagon to begin military operations against drug cartels.

Screen grab from drone video shared by President Trump.

While US officials alleged that the boat targeted was carrying drugs being transported by members of the Tren de Aragua cartel, multiple legal scholars and experts have argued that the strike was “manifestly unlawful.”

Professor Luke Moffett of Queen’s University Belfast told the BBC that while “force can be used to stop a boat, generally this should be non-lethal measures.” Any use of force must be “reasonable and necessary in self-defence where there is immediate threat of serious injury or loss of life to enforcement officials.” The US and other states regularly stop boats in international waters as part of law enforcement activity without resorting to the use of lethal force.

Far more significant, however, is the grave violation of international law that is the deliberate, premeditated targeting of civilians. Claire Finkelstein, professor of national security law at the University of Pennsylvania, said: “There’s no authority for this whatsoever under international law. It was not an act of self-defense. It was not in the middle of a war. There was no imminent threat to the United States.” Finkelstein went on to make the clear and obvious connection between the strike and the ongoing, two-decade-long US programme of targeted drone killings, which has significantly blurred the lines between law enforcement and armed conflict.

While the US alleges that the occupants of the boat were members of an organised criminal gang, and President Trump and other administration officials have begun to talk publicly about the threat of ‘narco-terrorists’, that in no way makes the targets of this strike combatants under the laws of war. While civilians are regularly and persistently the victims of drone and air strikes, the deliberate targeting of non-combatants is still shocking.

New York University law professor Ryan Goodman, who previously worked as a lawyer in the Pentagon, told the New York Times that “It’s difficult to imagine how any lawyers inside the Pentagon could have arrived at a conclusion that this was legal rather than the very definition of murder under international law rules that the Defense Department has long accepted.”

In the aftermath of the strike and questioning by the media, administration officials struggled to justify the legality of the strike, resorting to arguing that it was a matter of self-defence. Significantly, senior officials said that further such operations were likely.

Trump and the MTCR

Meanwhile, President Trump is reportedly returning to a plan formulated during his first administration to overturn controls on the export of US armed drones. Trump attempted in 2020, as we reported, to get the other state signatories of the Missile Technology Control Regime (MTCR) to accept that Predator/Reaper-type drones should be moved out of the most strongly controlled group (Category I) into the lesser group (Category II). Other states, however, gave this short shrift, much to Trump’s annoyance.

According to the Reuters report, the new move involves “designating drones as aircraft… rather than missile systems”  which will enable the US to then “sidestep” its treaty obligations. The move will aid US plans to sell hundreds of armed drones to Saudi Arabia, UAE and Qatar.  

Whether this will convince other states is highly doubtful, but it is likely that Trump and his administration will not care. Such a move will of course open the floodgates for other states to unilaterally reinterpret arms control treaties in their favour in the same way, and will also likely spur the proliferation of armed drones, which will only further increase civilian harm.

UK crossing the line as it implements use of AI for lethal targeting under Project Asgard

Despite grave ethical and legal concerns about the introduction of AI into decision making around the use of lethal force, the UK is rapidly pressing ahead with a number of programmes and projects to do so, with the British Army recently trialling a new AI-enabled targeting system called ASGARD as part of a NATO exercise in Estonia in May 2025.

A mock HQ utilising ASGARD at MoD briefing, July 2025. Crown Copyright 2025.

Last week, the Ministry of Defence (MoD) gave a briefing to selected media and industry ‘partners’ on Project ASGARD – which it describes as the UK’s programme to “double the lethality” of the British Army through the use of AI and other technology. ASGARD is not aimed at producing or procuring a particular piece of equipment but rather at developing a communications and decision-making network that uses AI and other technology to vastly increase the speed of undertaking lethal strikes.

ASGARD is part of a £1 billion ‘Digital Targeting Web’ designed to “connect sensors, shooters, and decision-makers” across the land, sea, air, and space domains. “This is the future of warfare,” Maria Eagle, Minister for Defence Procurement and Industry told the gathering. 

According to one reporter present at the briefing, the prototype network “used AI-powered fire control software, low-latency tactical networks, and semi-autonomous target recommendation tools.” 

Janes reported that through ASGARD, “any sensor”, whether it be an unmanned aircraft system (UAS), radar, or human eye, is enabled by AI to identify and prioritise targets and then suggest weapons for destroying them. “Before Asgard it might take hours or even days. Now it takes seconds or minutes to complete the digital targeting chain,” Sir Roly Walker, Head of the British Army told the gathering.

Drones used in conjunction with ASGARD
DART 250EW one-way attack drone
Helsing HX-2 one-way attack drone

While the system currently has a ‘human in the loop’, officials suggested that this could change in future, with The I Paper reporting that ‘the system is technically capable of running without human oversight and insiders did not rule out allowing the AI to operate independently if ethical and legal considerations changed.’

How it works

A British Army report after the media event suggested that Asgard has introduced three new ways of fighting designed to find, strike and blunt enemy manoeuvre:

  • A dismounted data system for use at company group and below.
  • The introduction of the DART 250 One Way Effector. This enables the targeting of enemy infrastructure three times further away than the UK’s current land-based deep-fires rockets.
  • A mission support network to accelerate what is called the digital targeting or ‘kill’ chain.

According to a detailed and useful write-up of the Estonia exercise, ASGARD uses existing equipment currently in service alongside new systems including Lattice command and control software from Anduril which provides a ‘mesh network’ for communications, as well as Altra and Altra Strike software from Helsing used to identify and ‘fingerprint’ targets. The report goes on:

“targets were passed to PRIISM which would conduct further development including legal review, collateral damage estimates, and weapon-to-target matching.”  

Helsing’s HX-2 drone was also used during the exercise and is another indication that the UK is likely to acquire these one-way attack drones. DART 250, a UK-manufactured jet-powered one-way attack drone with a range of 250 km that can fly at more than 400 km/h, was also deployed as part of the exercise. The manufacturer says that it can fly accurately even when GPS signals are jammed and that it is fitted with a seeker that enables it to home in on and destroy jamming equipment.

AI: speed eroding oversight and accountability

The grave dangers of introducing AI into warfare, and in particular into decisions on the use of force, are by now well known. While arguments have been made for and against these systems for more than a decade, increasingly we are moving from a theoretical, future possibility to the real world: here, now, today.

While some believe almost irrationally in the powers and benefits of AI, in the real world AI-enabled systems remain error-prone and unreliable. AI is far from infallible and relies on training data whose biases have led, time and time again, to serious mistakes.

While systems like ASGARD may be able to locate tanks on an open plain in a well-controlled training exercise environment, the real world is very different. Most armed conflicts do not take place on remote battlefields but in complex and complicated urban environments. Relying on AI to choose military targets in such a scenario is fraught with danger.

Advocates of ASGARD and similar systems argue that the ‘need’ for speed in targeting decisions means that the use of AI brings enormous benefits.  And it is undoubtedly true that algorithms can process data much faster than humans. But speeding up such targeting decisions significantly erodes human oversight and accountability.  Humans in such circumstances are reduced to merely rubber-stamping the output of the machine.

Meanwhile, the Ministry of Defence confirmed that the next phase of ASGARD’s development has received government funding, while at the UN the UK continues to oppose the negotiation of a new legally binding instrument on autonomous weapons systems.

The UK’s forgotten war: British drone strikes continue against ISIS

Three weeks ago, on June 10, a British Reaper drone in north-western Syria, near the border with Turkey, began tracking a motorcycle ridden by someone described by British intelligence as “a known member” of ISIS. The individual, who had apparently been monitored by the drone “for some time”, was killed a short while later by a Hellfire missile fired from the drone.

Aftermath of UK drone strike in NW Syria, 10 June 2025. Image credit: The White Helmets

Local reports from the ground said the man was killed in the blast, with another person also injured and taken to hospital. This was the second British drone strike in north-west Syria this year, and the only reason we know about it is that an MoD spokesperson boasted about it to The Sun this weekend.

A forgotten, fitful war

For most, the US/UK war against ISIS in Iraq and Syria has been virtually forgotten. Other awful conflicts – in Ukraine, Gaza and Sudan – have taken our attention over the past two years, not to mention the more recent unlawful Israeli and US bombing of Iran. And in many ways this is understandable. Russia’s illegal invasion of Ukraine, and Hamas’ attack followed by Israel’s ongoing genocidal war on Gaza, have stunned the world.

Yet it should still matter – particularly to the British public, media and parliamentarians – that British forces continue to engage in a seemingly never-ending, fitful war in Syria and Iraq.

MoD secrecy 

In addition, the war gets little attention because the Ministry of Defence (MoD) has decided it will no longer talk about ongoing UK military operations. After a decade of responding to our Freedom of Information (FoI) requests on the UK’s use of Reaper drones, for example, the MoD abruptly began to refuse them at the beginning of 2023, arguing that the changed global situation meant that oversight and transparency had to be curbed. Other organisations, journalists and parliamentary committees have also seen a decline in transparency from the MoD, both about UK military operations and about UK military developments in general.

While the MoD has argued that the ‘geopolitical situation’ means it has to be much more ‘circumspect’, the significant drop in the ability of the media, parliament and the public to scrutinise the MoD and hold the armed forces to account will no doubt be welcomed by the department for a variety of reasons.

Read more

The Strategic Defence Review and Drone Warfare: Questioning a Dangerous Consensus

While there appears to be a consensus between mainstream political parties, officials and defence commentators that a significant increase in spending on drone and military AI systems would be a positive development, there are serious questions about the basis on which this decision is being made and the likely impact on global security.

New military technology in general, and uncrewed systems in particular, are being presented by politicians and the media as a quick and simple, cost-effective way for the armed forces to increase ‘mass’ and ‘lethality’ without having to procure hugely expensive kit that can take years to produce. Drones are also seen as an alternative to deploying troops in significant numbers at a time when recruitment has become increasingly difficult.

However, far from aiding security, increased spending on drones, autonomous weapons and other emerging military technology will simply lead to a further degrading of UK and global security. Remote and autonomous military systems lower the threshold for the use of armed force, making it much easier for state and non-state groups alike to engage in armed attack. Such systems encourage war as the first rather than the last option.

KEY QUESTIONS

Does the war in Ukraine really demonstrate that ‘drones are the future’?
  • It seems to be taken for granted that the ongoing war in Ukraine has demonstrated the effectiveness of drone and autonomous warfare and that therefore the UK must ‘learn the lesson’ and increase funding for such technology. However, while drones are being used extensively by both Russia and Ukraine – and causing very substantial numbers of casualties – it is far from clear that they are having any strategic impact.
  • Larger drones such as the Turkish Bayraktar TB2 operated by Ukraine – hailed as the saviour of Ukraine at the beginning of the war – and Russia’s Orion MALE armed drone have virtually disappeared above the battlefield as they are easily shot down. Larger one-way attack (sometimes called ‘suicide’) drones are being fired at each other’s major cities by both sides and are causing considerable harm. While these strikes are mainly for propaganda effect, again it is not clear if this will change the outcome of the war.
  • Short-range surveillance/attack drones are being used very extensively on the battlefield, and the development in particular of First Person View (FPV) drones to carry out attacks on troops and vehicles has been significant. However, countermeasures such as electronic jamming mean that thousands of these drones are simply lost or crash. In many ways, drone warfare in Ukraine has become a long-term ‘cat and mouse’ fight between drones and counter-drone measures, and this is only likely to continue.
Is ‘cutting edge military technology’ a silver bullet for UK Defence?
  • The capabilities of future military systems are frequently overstated and regularly under-delivered. Slick industry videos showcasing new weapons are more often than not the product of graphic designers’ creative imaginings rather than real-world demonstrations of a new capability.

    The hype surrounding trials of so-called ‘swarming drones’ is a good example. There is a world of difference between a ‘drone swarm’ in its true, techno-scientific meaning and a group of drones being deployed at the same time. A true drone swarm sees individual systems flying autonomously, communicating with each other and following a set of rules without a central controller. While manufacturers and militaries regularly claim they are testing or trialling ‘a drone swarm’, in reality they are just operating a group of drones at the same time, controlled by a group of operators.

  • While there have been considerable developments in the field of AI and machine learning over the past decade, the technology is still far from mature. Anyone using a chatbot, for example, will quickly discover that there can be serious mistakes in the generated output. Trusting data generated by AI systems in a military context, without substantial human oversight and checking, is likely to result in very serious errors. The need for ongoing human oversight of AI systems is likely to render any financial or human-resource savings from using AI virtually redundant.
Will funding new autonomous drones actually keep us safe?
  • Perhaps the key question about plans to heavily invest in future military AI and drone warfare is whether it will actually keep the UK safe. Just over a decade ago, armed drones were the preserve of just three states: the US, the UK and Israel. Today, many states and non-state groups are using armed drones to launch remote attacks, resulting in large numbers of civilian casualties. In essence, as they enable both states and non-state groups to engage in armed attack with little or no risk to themselves, remote and autonomous drones lower the threshold for the use of armed force, making warfare much more likely.
  • Given the global proliferation of such technology, it seems inevitable that any new developments in drone warfare funded by the UK over the next few years will proliferate and be used by other state and non-state groups. In many ways, it seems only a matter of time before drone warfare comes to the UK.
  • Rather than funding the development of new lethal autonomous drones, the UK should be at the forefront of efforts to curb and control the use of these systems, working with other states, NGOs and international experts to put in place globally accepted rules to control their proliferation and use.
Is the development and use of autonomous weapons inevitable?
  • Although the realm of science fiction until relatively recently, plans are now being developed by a number of states, including the UK, to develop and deploy lethal autonomous weapon systems. It is highly likely that the first fully autonomous weapons will be a drone-based system.
  • The real issue here is not the development of AI itself, but the way it is used. Autonomy raises a wide range of ethical, legal, moral and political issues relating to human judgement, intentions, and responsibilities. These questions remain largely unresolved and there should therefore be deep disquiet about the rapid advance towards developing AI weapons systems.
  • While some argue that the development of these systems is inevitable, there are a range of measures that could be used to prevent it, including establishing international treaties and norms, developing confidence-building measures, introducing international legal instruments, and adopting unilateral controls. Given how much we have seen drone warfare spread and create global insecurity over the past decade, now is the time for the UK to be fully involved in international discussions to control the development of lethal fully autonomous weapon systems.

Read more

RAF’s new armed drone given approval to fly freely over UK  

Protector RG1 flying over RAF Waddington. Crown Copyright.

The UK’s Military Aviation Authority (MAA) has issued ‘Military Type Certification’ to the UK’s new ‘Protector’ armed drone, meaning that it is now free to fly within UK airspace, including over populated areas.

Previously, for safety reasons, Protector and other large uncrewed systems such as the Protector’s predecessor, the Reaper, were only allowed to fly in segregated airspace, with other aircraft excluded.  Although large military drones are spreading rapidly, as Drone Wars has documented they continue to tumble out of the skies for a whole variety of reasons.

The UK is the first country to certify a large drone to fly freely in unsegregated airspace, and General Atomics, the manufacturer of the drone – which it calls MQ-9B SkyGuardian rather than the UK designation ‘Protector RG1’ – was delighted, as the certification has huge implications for its sales. The company’s press release called it “a seminal achievement.” A key element of the approval, alongside “rigorous testing”, was apparently the ‘rigid separation’ of mission software from flight-critical software.

Protector flights in the UK

The Protector has been undertaking a short series of test flights around RAF Waddington, the home of UK drone warfare, over the past few weeks. The Aviationist noted two tests in the past week which were of the longest duration so far, including one which saw the drone land at RAF Marham before taking off again and returning to Waddington. RAF Marham is the nominated diversion airfield for the drone.

General Atomics reported that 10 of the 16 Protector drones ordered had now been delivered to the UK, but it is not clear if these are all at RAF Waddington, as previous drones that had ‘been delivered’ to the RAF remained in the US for testing and trials. The UK is increasingly secretive about its drone operations, and exact details of when Protector is to come into service have been given only vaguely as ‘by the end of 2025’. Reaper is also expected to exit service by the end of the year.

Protector test and training flights are now likely to expand both in number and in range, including flights to launch weapons at Holbeach Air Weapons Range, near Boston in The Wash. Protector carries the Paveway IV guided bomb and Brimstone 3 missiles.

The Ministry of Defence has always been clear that Protector will also be available to support counter-terrorism operations within the UK and to undertake Military Aid to the Civil Authorities (MACA) tasks such as assisting HM Coastguard with search and rescue missions.

Read more

Companies vying for share of purloined aid budget as UK plans to spend big on drones and military tech

Keir Starmer visits drone factory. Credit: Reuters

While behind-the-scenes wrangling over the final details of the latest Strategic Defence Review continues, the overall message is crystal clear: the UK intends to significantly increase military spending. To enable this, there have already been a number of government decisions designed to make funds available, in particular for new weapons technology and programmes.

In November, the Defence Secretary announced he was cutting a number of ‘outdated’ military programmes (including the infamous Watchkeeper drone) to make funds available for new military technology. The Chief of the Defence Staff, Admiral Tony Radakin argued that “accelerating the disposal of legacy equipment is the logical approach to focus on the transition to new capabilities that better reflect changing technology and tactics.”

In a more ambitious money grab, PM Keir Starmer announced that he was cutting the UK’s aid budget to help increase military spending to 2.5% of GDP, and said he would use the released funds to “accelerate the adoption of cutting-edge capabilities.” Starmer argued that the aid cuts would mean an extra £13.4bn of military spending per year from 2027. Others, however, argued that in real terms the increase would be around £6bn per year. Many noted that, whatever the boost to UK military spending, the cuts would significantly harm the world’s poorest people.

Finally, there has been a concerted effort to ensure that banks, pension funds and other big investors – who had accepted that military companies should be excluded from ethical investment portfolios – get back in line and ensure that military companies have full access to their funds. The government, it seems, is adamant that private as well as public funds are made available to such companies. Not unrelated to this move, universities are also coming under pressure to crack down on opposition to military company recruitment on campus.

Which drones companies are likely to benefit?

A number of newer and older military companies are likely to benefit from the coming increase in military spending and, in anticipation, we have seen a surge in the stock prices of many of the companies involved. While drones and related technology are only one part of the increase in military spending, a number of companies in this area are likely to benefit.

Helsing

Helsing is a new company set up by three AI experts in Berlin in 2021. Its website states that it was “founded to put ethics at the core of defence technology development” and insists that “artificial intelligence will be the key capability to keep liberal democracies from harm”.

HX-2 one-way attack drones stocked at Helsing factory

One of the company’s first products is the HX-2 attack drone. The HX-2 is a metre-long, electrically propelled, X-wing, one-way attack drone with a range of up to 100 km. The company says that on-board AI enables it to resist jamming and that multiple HX-2s can be assembled into swarms. The drone has been designed to be mass-producible, and Helsing announced in February 2025 that it had set up the first of what it calls its ‘resilience factories’ in southern Germany to mass-produce 6,000 of the drones for Ukraine. Janes reported in December 2024 that Helsing was to set up a factory in the UK, and it is highly likely that the UK will order the HX-2 drone.

Anduril

Palmer Luckey with Anduril’s Fury drone

Although a little older than Helsing, Anduril too is a relatively new player in the defence industry. Co-founded in 2017 by technology entrepreneur Palmer Luckey, the company (named after a sword in The Lord of the Rings) and its co-founder have been the subject of an extraordinary amount of adulatory media coverage.

The UK has already awarded Anduril a number of contracts, including a £30m deal in March 2025 to supply Altius 600M and Altius 700M drones to Ukraine, and it too announced this week plans to open a drone factory in the UK. Anduril is one of two companies left in the competition to supply the US Air Force with a new category of drone called the Collaborative Combat Aircraft (CCA). The UK too wants to acquire this type of drone to work in conjunction with its F-35 fighter aircraft and future Tempest combat aircraft. Anduril also works closely with another US AI tech company, Palantir, on the development of AI-enabled intelligence and ‘battle-management’ systems similar in vein to Israel’s ‘Lavender’ and ‘Gospel’ systems. This too is an area that the UK is likely to want to fund.

BAE Systems

BAE Systems’ latest concept model for the UK’s ‘Autonomous Collaborative Platform’

The opposite of a newcomer, BAE Systems has a long history of being the main beneficiary of UK military spending. Research by CAAT showed that between 2012 and 2023, the company had more meetings with British prime ministers than any other private company.

With a track record of involvement in the development of drones, including the UK’s experimental Taranis combat drone, BAE Systems is keen to position itself at the forefront of the development of uncrewed autonomous systems. It has showcased its designs for the UK’s Autonomous Collaborative Platforms (ACP) programme – the UK’s equivalent to the US Collaborative Combat Aircraft (CCA) – and it continues to trial its PHASA-35 high-altitude surveillance drone.

Alongside this, BAE has quietly bought up a number of smaller, niche military drone companies – including Prismatic, Malloy Aeronautics and Callen-Lenz – to acquire new designs and expertise, and has signed an agreement with QinetiQ to collaborate on the development of drone technology.

Read more