Military AI Audit: Congress scrutinises how the US is developing its warfighting AI capabilities


In February, the US Government Accountability Office (GAO), which audits and evaluates government activities on behalf of the US Congress, published a study examining the Department of Defense's approach to developing and deploying artificial intelligence (AI) capabilities in weapon systems and assessing the current status of 'war-fighting' AI in the US military. The GAO report gives an important insight into how the world's most powerful military plans to use AI in combat. It also raises a number of important ethical issues which our own Parliament should be investigating in relation to the UK's military AI programmes.

The GAO study concludes that although the US Department of Defense (DoD) is "actively pursuing AI capabilities," the majority of AI activities supporting warfighting (as opposed to business and maintenance tasks) remain at the research and development stage as the DoD attempts to address the differences between 'AI' and traditional computer software. Research efforts are currently focused on developing autonomy for drones and other uncrewed systems, recognising targets, and providing recommendations to commanders on the battlefield. Reflecting US interest in military AI, the budget for the DoD's Joint AI Center has increased dramatically, from $89 million in 2019 to $278 million in 2021. In total the Joint AI Center has spent approximately $610 million on AI programmes over the past three years, although the GAO considers that it is too soon to assess the effectiveness of this spending. Read more

MoD report urges embrace of human augmentation to fully exploit drones and AI for warfighting


The MoD’s internal think-tank, the Development, Concepts and Doctrine Centre (DCDC), along with the German Bundeswehr Office for Defence Planning (BODP), has published a disturbing new report urging greater investigation of – and investment in – human augmentation for military purposes. The following is a brief summary of the 100+ page document, with a short comment at the end.

‘Human Augmentation – The Dawn of a New Paradigm’ argues that humans are the ‘weakest link’ in modern warfare, and that there is a need to exploit scientific advances to improve human capabilities.

“Increasing use of autonomous and unmanned systems – from the tactical to the strategic level – could significantly increase the combat effect that an individual can bring to bear, but to realise this potential, the interfaces between people and machines will need to be significantly enhanced. Human augmentation will play an important part in enabling this interface.”

Suggested forms of human augmentation to explore for military purposes include the use of brain interfaces, pharmaceuticals and gene therapy. Humans, argues the report, should be seen as a ‘platform’ in the same way as vehicles, aircraft and ships, with three elements of ‘the human platform’ to be developed: the physical, the psychological and the social (see image below). Read more

The iWars Survey: Mapping the IT sector’s involvement in developing autonomous weapons

A new survey by Drone Wars has begun the process of mapping the involvement of information technology corporations in military artificial intelligence (AI) and robotics programmes, an area of rapidly increasing focus for the military.  ‘Global Britain in a Competitive Age’, the recently published integrated review of security, defence, development, and foreign policy, highlighted the key roles that new military technologies will play in the government’s vision for the future of the armed forces and aspirations for the UK to become a “science superpower”.

Although the integrated review promised large amounts of public funding and support for research in these areas, co-operation from the technology sector will be essential in delivering ‘ready to use’ equipment and systems to the military. Senior military figures are aware that ‘Silicon Valley’ is taking the lead in the development of autonomous systems for both civil and military use. Speaking at a NATO-organised conference aimed at fostering links between the armed forces and the private sector, General Sir Chris Deverell, the former Commander of Joint Forces Command, explained:

“The days of the military leading scientific and technological research and development have gone. The private sector is innovating at a blistering pace and it is important that we can look at developing trends and determine how they can be applied to defence and security.”

The Ministry of Defence is actively cultivating technology sector partners to work on its behalf through schemes like the Defence and Security Accelerator (DASA). However, views within the commercial technology sector on co-operation with the military are mixed. Over the past couple of years there have been regular reports of opposition by tech workers to their employers’ military contracts, including at Microsoft and Google. Read more

Humans First: A Manifesto for the Age of Robotics. A review of Frank Pasquale’s ‘New Laws of Robotics’

In 2018, the hashtag #ThankGodIGraduatedAlready began trending on China’s Weibo social media platform.  The tag reflected concerns among Chinese students that schools had begun to install the ‘Class Care System’, developed by the Chinese technology company Hanwang.  Cameras monitor pupils’ facial expressions with deep learning algorithms identifying each student, and then classifying their behaviour into various categories – “focused”, “listening”, “writing”, “answering questions”, “distracted”, or “sleeping”. Even in a country where mass surveillance is common, students reacted with outrage.

There are many technological, legal, and ethical barriers to overcome before machine learning can be widely deployed in such ways, but China, in its push to overtake the US as the world’s leader in artificial intelligence (AI), is racing ahead to introduce such technology before addressing these concerns. And China is not the only culprit.

Frank Pasquale’s book ‘The New Laws of Robotics: Defending Human Expertise in the Age of AI’ investigates the rapidly advancing use of AI and intelligent machines in an era of automation, and uses a wide range of examples – among which the ‘Class Care System’ is far from the most sinister – to highlight the threats that the rush to robotics poses for human societies.  In a world dominated by corporations and governments with a disposition for centralising control, the adoption of AI is being driven by the dictates of neoliberal capitalism, with the twin aims of increasing profit for the private sector and cutting costs in the public sector.  Read more

Intervention ‘without the need to consider the human cost’: MoD thinking on UK’s new drone revealed

Documents obtained by Drone Wars using the Freedom of Information Act (FOI) reveal how British military officials view the UK’s next generation armed drone, known as Protector, and the types of advanced capabilities the aircraft will have. Protector, which is set to replace the UK’s current fleet of armed Reaper drones in the mid-2020s, is essentially SkyGuardian – the latest version of the Predator drone being produced by General Atomics – plus UK modifications. The modifications revealed in the FOI documents (comprising presentations given by UK military personnel at a drone technology conference held last September) are significant because they provide an insight into how the Ministry of Defence (MoD) plans to utilise Protector. Looking more widely, Protector epitomises the second drone age, characterised by a global expansion in both the types of drones being used by states and the scale of operations, including in the domestic sphere. Read more

Drone Strikes in Popular Culture: Eye in the Sky

Examining how popular culture discusses and presents drone warfare is increasingly important today, as public understanding of drone warfare is developed through movies, novels, TV and other cultural forms as much as it is through more traditional news media. Popular culture representation of drone warfare helps to circulate and amplify political ideas about what drones are, how drones are used, and what is ethically and politically at stake.

Take, for example, Gavin Hood’s 2015 film Eye in the Sky, in which civilian and military authorities disagree over the ethics of authorizing a drone strike against an al-Shabab cell planning an imminent suicide attack. Eye in the Sky’s ethical debate is structurally analogous to the ticking bomb scenario, a misleading yet very popular narrative which articulates a defence of extreme violence in ‘emergency’ conditions. As a consequence, the movie frames the moral quandaries of drone warfare in such a way that on the one hand, a Hellfire strike seems to be a simple military necessity and, on the other hand, many of the most important and controversial aspects of drone warfare are left unexplored. Read more