Military AI Audit: Congress scrutinises how the US is developing its warfighting AI capabilities


In February, the US Government Accountability Office (GAO), which audits and evaluates government activities on behalf of the US Congress, published a study examining the Department of Defense’s approach to developing and deploying artificial intelligence (AI) capabilities in weapon systems and assessing the current status of ‘war-fighting’ AI in the US military. The GAO report gives an important insight into how the world’s most powerful military plans to use AI in combat. It also raises a number of important ethical issues which our own Parliament should be investigating in relation to the UK’s military AI programmes.

The GAO study concludes that although the US Department of Defense (DoD) is “actively pursuing AI capabilities,” the majority of AI activities supporting warfighting (as opposed to business and maintenance tasks) remain at the research and development stage as the DoD attempts to address the differences between ‘AI’ and traditional computer software. Research efforts are currently focused on developing autonomy for drones and other uncrewed systems, recognising targets, and providing recommendations to commanders on the battlefield. Reflecting US interest in military AI, the budget for the DoD’s Joint AI Center has increased dramatically from $89 million in 2019 to $278 million in 2021. In total the Joint AI Center has spent approximately $610 million on AI programmes over the past three years, although the GAO considers that it is too soon to assess the effectiveness of this spending.

None too clever? Military applications of artificial intelligence

Drone Wars UK’s latest briefing looks at where and how artificial intelligence is currently being applied in the military context and considers the legal, ethical, operational and strategic risks posed.


Artificial Intelligence (AI), automated decision making, and autonomous technologies have already become common in everyday life and offer immense opportunities to dramatically improve society.  Smartphones, internet search engines, AI personal assistants, and self-driving cars are among the many products and services that rely on AI to function.  However, like all technologies, AI also poses risks if it is poorly understood, unregulated, or used in inappropriate or dangerous ways.

In current AI applications, machines perform a specific task for a specific purpose.  The umbrella term ‘computational methods’ may be a better way of describing such systems, which fall far short of human intelligence but have wider problem-solving capabilities than conventional software.  Hypothetically, AI may eventually be able to perform a range of cognitive functions, respond to a wide variety of input data, and understand and solve any problem that a human brain can.  Although this is a goal of some AI research programmes, it remains a distant prospect.

AI does not operate in isolation, but functions as the ‘backbone’ of a broader system, helping that system achieve its purpose. Users do not ‘buy’ the AI itself; they buy products and services that use AI, or upgrade a legacy system with new AI technology. Autonomous systems – machines able to execute a task without human input – rely on AI computing systems to interpret information from sensors and then signal actuators, such as motors, pumps, or weapons, to act on the environment around the machine.
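The sense–interpret–act pattern described above can be sketched as a toy loop. Everything here is illustrative – the function names, threshold, and sensor readings are invented for this sketch and are not drawn from any real military or commercial system:

```python
# A minimal sketch of the autonomous-system pattern the text describes:
# sensor data is interpreted by a decision component, which then issues
# commands to an actuator. All names and values are hypothetical.

def decide(reading: float, threshold: float = 0.5) -> str:
    """Toy 'AI' decision step: map a sensor reading to an actuator command."""
    return "activate" if reading > threshold else "hold"

def run_loop(readings: list[float]) -> list[str]:
    """Interpret each sensor reading in turn and emit the matching command."""
    return [decide(r) for r in readings]

commands = run_loop([0.2, 0.7, 0.9])
print(commands)  # ['hold', 'activate', 'activate']
```

The point of the sketch is that the ‘intelligence’ is only one component in a pipeline: the quality of the sensors feeding it and the consequences of the actuators it drives are what determine its real-world impact.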

Reclaiming the technology juggernaut: A review of Azeem Azhar’s ‘Exponential’

  • Azeem Azhar, Exponential: How Accelerating Technology Is Leaving us Behind and What to Do About It, Cornerstone, 2021

The central message of Azeem Azhar’s new book, ‘Exponential’, is that technology is a force that humanity can direct, rather than a force which will enslave us.  This may seem optimistic, given the alarmingly fast rate of change which new technologies are bringing about in the world, but as well as explaining in clear terms why these changes are happening so fast and why this is a problem, the book also sets out a manifesto for how we can match technology to meet human needs and begin to address some of the social impacts of rapid change.

‘Exponential’ identifies four key technology domains which form the bedrock of the global economy and where capabilities are accelerating at ever-increasing rates while, at the same time, costs are plummeting. The four technologies are computer science, where improvements are driven by faster processors and access to vast data sets; energy, where renewables are causing the price of generating power to drop rapidly; the life sciences, where gene sequencing and synthetic biology are allowing us to develop novel biological components and systems; and manufacturing, where 3D printing is enabling the rapid, localised production of anything from a concrete building to plant-based steaks. These are all ‘general purpose technologies’: just like electricity, the printing press, and the car, they have broad utility and the potential to change just about everything.

However, while these technologies are taking off at an exponential rate, society has been unable to keep up.  Businesses, laws, markets, working patterns, and other human institutions have at the same time been able to evolve only incrementally and are struggling to adapt.  Azhar calls this the ‘exponential gap’ – the rift between the potential of the technologies and the different types of management that they demand.  Understanding the exponential gap can help explain why we are now facing technology-induced problems like market domination by ‘winner takes all’ businesses such as Amazon, the gig economy, and the spread of misinformation on social media.

The book details the impacts of the exponential growth in technology on business and employment as well as on geopolitical issues such as trade, conflict, and the global balance of power. It shows how the ‘exponential gap’ is shaping relations between citizens and society through the power of tech giants, which increasingly provide platforms for our conversations and relationships while collecting and commodifying data about us in order to manipulate our choices.

MoD report urges embrace of human augmentation to fully exploit drones and AI for warfighting


The MoD’s internal think-tank, the Development, Concepts and Doctrine Centre (DCDC), along with the German Bundeswehr Office for Defence Planning (BODP), has published a disturbing new report urging greater investigation of – and investment in – human augmentation for military purposes. The following is a brief summary of the 100+ page document, with a short comment at the end.

‘Human Augmentation – The Dawn of a New Paradigm’ argues that humans are the ‘weakest link’ in modern warfare, and that there is a need to exploit scientific advances to improve human capabilities.

“Increasing use of autonomous and unmanned systems – from the tactical to the strategic level – could significantly increase the combat effect that an individual can bring to bear, but to realise this potential, the interfaces between people and machines will need to be significantly enhanced. Human augmentation will play an important part in enabling this interface.”

Human augmentation suggested for exploration for military purposes includes the use of brain interfaces, pharmaceuticals, and gene therapy. Humans, argues the report, should be seen as a ‘platform’ in the same way as vehicles, aircraft, and ships, with three elements of ‘the human platform’ to be developed: the physical, the psychological, and the social (see image below).

The iWars Survey: Mapping the IT sector’s involvement in developing autonomous weapons

A new survey by Drone Wars has begun the process of mapping the involvement of information technology corporations in military artificial intelligence (AI) and robotics programmes, an area of rapidly increasing focus for the military.  ‘Global Britain in a Competitive Age’, the recently published integrated review of security, defence, development, and foreign policy, highlighted the key roles that new military technologies will play in the government’s vision for the future of the armed forces and aspirations for the UK to become a “science superpower”.

Although the integrated review promised large amounts of public funding and support for research in these areas, co-operation from the technology sector will be essential in delivering ‘ready to use’ equipment and systems to the military. Senior military figures are aware that ‘Silicon Valley’ is taking the lead in the development of autonomous systems for both civil and military use. Speaking at a NATO-organised conference aimed at fostering links between the armed forces and the private sector, General Sir Chris Deverell, the former Commander of Joint Forces Command, explained:

“The days of the military leading scientific and technological research and development have gone. The private sector is innovating at a blistering pace and it is important that we can look at developing trends and determine how they can be applied to defence and security.”

The Ministry of Defence is actively cultivating technology-sector partners to work on its behalf through schemes like the Defence and Security Accelerator (DASA). However, views within the commercial technology sector on co-operation with the military are mixed. Over the past couple of years there have been regular reports of opposition by tech workers to their employers’ military contracts, including those at Microsoft and Google.

Drone Wars at Ten #3: What’s next? A peek at the future

In this final post to mark our 10th birthday, I want to peer a little into the future, looking at what we are facing in relation to drone warfare in the coming years. Of course, predicting the future is always a little foolish – perhaps especially so in the middle of a global pandemic – but four areas of work are already fairly clear: public accountability over the deployment of armed drones; the push to open UK skies to military drones; monitoring the horizontal and vertical proliferation of military drones; and opposing the development of lethal autonomous weapons, aka ‘killer robots’.