Artificial intelligence is transforming the way war is waged in the 21st century, and militaries need to integrate the technology further, a report has suggested.
Experts at the Royal United Services Institute (Rusi) and QinetiQ, a security and defence contractor, said military commanders in the future should be ready to work with AI to better respond to crises.
The report laid out how trust in AI among military officials, political leaders and the public is essential for it to flourish in the defence arena.
The authors argued that the natural instinct of human operators is to intervene “even in situations where the AI is better placed to make a decision”.
“AI is already transforming warfare and challenging long-standing human habits,” the report stated.
“By embracing greater experimentation in training and exercises, and by exploring alternative models for C2 [command and control], defence can better prepare for the inevitable change that lies ahead.
“The traditional response to the acceptance challenge posed by the military use of AI has been to insist on humans maintaining ‘meaningful human control’ as a way of engendering confidence and trust.
“This is no longer an adequate response when considering both the ubiquity and rapid advances of AI and related underpinning technologies.
“AI will play an essential, growing role in a broad range of command and control (C2) activities across the whole spectrum of operations.”
Making lightning-fast decisions is part of a military commander's job, and using AI at such crucial moments would help rather than hinder the individual, the authors said.
The report also suggested that alternative models of command and control could be explored if AI is widely adopted within a military.
The report touched on the instinctive distrust humans have of machine-generated intelligence, arguing that “a careful balance” between trust in and control over AI operations is needed for the technology to be used successfully.
“Under-trusting can be as risky or counterproductive as over-trusting,” the report said. “In fact, just as absolute control is rare, so is absolute trust.”
Co-author Christina Balis, QinetiQ campaign director for training and mission rehearsal, said: “The growing military use of AI for operations and missions support will transform the character of warfare.
“This is not just a question of adapting our armed forces’ tactics; we need to fundamentally rethink the role of humans in future military decision-making across the spectrum of ‘operate’ and ‘war fight’ and reform the institutions and teams within which they operate. It requires that we rethink the notion of trust in human-machine decision-making”.