Britain does not understand its AI 'allies or enemies'

Obsessive secrecy among security officials also hampers ability to develop dual-use technology, conference told

Britain is failing to fully understand the AI progress being made by its allies and adversaries, a leading UK scientist has said. Reuters

Britain has a poor understanding of the artificial intelligence capabilities of its allies and its enemies, an influential think tank has been told.

State secrecy over AI capabilities was also hampering the ability of scientists without the right security clearance to use the technology, a Policy Exchange conference heard.

While other countries such as the US, the UAE and China were developing AI at high speed, Britain was lacking a “critical thing” which was “high-quality scientific intelligence”, said Sir Anthony Finkelstein, the government’s former chief scientific adviser for national security.

“Our understanding of our competitors and adversaries’ science and technology base is inadequate,” he told the debate on Artificial Intelligence, Technologies and National Security. “Our existing mechanisms are not fit for purpose.”

Unlike other countries, Britain was not “scaled” to seize the opportunities offered by AI, and this needed to be addressed as “an immediate action”.

This was being further hampered by the obsessive secrecy surrounding certain high-level projects that denied developers access to top-grade information, said David Willetts, the former UK government minister for science.

These were “tricky issues” that needed to be resolved, foremost of which was getting experts from defence and security together with civilians in meetings.

But this was being impeded because many of the gatherings were open only to people with “relatively high security classifications”, and a number of civilian scientists had been told they could not attend.

“This is a dual-use technology but they are not allowed to find out exactly what the dual use is,” said Mr Willetts.

“One of the urgent programmes under way at the moment is that we need to get greater security clearance for more of the civil experts so at least they can participate in these discussions,” he said. “And not just be presented with arguments, which basically say, ‘if you knew what we knew you would understand why this is a problem’ which is so frustrating and is a way of stopping rational discussion linking defence and civilians.”

He added that a number of AI research and development projects were also being held back by people demanding extra requirements and analysis.

“You'd be surprised at the number of what were previously simple commercial public R&D spending decisions that now get caught up in an extra six months of analysis,” he said.

“Which is not always relevant if you're just trying to ensure the investment comes to Britain instead of going to France or Israel. So the requirements thinking can slow things up,” he said.

Updated: January 09, 2024, 8:32 AM