UK government threatens to tax tech giants over extremism

Ben Wallace, UK security minister, said the government was spending millions on counter-terror efforts because internet companies were failing to remove extremist content quickly enough

The UK government is considering taxing technology companies that fail to cooperate with it on removing extremist content online quickly enough

Britain’s security minister has threatened technology firms such as Facebook, YouTube and Google with punitive taxation if they fail to cooperate with the government on fighting online extremism.

Ben Wallace said that Britain was spending hundreds of millions of pounds on human surveillance and de-radicalisation programmes because tech giants were failing to remove extremist content online quickly enough.

Mr Wallace said the companies were “ruthless profiteers”, despite sitting “on beanbags in T-shirts”, who sold details of their users to loan companies but would not give the same information to the government.

He said that tax measures should be considered as an option to punish firms, such as Facebook and its messaging service WhatsApp, that do not work hard enough to remove radical content.

“Because of encryption and because of radicalisation, the cost of that is heaped on law enforcement agencies,” Mr Wallace told the Sunday Times. “I have to have more human surveillance. It’s costing hundreds of millions of pounds.

“If they [tech firms] continue to be less than co-operative, we should look at things like tax as a way of incentivising them or compensating for their inaction.

“Because content is not taken down as quickly as they could do, we’re having to de-radicalise people who have been radicalised. That’s costing millions. They [the firms] can’t get away with that and we should look at all options, including tax.”

Facebook's policy director Simon Milner said the security minister was "wrong" to say that the company puts profit before safety.

"We’ve invested millions of pounds in people and technology to identify and remove terrorist content. The Home Secretary and her counterparts across Europe have welcomed our coordinated efforts which are having a significant impact," Mr Milner said in a statement.

"But this is an ongoing battle and we must continue to fight it together, indeed our CEO recently told our investors that in 2018 we will continue to put the safety of our community before profits.”

In response to Mr Wallace's comments, a YouTube spokesperson said: “Violent extremism is a complex problem and addressing it is a critical challenge for us all. We are committed to being part of the solution and we are doing more every day to tackle these issues.

"Over the course of 2017 we have made significant progress through investing in machine learning technology, recruiting more reviewers, building partnerships with experts and collaboration with other companies through the Global Internet Forum."

The UK government has repeatedly warned tech companies that more needs to be done to tackle online extremism.

Facebook, which owns WhatsApp, and YouTube joined Microsoft and Twitter to form the Global Internet Forum to Counter Terrorism in June. Ritchie B Tongo/ EPA

In August, Amber Rudd, the UK’s home secretary, visited Silicon Valley to impress on internet companies the need to act more quickly, while in a speech in September, prime minister Theresa May said terrorist content should be removed from the web within two hours.

Earlier this month, Google-owned YouTube said it would increase the number of teams identifying and removing unsuitable footage from its channels in 2018.

YouTube said that nearly 70 per cent of violent extremist content was removed within eight hours of being uploaded, and that the company was working to shorten that time.

However, research by the UK government suggests that three quarters of ISIL propaganda is viewed within three hours of being uploaded to online platforms, reaching its target audience long before the authorities can react.

In June, Facebook, YouTube, Microsoft and Twitter formed the Global Internet Forum to Counter Terrorism aimed at cooperating to deal with the spread of online radicalisation.

Pressure on tech companies from the UK government has increased in 2017, after five terror attacks in London and Manchester left 36 people dead and hundreds more injured.