EU parliament passes strict rules for removal of online terror content

Seventy MEPs abstained from the vote


The European Parliament has voted to compel social media companies to remove terrorist content within an hour of being ordered to do so.

The bill, which passed on Wednesday evening, requires social media companies to remove or block access to the content for EU users.

Companies that fail to remove content in time could be fined up to four per cent of their global turnover.

The new legislation defines terrorist content as that which “incites or solicits the commission or contribution to the commission of terrorist offences, provides instructions for the commission of such offences or solicits the participation in activities of a terrorist group”. Additionally, content providing guidance on how to make and use explosives, firearms and other weapons for terrorist purposes will be banned.

Smaller social media firms will be given assistance in dealing with removal requests. Authorities must provide them with information on procedures and deadlines at least 12 hours before issuing the first order to remove content they are hosting.

The bill passed by 308 votes to 204, with 70 abstentions. The large number of abstentions reflects the wrangling among lawmakers over the controversial bill, which walks a fine line between protecting freedom of speech and acknowledging that such materials can form part of legitimate academic and journalistic work.

"There is clearly a problem with terrorist material circulating unchecked on the internet for too long. This propaganda can be linked to actual terrorist incidents and national authorities must be able to act decisively,” said MEP Daniel Dalton.

“Any new legislation must be practical and proportionate if we are to safeguard free speech. Without a fair process, there is a risk that too much content would be removed, as businesses would understandably take a ‘safety first’ approach to defend themselves. It also absolutely cannot lead to a general monitoring of content by the back door."

The Counter Extremism Project welcomed the shift from self-policing to regulation, praising the bill’s “tangible punishments” for those who flout the rules.

“In financial punishment for the removal of harmful terrorist propaganda, these service providers can be held accountable for their role in the spread of terror content,” said CEP Executive Director David Ibsen.

“Such regulation represents a positive start in the fight against digital radicalisation, propaganda, and recruitment, which have long played havoc with Europe’s digital users.”