UK tells social media giants to get tougher on protecting children

Online Safety Bill will check age limits but will not force big technology platforms to take down material deemed 'legal but harmful'

Molly Russell, who took her own life in November 2017 after she had been viewing material on social media linked to anxiety, depression, self-harm and suicide. PA

Social media firms face fines if they fail to block under-age users from accessing harmful content, under a revamped UK bill that also focuses on reducing sexist and racist material.

The updates will require tech firms to show how they enforce user age limits, publish summaries of risk assessments of potential harm to children on their sites and declare details of enforcement action taken against them by Ofcom — the new regulator for the sector.

However, the changes have led to accusations the plans have been thinned out.

Controversial measures that would have forced social media sites to take down material designated “legal but harmful” are to be removed.

A previous version of the bill would have forced the biggest platforms to not only remove illegal content, but also material named in the legislation as legal but potentially harmful.

These measures drew criticism from free speech campaigners, who claimed that governments or tech platforms could use the bill to censor certain content.

The amended bill requires platforms to remove illegal content and take down material in breach of their own terms of service.

Instead of the legal but harmful requirement, there will now be a greater onus on firms to provide adults with tools to hide certain content they do not wish to see. This includes such content as the glorification of eating disorders, misogyny and some other forms of abuse.

It is an approach the UK government is calling a “triple shield” of online protection, one it says also allows for freedom of speech.

Under the bill, social media companies could also be fined by Ofcom up to 10 per cent of their annual turnover if they fail to tackle racist, homophobic or other content harmful to children.

The updated rules will also prohibit a platform from removing a user or account unless they have clearly broken the site’s terms of service or the law.

The government ditched the “legal but harmful” requirement because it would have created a “quasi-legal category” that was “confusing”, Culture Secretary Michelle Donelan said.

Writing for The Telegraph, she said: “Some platforms claim they don’t allow anyone under 13 — any parent will tell you that is nonsense. Some platforms claim not to allow children, but simultaneously have adverts targeting children. The legislation now compels companies to be much clearer about how they enforce their own age limits.”

She said many tech companies may be international, but they will have to face the “ramifications” of British law if they fall foul of the new rules.

She told GB News on Tuesday: “We’re certainly not working in isolation. In fact, the rest of the world is watching and waiting for us to do this legislation.

“I’ve spoken to many ministers and counterparts across the globe who have said that they are interested in using our legislation as the blueprint for their own.

“And when it comes to these corporations, yes, they’re international, the vast majority of them have huge footprints in the UK, they hire a lot of people, but they will have to face the ramifications of British law, they will be subject to these fines if they fail the legislation.

“The writing is on the wall now.”

However, Ian Russell, whose daughter Molly killed herself after accessing social media content linked to depression and self-harm, said the plans had been “watered down” and he struggled to understand why.

Julie Bentley, chief executive of Samaritans, described dropping the “legal but harmful” requirement as “a hugely backward step”.

“Of course children should have the strongest protection but the damaging impact that this type of content has doesn’t end on your 18th birthday,” she said.

Shadow Culture Secretary Lucy Powell said it was a “major weakening” of the bill.

“Replacing the prevention of harm with an emphasis on free speech undermines the very purpose of this bill, and will embolden abusers, Covid deniers, hoaxers, who will feel encouraged to thrive online,” she said.

The latest changes come after other updates to the bill, including criminalising the encouragement of self-harm, as well as “downblousing” and the sharing of pornographic deepfakes.

The government also confirmed further amendments will be tabled shortly aimed at boosting protection for women and girls online.

An addition to the bill will see Ofcom required to consult the Victims’ Commissioner, Domestic Abuse Commissioner and Children’s Commissioner when drafting new codes of conduct for tech firms to follow.

The Online Safety Bill is due to return to Parliament next week after being repeatedly delayed.

Updated: June 20, 2023, 1:29 PM