Social media sites not properly blocking self-harm content

UK charity the Samaritans urges platforms to do more to protect users rather than wait to be regulated


Social media sites are not properly preventing vulnerable users, particularly young people, from seeing self-harm content that can lead them to hurt themselves more seriously, a UK study has found.

The research, carried out by the Samaritans charity and Swansea University, found that 83 per cent of those asked had been recommended self-harm websites without searching for them.

And 76 per cent of those who had seen self-harm or suicide content said they went on to hurt themselves more severely because of it.

“We would never stand for people pushing this kind of material uninvited through our letterbox, so why should we accept it happening online?” Samaritans' chief executive Julie Bentley said.

“Social media sites are simply not doing enough to protect people from seeing clearly harmful content and they need to take it more seriously.

“People are not in control of what they want to see because sites aren't making changes to stop this content being pushed to them and that is dangerous.

“Sites need to put in more controls, as well as better signposting and improved age restrictions.”

The Samaritans are pushing for the UK's Online Safety Bill to become law, which would reduce access to harmful content across all sites and ensure it is tackled for both children and adults.

“The internet moves much quicker than any legislation so platforms shouldn't wait for this to become law before making vital changes that could save lives,” Ms Bentley said.

An open letter from 14 healthcare charities to the government in October called for quick action on the bill.

“While the internet can be an invaluable resource for individuals experiencing feelings of self-harm and suicide, online content can also act to encourage, maintain or exacerbate self-harm and suicidal behaviours,” it said.

“Although suicide and self-harm are complex and rarely caused by one thing, in many cases the internet is involved.”

The study found that three quarters of those who took part had seen self-harm content online for the first time aged 14 or younger.

The vast majority of those asked, 88 per cent, said they wanted more control over filtering the content they see on social media, and 83 per cent believed that more specific content warnings, using terms such as self-harm or suicide, would be helpful.

The charity urged platforms to do more now to protect users rather than waiting for regulation to be forced upon them.

Prof Ann John of Swansea University, co-lead on the study, said more research on the subject was needed, but that such content was clearly damaging to many people.

“While our study cannot claim to represent the whole population's experience of this content since only those interested would have responded to our requests, many of the themes point clearly to ways social media platforms can improve,” she said.

“People want more control over the content they view, ways to ensure children meet age requirements and co-produced safety features and policies. That all seems very doable.”

Instagram and Pinterest

In October, an inquest into the November 2017 death of Molly Russell found social media content contributed “more than minimally” to her death.

Her father Ian has called for urgent changes to make children safer online.

“It's time to protect our innocent young people instead of allowing [social media] platforms to prioritise their profits by monetising the misery of children,” Mr Russell said.

The inquest heard that Instagram and Pinterest used algorithms that resulted in Molly having “binge periods” of material, some of which was selected and provided without her having requested it.

Coroner Andrew Walker told North London Coroner's Court: “She died from an act of self-harm while suffering from depression and the negative effects of online content.”

In the year to July 2021, Childline delivered more than 73,000 counselling sessions about mental health and 24,200 about suicidal thoughts or feelings.

Researchers said the Swansea study used a social media campaign to encourage people to take an online survey on the issue, and that this may have affected the results, as people with experience of self-harm and suicide were more likely to take part.

They said the study still highlighted how damaging such content can be, particularly to vulnerable young people.
