Instagram vows to take down images related to self-harm and suicide

Photo-sharing social media network has come under increasing pressure since a 14-year-old British girl killed herself

After Molly Russell's death, her family found material relating to depression and suicide on her Instagram account

Instagram, the social media network which has come under fire in recent weeks over the death of a 14-year-old British girl, has vowed to remove all graphic images relating to self-harm and suicide on the app.

Adam Mosseri, Instagram’s head of product, said in a statement: “Nothing is more important to me than the safety of the people who use Instagram. Over the past month we have seen that we are not where we need to be on self-harm and suicide, and that we need to do more to keep the most vulnerable people who use Instagram safe.

“I have a responsibility to get this right. We will get better and we are committed to finding and removing this content at scale, and working with experts and the wider industry to find ways to support people when they’re most in need.”

Molly Russell killed herself in 2017. After her death her family found material relating to depression and suicide on her Instagram account. Her father had previously said Instagram should be held responsible.

Other related content deemed non-graphic, such as images of healed scars, will not be taken down entirely because Instagram said it did not want to “stigmatise and isolate” those who could be in distress or crying out for help. Such images will, however, not appear in searches or hashtags, nor will they be promoted.

Ian Russell, Molly’s father, said: “I welcome the commitment made today by Adam Mosseri to ban all graphic self-harm content from Instagram.

“I also welcome its plans to change its search mechanisms in relation to self-harm and suicide related content and to increase the help and support it provides to its users.”

“It is now time for other social media platforms to take action to recognise the responsibility they too have to their users if the internet is to become a safe place for young and vulnerable people,” Mr Russell added.

Instagram has been accused of a slow and lacklustre response to the issue.

"It was overwhelming. It's the kind of thing that hits you in the chest and sticks with you," Mr Mosseri told The Telegraph, in reference to the suicide of Molly Russell.

Health secretary Matt Hancock had earlier on Thursday convened a meeting with Mr Mosseri, representatives of other social media firms including Facebook, Snapchat and Twitter, search engine giant Google, and the Samaritans charity to discuss content on suicide and self-harm.

Mr Hancock said that Instagram’s move was “an important change: there’s lots more things that we need to see, like transparency over how much damaging material there is and also I want to see when people search for material about suicide and self-harm, that they are directed better to help, to the Samaritans, to the NSPCC.”

“We’ve seen progress [today], the discussions were productive, and the willingness to try and solve this problem,” Mr Hancock said.

“We’ve got to be led by what the clinicians, what the experts say needs to be taken down and what’s the appropriate way to do that.”

“We’re pushing for a duty of care to the users of social media, particularly to children, and that duty is something we are looking at in a white paper that will be published by the government.”

“I care deeply about getting this right and I feel the concern that any parent feels in this day and age with children using social media, and that children are safe,” Mr Hancock added.

The NSPCC, which works to prevent cruelty to children, said the changes were “an important step” but warned that social networks were still failing in their duty to tackle self-harm.

The charity’s chief executive Peter Wanless said: “It should never have taken the death of Molly Russell for Instagram to act.

“Over the last decade social networks have proven over and over that they won’t do enough to design essential protections into their services against online harms including grooming and abuse,” he said.

“We cannot wait until the next tragedy strikes.”