Power to the parents: How Snapchat leads the way in protecting child users

The social media platform is giving adults the option of seeing who their children are speaking to online


In 2008, Facebook's slogan was “Facebook helps you connect and share with the people in your life”. This simplicity went to the heart of what the website was offering the world: a revolution in the way we communicate. Behind closed doors, there was another remarkable aspect of the mission – its profitability. That same year, Facebook's worth was estimated at $15bn. As of August 2022, the company's market value stood at more than $560bn.

The same is true of other sites. YouTube now brings in annual revenue of almost $30bn, Instagram $24bn. In 2021, the net worth of Facebook's chief executive, Mark Zuckerberg, was $97bn, and that of TikTok's Zhang Yiming was $35bn.

With such power and money comes great responsibility. Websites such as Facebook and Twitter have been credited with enabling seismic historical events, from political uprisings to the shaping of elections. It is impossible to predict the effect of new domains such as the virtual-reality metaverse, but they could well be epoch-defining.

Keeping pace with these developments on a governmental and personal level matters, because not all of social media's effects have been good. For no group is this more true than for children. Companies are frequently criticised for not doing enough to protect them.

There has been some progress. YouTube Kids says it "provides a more contained environment for kids to explore". Facebook is developing pop-up warnings to target users who search content "associated with child exploitation".

Snapchat, a video-sharing app with roughly 350 million daily users, introduced particularly important controls on Tuesday that let parents see who their children are talking to on the platform. The new Family Centre feature will allow them to see the accounts their teenagers have been in conversation with over the past seven days, although they will not be able to see the content. It strikes a sensible balance between respecting users' privacy and protecting their well-being, and it puts Snapchat at the forefront of platforms thinking about safety.


It is crucial that it and others continue to do so. In the very worst of cases, social media can enable terrible bullying on a mass scale, cause severe mental distress and make it easy for criminals to target children. Worse, this often happens with little parental or legal oversight. Last year, the Wall Street Journal said it had seen internal research conducted by Facebook, Instagram's owner, showing that the latter platform had made body-image issues worse for one in three teenage girls in the UK and US. There have been cases in which authorities and parents were denied access to the accounts of young people who had died.

It is right that companies respect privacy rules, but more flexibility must be built in for extreme cases. It is unacceptable that, in many ways, the advertisers from whom these companies make money know more about children's activity on social media than their parents do.

Thankfully, some companies are taking this imperative seriously. The issue must be a priority. It is a dangerous irony that the demographic most likely to use, understand and circumvent controls on social media is also the one with the most to lose in terms of safety and well-being. Parents have a crucial responsibility to protect their children in the virtual world as they do in person, and many have been ready to do so for years. Snapchat is right to do its bit to empower them. Hopefully, more firms will follow suit.

Published: August 11, 2022, 3:00 AM