Australia was the first country in the world to outright ban under-16s from social media. Reuters


The case against banning under-16s from social media



January 23, 2026

An unprovoked and horrific stabbing attack that killed a woman on a train. The last breath of a 13-year-old boy shot by a police officer who chased him down. A report on a powerful psychedelic drug that covers its lauded effects, but not the risks of serious psychiatric harm.

These are just a few of the social media videos that stopped my scrolling in its tracks and compelled me to reach for the "report" button. Every report was rejected; the content was deemed to fall within the platform's community guidelines.

I am not a Luddite, nor am I overly uptight about social media content. I am among the strongest advocates for the power of those platforms. I made it my job to increase news consumption using them, steered by my belief in their ability to create cultural connections, promote diversity, give a voice to the unheard and foster shared understanding between generations.

Social media has revolutionised storytelling, creativity and how we share the depth of humanity. But the beautiful parts of the internet are often unfairly muted by the louder, brasher, dangerous side that can expose society’s most vulnerable and developing minds to the very worst of us all.

Social media has revolutionised storytelling, but has also exposed the most vulnerable in society to the worst of us all. Getty Images

Children are growing up as witnesses to war crimes and depravity, and as targets of serious and relentless bullying. They are exposed repeatedly to explicit risks to their physical and mental well-being.

We are no longer in the heady early days of connectivity. We have decades of consequences from which to learn. Yet increasingly, the moderation, the self-restraint and the respect for ethical boundaries are slipping away. When did it become acceptable for us to watch, share and like someone’s moment of death? To desensitise ourselves by endless exposure to hate and hurt? To doomscroll to despair?

But despite all this, I am against the idea – and the growing reality – of blocking young teenagers from these platforms altogether.

Australia sets a precedent

Australia’s ban on under-16s holding social media accounts came into effect in December, sparking discussions across governments, as well as health care, media and technology companies worldwide. Everyone is watching closely, trying to understand the potential impact of this world-first legislation and prepare for its ripple effects.

The UK looks set to follow suit, launching a consultation after prominent voices from the country's Labour and Conservative parties called for a ban. France, which already requires parental consent for under-15s, is now considering an outright ban on that age group, as well as a digital curfew for under-18s. Across several EU nations and many individual US states, politicians are looking at age restrictions – and many of these proposals have bipartisan support.

Am I a lone voice against the majority? Not quite.

Campaigners for restrictions reject bans

UK opposition leader Kemi Badenoch has announced her Conservative party would ban social media for under-16s. Getty Images

Children’s charities and campaigners in the UK issued a joint statement branding social media bans "the wrong solution". The letter, signed by 42 online safety organisations, foundations and bereaved families, says “banning children from social media risks an array of unintended consequences”.

The most poignant signatories are the parents and siblings of young people who lost their lives as a result of online harm.

Molly Rose’s father, Kady’s parents, Aimee’s sister – whether you recognise the names or not, the list hits you in the heart. These children were exposed to danger or unimaginable pain in the most seemingly innocuous ways, in front of their families, in plain sight and in the palm of their hands.

Their grieving relatives have spent years fighting for stronger safety measures and for tech companies to be held accountable – and yet those same relatives say a resounding no to the suggestion of an outright ban. They say this “sledgehammer approach” could create a false sense of safety and that the risks would simply migrate elsewhere online, creating a cliff edge into vulnerability at the age of 16.

Meta urges a rethink

Just how many people will this affect? At the time of writing, almost five million accounts were removed from platforms impacted by the Australian ban – Facebook, Threads, Instagram, X, Reddit, TikTok, Snapchat, YouTube, Twitch and Kick.

After announcing that it deleted more than half a million accounts in a single week, Meta, the parent company of Instagram, Threads and Facebook, called on the Australian authorities to “find a better way forward”.

Almost five million accounts were removed from social media platforms under the Australian ban. Getty Images

Its suggestion? “Incentivising all of industry to raise the standard in providing safe, privacy-preserving, age-appropriate experiences online instead of blanket bans.”

This is the same company that, almost exactly a year earlier, announced it was ending its professional third-party fact-checking system in the US and moving to community moderation, like that used on Elon Musk’s X. Meta also loosened some of its moderation policies in the name of free speech.

These moves prompted its own Oversight Board to issue a statement warning that the changes could allow harmful content to proliferate, calling on the company to identify and address “adverse impacts on human rights that may result from them”.

Is that the pot calling the kettle black? Or an admission that self-governance and public moderation have failed? Probably both.

Mass reach is an unregulated free-for-all

Before I pivoted to digital news production and audience strategy, I had a decade of making news for TV. In those days of making content on tape, the fundamentals were always clear. The audience is front and centre of everything, and the people whose stories and lives are put on show should have every justification for being there. Back in those days, few people had the power of mass reach and with it came a tremendous responsibility. From swearing on screen to showing a needle piercing skin, such relatively tame moments were either completely forbidden or carefully considered to protect children and those vulnerable to triggers.

Before we could all simply broadcast from our phones, the few who could reach hundreds of thousands were extensively trained and well-versed in regulations and ethical industry standards. And they never did it alone – there were layers of checks and oversight.

Now, it’s a free-for-all, and we are all suffering.

Bring back the professionals

But that’s not quite the full picture, is it? There are plenty of brands still following strict checks and balances, still running content through several experienced professionals and lawyers before hitting send, but these brands are being pushed aside while creators are amplified. Reach is optimised for engagement, not responsibility.

We now have a reverse of the traditional broadcast model. So here is my pitch for its modern revival:

  • Train and certify high-reach creators

Reach is limited to a creator's own followers aged over 18, unless they are certified in the relevant media law, child protection and platform guidelines

  • Digital ‘watershed’ spaces

Kids Netflix, Kids YouTube and Instagram Teen accounts – increase and strengthen these access models that help parents create balanced, safe spaces on popular sites, while allowing a level of autonomy and privacy

  • User control of algorithms

Expand control of discovery algorithms, time caps and filters – and teach parents how to shape their child’s feeds to their individual needs and triggers. Under UAE law, parents have a legal obligation to monitor their children's digital use

  • International cross-platform standards

A global framework agreed on by governments and all platforms to guide legal and ethical standards, and a consensus on the consequences

  • Protect space for industry-regulated publishers

Whether it is news, health authorities or education institutions, make space again for legacy brands that adhere to codes of practice and create content in the public interest – and let them link out for context

  • Professional oversight and age verification

Mandate accurate, privacy-preserving age verification and reinstate fact-checkers and third-party moderation systems where evidence shows they reduce harm

Safer for children is better for everyone

Protecting children and vulnerable users in these ways would have a ripple effect on all of society. It would raise standards for everyone. It would improve mental health and social relations, reduce mindless, addictive scrolling, and maintain cross-generational spaces that parents know how to control.

MySpace came to prominence at a time when we did not fully understand digital dangers. Bloomberg

I am old enough to remember AOL chat rooms, MySpace and the early days of Facebook, when we genuinely did not yet understand the immediate dangers and long-term social consequences of these new digital spheres. That was 20 or 30 years ago.

We do not have that excuse any more.

Banning young people from social media does not fix the systems that cause harm. It simply delays their exposure to environments that we have collectively failed to make safe. The platforms are an undeniable part of modern life; they have been enduringly popular for a reason. But the difficult task now is finding a meaningful and enforceable way forward rather than blindly following an engagement metric to keep thumbs scrolling.
