Social Media: Ban or Regulate?
09 March 2026
“The circumstances of Molly’s death are far from unique. Technology plays a role in almost one-quarter (24%) of deaths by suicide among young people aged 10 to 19, equivalent to a lost young life every week.” ‘Pervasive by Design’: The Molly Rose Foundation.
Last Thursday Channel 4 screened the documentary film ‘Molly Vs The Machines’, the tragic story of Molly Russell, who took her own life in 2017 aged 14. When the Coroner’s Court finally heard the case in 2022 (after five years of obfuscation from the platforms), the conclusion reached was: “Molly died from an act of self-harm while suffering from depression and the negative effects of online content.”
Here’s a Cog Blog post from the time of that 2022 verdict.
The film is shocking, of course. Molly’s dad, Ian Russell, is remarkable: his composure, his determination to make sure Molly is not forgotten, his polite bloody-mindedness in the face of corporate non-answers, not to mention appalling tragedy.
The contributions from various META executives are dreadful, not just in allowing the content Molly saw to be there in the first place but in their subsequent justifications. The evidence given to the inquest by their Head of Health and Wellbeing Policy, Elizabeth Lagone, is chilling: she described the content seen by Molly as ‘safe’ because it adhered to content guidelines in place at the time.
Ms Lagone subsequently apologised. She has since left META, like others featured: Nick Clegg, Sheryl Sandberg, Steve Hatch and the whistleblower Arturo Bejar.
The Molly film should be an inflection point, informing the debate around how or if we should regulate / control social media.
One suggestion is to ban access for under-16s. This is the expedient solution as far as politicians are concerned. It is a knee-jerk reaction (‘look, we’ve done something’). It’s easily understood.
I don’t believe this will work.
First, a ban creates a barrier between parent and child. If the child is doing something literally illegal (because of course young people will find a way of accessing social media, ban or not), they are less likely to open up to their parents.
Second, it lets the platforms off the hook. If under 16s aren’t allowed access, it’s easy to ignore the underlying issue. Youngsters exposed to harmful material? Not our problem, they’re on our site illegally.
The answer is to strengthen regulation. We have OFCOM and the Online Safety Act. Neither works well, but it must be easier to strengthen what we have than to start down a different path.
Communication channels are full of advice on what to do if you wish to complain or take issue with something in print or broadcast.
The social media channels should be pressured to include a facility for anyone exposed to unacceptable content to complain about it, anonymously if preferred, to an independent authority with the teeth to fine them (or worse).
Use users to police the system, to alert the regulator to misdemeanours.
There will be howls about this. It’s impractical. It’s open to abuse. There are too many users. Complaints will need checking. It will be too expensive.
I don’t agree. Consider these businesses’ vast profits, and the lack of tax paid. Why not tax them, or fine them, to fund such a service?
Then there’s the ad industry. Although META is c.95% funded by advertising, bringing meaningful pressure to bear from such a vast advertiser base is always said to be impossible.
That’s a conclusion based on too literal an assessment.
The only way to hold the social media channels to account is via Government. Any individual large advertiser may be of minor significance in revenue terms, but large advertisers matter as influencers. They have access to policy makers.
One expert group that should be leading the charge to improve social media, our largest marketing services organisations, is compromised by its relationships with the platforms. There are a few brave individuals, but corporately they’re unlikely to speak out given their dependence on the platforms.
CMOs will stay quiet too. Decisions on where to spend their companies’ money over multiple years could justifiably be questioned – not just morally, not just in terms of supporting criminal ad fraud, but in terms of reducing their brands’ values.
There’s an irony in quietly continuing to wave through money spent on channels that cause young people harm, whilst earnestly debating whether advertising around news is somehow dangerous.
I may be over-optimistic but my sense is that the dirty water is circling the drain. The criticism of the social media channels is becoming louder; the disconnect between large companies’ ethical and moral values and the places where they choose to spend their money is becoming deeper and more apparent.
There used to be a thing in media selection known as Chairman’s Wife Syndrome. As in, I don’t care what the numbers say, the Chairman’s wife reads ‘The Daily Mail’; make sure we’re in it.
We are now approaching reverse Chairman’s Wife Syndrome. The Molly Russell legacy.

Totally agree Brian – the platforms make BILLIONS so they need to be held accountable for the content served on their platforms. A ban is token and kids will always find a way round it. Everyone is so scared of the power of the platforms – ref the tax scenario. Perhaps “Advertising who cares” can help galvanise further action?