Ofcom has warned social media sites they could be banned for under-18s if they fail to comply with new online safety rules.
The media regulator has published draft codes of practice which require tech firms to have more robust age-checking measures, and to reformulate their algorithms to steer children away from material they should not see.
But parents of children who died after exposure to harmful online content have described Ofcom’s new rules as “insufficient” – one told the BBC change was happening “at a snail’s pace.”
In statements, Meta and Snapchat said they had extra protections for under-18s, and offered parental tools to control what children can see on their platforms.
Other firms have not responded to a BBC request for comment.
Ofcom boss Dame Melanie Dawes said any company that broke the draft codes of practice would be “named and shamed”, and she made clear tougher action such as banning social media sites for children would also be considered.
Speaking to BBC Breakfast, Esther Ghey – whose daughter Brianna was murdered, aged 16, by two teenagers in February 2023 – said she believed Ofcom “really did care” about trying to get regulation right.
But she said the full extent of the problem remained unknown.
“I wonder how many children are actually struggling with their mental health, how many of them have been affected by self-harm that we don’t actually know about,” she said.
Lisa Kenevan, whose son Isaac died aged 13 after taking part in a “black out” challenge online, said the pace of change was not fast enough.
“The sad thing is the snail’s pace that is happening with Ofcom and social media platforms taking responsibility, the reality is there’s going to be more cases,” she told BBC Breakfast.
It is Ofcom’s job to enforce new, stricter rules following the introduction of the Online Safety Act – these codes set out what tech firms must do to comply with that law.
Ofcom says they contain more than 40 “practical measures.”
The centrepiece is the requirement around algorithms, which are used to decide what is shown in people’s social media feeds.
Ofcom says tech firms will need to configure their algorithms to filter out the most harmful content from children’s feeds, and reduce the visibility and prominence of other harmful content.
Other proposed measures include forcing companies to perform more rigorous age checks if they show harmful content, and making them implement stronger content moderation, including a so-called “safe search” function on search engines.
Speaking to BBC Radio 4’s Today programme, Dame Melanie described the new rules as “a big moment”.
“Young people are fed harmful content on their feed again and again and this has become normalised but it needs to change,” she said.
According to Ofcom’s timeline, these new measures will come into force in the second half of 2025.
Dame Melanie added: “We will be publishing league tables so that the public know which companies are implementing the changes and which ones are not.”
Dame Melanie met Ms Ghey and Ian Russell, whose daughter Molly took her own life in 2017 at the age of 14.
In 2022, a coroner concluded she died from an act of self-harm while suffering depression and the negative effects of online content.
They are part of a group of bereaved parents who have signed an open letter to Prime Minister Rishi Sunak and leader of the opposition Sir Keir Starmer.
In it, they implore the politicians to do more for the online safety of children – including making “a commitment to strengthen the Online Safety Act in the first half of the next parliament.”
They also ask for mental health and suicide prevention to be added to the school curriculum.
“While we will study Ofcom’s latest proposals carefully, we have so far been disappointed by their lack of ambition,” they add in the letter.
‘Step up’
The government insists the measures announced by Ofcom “will bring in a fundamental change in how children in the UK experience the online world.”
Technology Secretary Michelle Donelan urged big tech to take the codes seriously.
“To platforms, my message is engage with us and prepare,” she said.
“Do not wait for enforcement and hefty fines – step up to meet your responsibilities and act now.”
Most of the tech companies contacted by the BBC did not reply or declined to comment on the record.
A Snapchat spokesperson said: “As a platform popular with young people, we know we have additional responsibilities to create a safe and positive experience.
“We support the aims of the Online Safety Act and work with experts to inform our approach to safety on Snapchat.”
And a Meta spokesperson said the firm wanted young people “to connect with others in an environment where they feel safe”.
“Content that incites violence, encourages suicide, self-injury or eating disorders breaks our rules and we remove that content when we find it,” they said.
Source: bbc.co.uk