Tech firms told to hide ‘toxic’ content from children
Ofcom has warned social media sites they could be named and shamed – and banned for under-18s – if they fail to comply with new online safety rules.
The media regulator has published draft codes of practice which require tech firms to have more robust age-checking measures, and to reformulate their algorithms to steer children away from what it called “toxic” material.
But parents of children who died after exposure to harmful online content have described the proposed rules as “insufficient” – one told the BBC that change was happening “at a snail’s pace”.
In statements, Meta and Snapchat said they had extra protections for under-18s, and offered parental tools to control what children can see on their platforms.
It is Ofcom’s job to enforce the new, stricter rules introduced by the Online Safety Act; the draft codes set out what tech firms must do to comply with that law.
Ofcom says they contain more than 40 “practical measures.”
The centrepiece is the requirement around algorithms, which are used to decide what is shown in people’s social media feeds.
Ofcom says tech firms will need to configure their algorithms to filter out the most harmful content from children’s feeds, and reduce the visibility and prominence of other harmful content.
Other proposed measures include requiring companies to carry out more rigorous age checks if they show harmful content, and to implement stronger content moderation, including a so-called “safe search” function on search engines that restricts inappropriate material.