LONDON: As the Online Safety Act (OSA) begins to take effect in the UK, online platforms have until March 16, 2025, to assess whether their services expose users to illegal material, or face penalties.
Ofcom, the regulator enforcing internet safety law in the country, on Monday published its final codes of practice on how firms should deal with illegal online content.
Platforms have three months to carry out risk assessments to identify potential harms on their services. Failure to do so could result in fines of up to 10% of their global turnover.
Ofcom head Dame Melanie Dawes told BBC News this was the “last chance” for social media to make changes.
“If they don’t start to seriously change the way they operate their services, then I think those demands for things like bans for children on social media are going to get more and more vigorous,” she said.
“I’m asking the industry now to get moving, and if they don’t they will be hearing from us with enforcement action from March.”
The Molly Rose Foundation said the OSA has “deep structural issues”. Andy Burrows, its chief executive, said the organisation was “astonished and disappointed” that Ofcom’s guidance lacked specific, targeted measures for how platforms should deal with suicide and self-harm material.
“Robust regulation remains the best way to tackle illegal content, but it simply isn’t acceptable for the regulator to take a gradualist approach to immediate threats to life,” he said.
Children’s charity the NSPCC has also voiced concerns.
The OSA became law in October 2023, following years of wrangling by politicians and campaigning by people concerned over the impact of social media on young people.
The new measures include child safety features, such as preventing social media platforms from suggesting children’s accounts as friends to other users and providing warnings about the risks of sharing personal information.
Several major tech companies have already introduced safety measures for teenage users, along with controls that allow parents to monitor their children’s social media activity, both to address risks to young users and to pre-empt potential regulation.
For example, on platforms like Facebook, Instagram, and Snapchat, users under 18 cannot be found in searches or receive messages from accounts they do not follow.
Technology Secretary Peter Kyle described the publication of Ofcom’s new codes as a “significant step” toward the government’s goal of making the internet safer for people in the UK.
“These laws mark a fundamental reset in society’s expectations of technology companies,” he said.