Online safety laws enforced

Ofcom enforces Online Safety Act, requiring tech firms to remove illegal content or face heavy penalties
From today, Monday 17 March, Ofcom begins enforcing the Online Safety Act, requiring social media companies to identify and remove illegal content such as child sexual abuse material.
Mark Jones, Partner at Payne Hicks Beach, said today is the deadline for tech companies to complete their illegal harms risk assessments, which evaluate the risks that illegal harms pose to users of their services. He noted that Ofcom has provided guidance, including risk profiles, to aid these assessments, but warned that completing them is not enough: tech firms must also set out how they will tackle illegal harms and proactively remove such content.
Jones highlighted that non-compliance could result in Ofcom imposing fines of up to £18 million or 10% of a company’s worldwide revenue, whichever is greater. In serious cases, Ofcom can even apply for a court order to block a site in the UK. He added that Ofcom has indicated it will swiftly act against non-compliant firms, with enforcement depending on the severity of any breaches.
The Online Safety Act marks a shift from a reactive to a proactive approach in content moderation, requiring tech firms to demonstrate accountability. Jones explained that compliance measures must be proportionate to each company, considering service type, user base, and risk assessment findings. However, all services must name an individual responsible for online safety and ensure transparency in their terms of service.
Protecting children is a key focus, with platforms required to prevent children's profiles, locations, and connections from being visible to unverified users. In addition, users who are not connected to a child should not be able to send them direct messages. On safeguarding women and girls, platforms must allow users to block and mute harassers, and sites must remove non-consensual intimate images as soon as they become aware of them.
Jones further stressed that platforms should have systems to identify and assess potentially illegal content, enabling swift removal when necessary.
Tech firms to tackle illegal content – expert sees no engagement
Technology firms must act against illegal content under the new rules, but concerns remain about weak enforcement.
Iona Silverman, an Intellectual Property & Media expert at Freeths in London, said that while Ofcom is the regulator, the primary responsibility for compliance rests with online platforms. She emphasised that companies must prevent children from accessing harmful content and improve transparency and control for users. She noted that Ofcom's power to impose fines of up to £18 million or 10% of global turnover should have driven social media companies into action, but said she has seen no evidence of meaningful engagement or compliance efforts.
Instead, she pointed out that Meta announced in January it was removing third-party fact-checking in favour of a community notes model. She cited Mark Zuckerberg’s admission that changes to Meta’s content filtering would mean “we’re going to catch less bad stuff” as evidence of a concerning trend. Meta justified the changes in the name of free speech, but Silverman dismissed the argument, saying JD Vance’s claim that free speech in the UK is under threat is based on a personal political agenda.
She backed the UK government’s position that the Online Safety Act is about tackling criminality rather than censoring debate. Given online platforms' past behaviour, she urged Ofcom to take a tough stance by critically reviewing content and issuing substantial fines to platforms that fail to take the necessary steps to keep people safe online.