British Online Safety Act Takes Effect
Ofcom, the regulator enforcing the UK's internet safety law, has published the first set of new UK online safety rules legally requiring social media and other sites to take action against illegal content. Online platforms operating in the UK must review whether their services expose users to illegal material by 16th March 2025 or face financial penalties as the UK's Online Safety Act (OSA) begins to take effect.
The regulator has said platforms now have three months to assess the risk of their users encountering illegal content and to implement safety measures to mitigate those risks, or face enforcement action once their new duties come into force.
Failure to comply could result in a financial penalty of up to 10% of global turnover.
Child safety measures required by Ofcom's codes include ensuring that social media platforms stop suggesting children's accounts as potential connections, and warning children about the risks of sharing personal information.
Certain platforms must also use a technology called hash-matching to detect child sexual abuse material (CSAM), a requirement that now extends to smaller file hosting and storage sites. Hash matching is where media is given a unique digital signature which can be checked against the hashes of known content - in this case, databases of known CSAM.
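As a rough illustration of the principle, the sketch below checks an uploaded file's hash against a set of known hashes. Note the caveats: production CSAM-detection systems use perceptual hashes (such as Microsoft's PhotoDNA) that tolerate resizing and re-encoding, and the hash databases come from bodies like the IWF or NCMEC; this example uses a plain cryptographic hash and placeholder data purely to show the matching step.

```python
import hashlib

# Hypothetical database of signatures of known prohibited content.
# In reality this would be supplied by an organisation such as the IWF;
# here it is seeded with hashes of harmless placeholder bytes.
KNOWN_HASHES = {
    hashlib.sha256(b"example-known-content-1").hexdigest(),
    hashlib.sha256(b"example-known-content-2").hexdigest(),
}

def file_signature(data: bytes) -> str:
    """Compute the file's digital signature (SHA-256 for this sketch)."""
    return hashlib.sha256(data).hexdigest()

def matches_known_content(data: bytes) -> bool:
    """True if the upload's signature appears in the known-content database."""
    return file_signature(data) in KNOWN_HASHES
```

Because a cryptographic hash changes completely if even one byte differs, real deployments rely on perceptual hashing so that cropped or re-compressed copies of known material still match.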
Many large technology firms have already introduced safety measures for teenage users and given parents more control over their children's social media use, in a bid to tackle dangers for teens and pre-empt regulation. Currently, Facebook, Instagram and Snapchat users under the age of 18 cannot be discovered in search or messaged by accounts they do not follow.
In October 2024, Instagram also began blocking some screenshots in messages to try to combat sextortion attempts, which experts have warned are on the rise, often targeting young men.
Concerns have been raised throughout the OSA's journey over its rules applying to a huge number of varied online services, with campaigners also frequently warning about the privacy implications of platform age verification requirements. Parents of children who died after exposure to illegal or harmful content have criticised Ofcom for not doing more.
Social media platforms are being told they must have measures in place to prevent users from accessing outlawed material by 16th March.
However, this contrasts with the relaxation of content moderation on Meta's platforms in the US - Facebook and Instagram - along with Elon Musk's X, announced only days before President-elect Trump's inauguration on 20th January.
Ofcom | Gov.UK | BBC | Irish News | Guardian | GNC |
Image: Beytullah ÇİTLİK
You Might Also Read:
Teach Your Children About Safer Cyber Security:
If you like this website and use the comprehensive 7,000-plus service supplier Directory, you can get unrestricted access, including the exclusive in-depth Directors Report series, by signing up for a Premium Subscription.
- Individual £5 per month or £50 per year. Sign Up
- Multi-User, Corporate & Library Accounts Available on Request
- Inquiries: Contact Cyber Security Intelligence
Cyber Security Intelligence: Captured Organised & Accessible