Taming Aggressive Algorithms
Ofcom recently published more than 40 steps for tech firms to take in order to protect children online as part of its draft Children's Safety Codes of Practice.
The draft Codes come in response to the UK Online Safety Act 2023, which holds tech platforms legally responsible for keeping users, particularly children, safe online. Under the Act, social media apps, search engines and other online services have new duties to assess the risk of harm to children and take steps to address it. Ofcom’s measures set out how services should comply with these new duties.
In response to the publication of Ofcom’s new measures, Technology Secretary Michelle Donelan reiterated the Government’s goal: “when we passed the Online Safety Act last year, we went further than almost any other country in our bid to make the UK the safest place to be a child online.”
Ofcom is not the only regulator taking steps to ensure better protection for children online. Earlier this month, it teamed up with the Information Commissioner’s Office (ICO) to publish a joint statement committing the two regulators to collaborate on the regulation of online services, and the ICO has also recently published its Children’s code strategy for 2024-2025. So what steps does Ofcom recommend tech firms take under the new Codes?
Ofcom’s recommended measures for social media, search and other online services include:
Age checks: Ofcom expects greater use of “highly effective age assurance”. In practice, this means using photo-ID matching, facial age estimation or reusable digital identity services to verify a user’s age. Ofcom is clear that payment methods which do not require the user to be over 18, self-declaration of age and general contractual restrictions do not go far enough. In certain cases, tech platforms may need to prevent children from accessing the site entirely (a minimal illustrative sketch of such an age-assurance gate follows this list).
Safer and more controlled algorithms: Ofcom describes algorithms providing personalised recommendations to users as “children’s main pathway to harm online”. The draft Codes propose that tech firms alter their algorithms to filter out the most harmful content, such as content relating to suicide, self-harm, eating disorders and pornography, from children’s feeds, and to reduce the visibility of other harmful content, including violent, hateful or abusive material, online bullying and content promoting dangerous challenges (see the second sketch after this list).
More effective content moderation: User-to-user services such as social media apps must ensure that swift action is taken against content harmful to children as part of their content moderation systems. Search engines must take similar steps and, where a user is believed to be a child, large search engines must implement a safe search setting which removes the most harmful content. Content moderation teams must also be well resourced and trained.
Policies: Services must introduce clear policies on what type of content is allowed and how it is prioritised for review.
Strong governance and accountability: This includes services having a named person accountable for compliance with children’s safety requirements, an annual senior-body review of all risk management activities in relation to children’s safety and an employee Code of Conduct which sets out children’s safety standards to abide by.
Greater choice and support for children: Children must be able to provide negative feedback in response to recommended content so that they have control over what they do not want to see. Support tools should also be provided to enable children to have more control over their use of online services, such as options to disable comments on their posts and to block user accounts.
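On age checks, the following is a minimal illustrative sketch, not taken from Ofcom’s draft Codes, of how a service might gate access on a “highly effective” age-assurance result while rejecting the methods Ofcom says do not go far enough; the AgeCheckMethod names and the is_age_assured function are assumptions introduced purely for illustration.

```python
# Illustrative sketch only (not from Ofcom's draft Codes): accept the kinds of
# checks Ofcom describes as highly effective and reject those it says fall short.
from enum import Enum, auto

class AgeCheckMethod(Enum):
    PHOTO_ID_MATCHING = auto()
    FACIAL_AGE_ESTIMATION = auto()
    REUSABLE_DIGITAL_ID = auto()
    SELF_DECLARATION = auto()            # not sufficient under the draft Codes
    NON_AGE_RESTRICTED_PAYMENT = auto()  # not sufficient
    CONTRACTUAL_TERMS_ONLY = auto()      # not sufficient

# Methods of the kind Ofcom describes as "highly effective age assurance".
HIGHLY_EFFECTIVE = {
    AgeCheckMethod.PHOTO_ID_MATCHING,
    AgeCheckMethod.FACIAL_AGE_ESTIMATION,
    AgeCheckMethod.REUSABLE_DIGITAL_ID,
}

def is_age_assured(method: AgeCheckMethod, confirmed_over_18: bool) -> bool:
    """Treat a user as age-assured only when a highly effective method confirms 18+."""
    return method in HIGHLY_EFFECTIVE and confirmed_over_18

# Example: self-declaration is rejected even if the user claims to be an adult.
print(is_age_assured(AgeCheckMethod.SELF_DECLARATION, True))   # False
print(is_age_assured(AgeCheckMethod.PHOTO_ID_MATCHING, True))  # True
```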
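On algorithms, the second sketch below shows one way, under assumptions of our own rather than anything specified in the draft Codes, that a recommender feed could exclude the most harmful categories from a child’s feed entirely and down-rank other harmful content; the category labels, the Item class, the rank_feed_for_child function and the down-ranking factor are all illustrative assumptions.

```python
# Illustrative sketch only: exclude the most harmful categories from a child's
# feed and reduce the visibility of other harmful content. Category names,
# scores and the down-ranking factor are assumptions, not Ofcom requirements.
from dataclasses import dataclass, replace

MOST_HARMFUL = {"suicide", "self-harm", "eating-disorder", "pornography"}
OTHER_HARMFUL = {"violent", "hateful", "abusive", "bullying", "dangerous-challenge"}

@dataclass(frozen=True)
class Item:
    item_id: str
    category: str
    score: float  # base recommendation score from the ranking model

def rank_feed_for_child(items: list[Item]) -> list[Item]:
    """Return a child-safe feed: drop the most harmful items, down-rank the rest."""
    visible: list[Item] = []
    for item in items:
        if item.category in MOST_HARMFUL:
            continue  # never recommended to children
        if item.category in OTHER_HARMFUL:
            item = replace(item, score=item.score * 0.1)  # reduce visibility
        visible.append(item)
    return sorted(visible, key=lambda i: i.score, reverse=True)
```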
Consequences of Non-compliance
Ofcom Chief Executive Dame Melanie Dawes has said that once the measures are in force, “we won’t hesitate to use our full range of enforcement powers to hold platforms to account.” Where tech firms fail to meet their duties to protect children online, Ofcom will be able to take enforcement action, including issuing a penalty of up to 10% of qualifying worldwide revenue or £18 million (whichever is greater) and requiring remedial action to be taken.
In addition to Ofcom’s statutory powers, the regulator may use alternative compliance tools which include:
- Sending a warning letter.
- Undertaking compliance remediation – consisting of a period of engagement which gives the tech firm the opportunity to address or remedy any compliance concerns.
- Opening an enforcement programme – to understand whether there is an industry-wide issue causing harm to underaged users and to determine an appropriate response.
Next Steps For Tech Firms
The Codes form part of a consultation which will run until 17 July 2024. According to Ofcom’s roadmap on implementing the Online Safety Act, Ofcom will finalise the Codes and submit them to the Secretary of State for approval in the first half of 2025.
Once the Codes come into force, tech firms must comply with their children’s safety duties and Ofcom can enforce against non-compliance.
Whilst this is over a year away, the message from Technology Secretary Michelle Donelan is clear: “To platforms, my message is engage with us and prepare. Do not wait for enforcement and hefty fines - step up to meet your responsibilities and act now.”
David Varney is a Partner at independent UK law firm Burges Salmon