New Regulations for Safeguarding Children and Internet Users in the UK Explained
The UK's Online Safety Act (passed as the Online Safety Bill and given Royal Assent in October 2023) sets out specific requirements for social media and other online platforms to protect both children and adults from harmful content. Its strongest child-protection duties, including mandatory age checks, came into force in July 2025, and the law places a strong emphasis on age verification, content risk assessments, and proactive content moderation.
Key Requirements and Measures for Protecting Children and Adults
- Age Verification Systems: Platforms must implement robust age checks to prevent children from accessing harmful content. Accepted methods include credit card verification, biometric age estimation from an uploaded selfie, and government ID uploads (e.g., driving licence, passport). Companies such as Bluesky, Reddit, and major adult content providers have already adopted these methods, often through third-party age verification specialists such as Persona or Epic Games' Kids Web Services.
- Filtering and Access Controls: For users under 18, or users who do not verify their age, platforms must restrict access to adult content and disable certain features, such as direct messaging, where appropriate.
- Children's Risk Assessments: Online services must conduct children's risk assessments and submit them to the UK regulator, Ofcom. These assessments must identify the risks that harmful content poses on the platform and set out how that content will be moderated or blocked.
- Prohibition of Exposure to Specific Harmful Content: Providers are legally barred from exposing children to content promoting self-harm, eating disorders, extreme violence, and similar material, a binding obligation that requires proactive moderation and content controls.
- Mandatory Enforcement and Compliance Monitoring: Ofcom monitors compliance with the age verification and content safety duties through a dedicated enforcement programme. Non-compliance can result in enforcement action, including fines of up to £18 million or 10% of qualifying worldwide revenue, whichever is greater.
- Applicability to UK-Connected Platforms: The law applies not only to dedicated adult content sites but also to social media and search services that host some adult or risky content; those platforms must apply the same safety measures for their UK users.
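The filtering and access-control duties above can be sketched as simple server-side gating logic. All names here (`User`, `can_view_adult_content`, `direct_messaging_enabled`) are hypothetical illustrations, not part of the Act or of any real platform's API:

```python
from dataclasses import dataclass
from typing import Optional

# Hypothetical sketch: under the Act's child-safety duties, unverified or
# under-18 UK users lose access to adult content, and platforms may also
# disable features such as direct messaging for them.

@dataclass
class User:
    age_verified: bool                 # passed a robust age check (ID, credit card, selfie)
    verified_age: Optional[int] = None  # age established by the check, if any

def can_view_adult_content(user: User) -> bool:
    # Adult content requires a successful age check establishing 18+.
    return user.age_verified and (user.verified_age or 0) >= 18

def direct_messaging_enabled(user: User) -> bool:
    # One possible policy: tie DM access to the same 18+ verification.
    return can_view_adult_content(user)

# An unverified visitor is restricted on both counts; a verified adult is not.
guest = User(age_verified=False)
adult = User(age_verified=True, verified_age=34)
```

The key design point is that verification state, not a self-declared birth date, drives every gate, which mirrors the Act's requirement for "highly effective" age assurance rather than a simple checkbox.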
Challenges and User Response
- Some users attempt to bypass age verification and restrictions using VPNs, modified browsers, or third-party clients, highlighting ongoing enforcement challenges.
- The Wikimedia Foundation has brought a legal challenge against regulations that could subject it to the Act's most stringent obligations (Category 1 duties, including identity verification), citing privacy risks and practical concerns for volunteer-run platforms. The case illustrates the difficulty of balancing safety with privacy and with platform-specific needs.
Summary Table of Measures
| Measure | Description | Platforms & Examples |
|---|---|---|
| Age Verification | Credit card, biometric selfie, or government ID upload to confirm users are 18+ | Bluesky, Reddit, Pornhub |
| Access Restrictions | Blocking adult or harmful content and disabling features for unverified or underage users | Bluesky (messaging disabled) |
| Children's Risk Assessments | Platforms must document risks and report to Ofcom | Mandatory for many platforms |
| Prohibition of Harmful Content | Banning exposure of children to self-harm, eating-disorder, and extreme-violence content | Enforced by Ofcom |
| Enforcement and Penalties | Ofcom's monitoring programme and legal action against non-compliant platforms | Ongoing since July 2025 |
The UK Online Safety Act requires technology companies to embed age-appropriate safety measures, conduct thorough risk assessments, and actively block or remove harmful content. It promises greater protection for children and adults online, while raising practical and privacy challenges in implementation.
- Under the Act, technology companies must strengthen their safeguards for UK users, in particular by implementing age verification systems (credit card checks, biometric selfie analysis, or government ID verification) to keep children and unverified adults away from harmful content.
- They must also practise proactive content moderation: filtering and restricting access to adult content for users who are unverified or under 18, and disabling certain features, such as direct messaging, for those users.