The UK's Online Safety Act Threatens Encryption and Anonymity in the Digital Sphere
The Online Safety Act 2023 (OSA), now in force, marks a significant step in the UK's efforts to safeguard users, particularly children, from illegal and harmful content online. The Act aims to strike a delicate balance between enforcing systemic accountability, preserving legal free expression, and addressing content moderation concerns.
The OSA does not focus on policing specific pieces of content directly but rather imposes system-level duties on online service providers to assess risks of illegal and harmful content and take appropriate measures to mitigate these risks. This approach seeks to avoid over-censorship while still holding platforms accountable for systemic harms.
Platforms must complete risk assessments for illegal content and children’s safety risks, then implement proportionate measures. This balances users' right to free expression against the need to remove or restrict content that is illegal or highly harmful. However, certain types of content, such as disinformation, are notably outside its direct scope.
The Act applies broadly to user-to-user services (social media, forums, gaming, file sharing, dating sites) and search services accessible to UK users or targeting the UK market, including non-UK providers with significant UK user bases. Providers designated as "categorised" may face more stringent monitoring and moderation duties, including duties regarding content harmful to children, with firm deadlines such as the full implementation of the children's safety codes from 25 July 2025.
Content categories restricted for users under 18 include sexually explicit content, suicide and self-harm encouragement, abusive or hateful content, bullying, serious violence, and content promoting dangerous challenges or substances.
While the OSA emphasizes user safety and content control, the framework operates alongside existing privacy laws such as the GDPR, so measures to detect and moderate content must still respect privacy rights. Platforms are expected to be transparent about their moderation practices and mechanisms for user redress, indirectly supporting privacy principles through accountability.
The communications regulator Ofcom oversees the OSA, publishing codes of practice, guidance, and enforcing compliance by providers. Implementation is phased, with key milestones in 2025 for risk assessments and deployment of safety duties, emphasizing a risk-based, proportionate approach.
However, concerns have been raised about the potential harm the OSA could cause to innovation, consumer choice, and user welfare in the UK. Without serious amendments, the Act could push online services to over-moderate lawful speech that they, and their users, would prefer to keep online. It could also undermine online encryption and anonymity, discourage free expression, and leave UK citizens more vulnerable to bad actors online.
In summary, the Online Safety Act seeks a nuanced balance: it enforces clear duties for online platforms to prevent illegal and child-harmful content, while not directly censoring speech or broadly policing all harmful content, and it requires that privacy safeguards inform moderation policies. This approach reflects the challenge of safeguarding online users without undermining legal freedoms and privacy protections.
- The Online Safety Act 2023 recognizes the role of data and cloud-computing technology in addressing online harm, particularly harm to children.
- The OSA imposes system-level duties on online service providers, requiring them to collect and analyze data for risk assessments of illegal and harmful content.
- Data collection under the OSA must nonetheless adhere to privacy regulations such as the GDPR, upholding user privacy rights.
- As the OSA evolves, policy and legislative debate will play a crucial role in evaluating its impact on technology, particularly on consumer choice, user welfare, and free expression.
- News media will continue to scrutinize the Act's effects on the tech industry and the broader societal trade-off between user safety and online freedoms.
- As companies implement the OSA's measures, their AI-driven moderation systems will need to be monitored for potential bias, alongside advocacy for transparency about how technology is used to protect users from harmful content.