A Campaigning Priority for Feminists in the UK Election: Regulating Nudify Apps and Combating Intimate Image Abuse
In the digital age, the rise of deepfake abuse, particularly nonconsensual intimate imagery targeting adolescents, has become a significant concern for the UK government, prompting enhanced online safety measures through recent legislative changes.
Emanuel Maiberg, a journalist at 404 Media, has shed light on the issue, revealing that some harmful AI tools are not confined to the dark corners of the internet but are actively promoted to users by social media companies. Maiberg's reporting led Apple and Google to remove multiple nonconsensual AI nude apps from their respective app stores.
The Data (Use and Access) Act 2025 (DUAA), which received Royal Assent on 19 June 2025, introduces new offences criminalizing the creation or request of purported intimate images (deepfakes) without consent. These provisions are designed to tackle the rapid rise of AI-generated sexual deepfakes and related harms.
Amendments to the Sexual Offences Act 2003 impose penalties of up to two years' imprisonment for the nonconsensual creation of sexual deepfakes. The Online Safety Act 2023 makes it illegal to share, or threaten to share, nonconsensual sexual images or deepfakes on social media, and requires platforms to proactively remove such content or face heavy fines.
However, creating deepfakes that are kept private and never shared is not yet criminalized, leaving a legal gap. As for nudify apps, while the laws do not mention them by name, the same principles around nonconsensual intimate image creation apply, because such apps effectively produce nonconsensual sexually explicit imagery.
As of mid-2025, the UK legal framework strongly criminalizes the nonconsensual creation, request, and sharing of sexually explicit deepfakes (including those produced by nudify apps), but it still leaves enforcement gaps, particularly concerning private possession without sharing and broader uses of non-sexual deepfakes.
The upcoming general election has put women's rights, particularly online, in the spotlight for many young voters. The regulatory approach reflects a recent, targeted response to the rise in AI-generated sexual image abuse, combining updates to sexual offences law and internet safety regulation with the emerging data protection reforms contained in the DUAA.
The UK is not the only government grappling with this issue. There have been calls to criminalize the creation and distribution of deepfake tools themselves, but no such regulations are yet in place in the UK. A primary concern remains the proliferation of nudify apps on mainstream platforms, as demonstrated by ads for Perky AI, a nudify application, appearing on Instagram and Facebook.
As AI capabilities continue to advance, governments face new dangers and a need for more sophisticated regulation. The UK's approach may serve as a model for other nations addressing this complex and evolving issue.
[1] Data (Use and Access) Act 2025: https://www.legislation.gov.uk/ukpga/2025/21/contents/enacted
[2] Sexual Offences Act 2003 (Amendment) 2024: https://www.legislation.gov.uk/ukpga/2024/12/contents/enacted
[3] Online Safety Act 2023: https://www.legislation.gov.uk/ukpga/2023/12/contents/enacted
[4] Home Security Heroes Research: https://www.homesecurityheroes.com/deepfake-statistics/
[5] Abby Amoakuh, "UK to criminalise deepfake pornography, regardless of creator's intentions", The Guardian, 1 July 2024: https://www.theguardian.com/technology/2024/jul/01/uk-to-criminalise-deepfake-pornography-regardless-of-creators-intentions
- The surge in deepfake creation, particularly of intimate images, has ignited political debate, with the UK government taking proactive steps to regulate the technology and protect citizens through the Data (Use and Access) Act 2025, the Sexual Offences Act 2003 (Amendment) 2024, and the Online Safety Act 2023.
- The connection between technology and politics is vividly displayed in the UK's regulatory response to deepfakes, with legislative changes focusing on AI-generated sexual deepfakes and general-news outlets reporting on the ongoing debate over criminalizing deepfake tools themselves.