eSafety Forces Major ‘Nudify’ Provider Out of Australia as Crackdown on AI-Generated Child Exploitation Expands

Written by Jeremy Werner

Jeremy is an experienced journalist, skilled communicator, and constant learner with a passion for storytelling and a track record of crafting compelling narratives. He has a diverse background in broadcast journalism, AI, public relations, data science, and social media management.
Posted on 12/10/2025
In News

Australia’s eSafety Commissioner has confirmed that a major UK-based provider of three globally popular “nudify” services has withdrawn access for Australian users after enforcement action over the creation of AI-generated child sexual exploitation material (CSEM).

The services, which received roughly 100,000 monthly visits from Australians, have been linked to high-profile incidents involving AI-generated sexual exploitation of school students. In September, eSafety issued the company a formal warning for breaching Australia’s mandatory online safety Codes and Standards by allowing its tools to be used to “undress” minors digitally.

eSafety Commissioner Julie Inman Grant said the decision demonstrates that Australia’s world-leading regulatory framework is having real impact. “We know ‘nudify’ services have been used to devastating effect in Australian schools and with this major provider blocking their use by Australians we believe it will have a tangible impact on the number of Australian school children falling victim to AI-generated child sexual exploitation,” she said.

The commissioner noted the provider had marketed harmful features, including options to undress “any girl,” generate “schoolgirl” images, and toggle a “sex mode.”

The action comes as major AI hosting platform Hugging Face moves to comply with Australian law following warnings that generative AI models on its site were being misused by Australians to produce CSEM. Hosting platforms function as critical distribution hubs for powerful AI models, making safeguards essential, eSafety said.

Hugging Face has now updated its terms of service to require all account holders to mitigate risks that their uploaded models could be used to generate child sexual exploitation or pro-terror material. If users violate the terms, the platform must take enforcement action—or face potential penalties under the Online Safety Act, including fines up to $49.5 million.

“By targeting both the consumer tools, the underlying models that power them, and the platforms that host them, we’re tackling harm at multiple levels of the technology stack,” Inman Grant said, adding that further government reforms to restrict access to nudify tools are underway.

Need Help?

If you’re wondering how Australia’s AI policy, or any other government’s policies, bills, or regulations, could impact you, don’t hesitate to reach out to BABL AI. Their Audit Experts are ready to provide valuable assistance and answer your questions and concerns.

Subscribe to our Newsletter

Keep up with the latest on BABL AI, AI Auditing and AI Governance News by subscribing to our newsletter.