UK’s ICO Issues Notice Against Snapchat’s AI Chatbot

Written by Jeremy Werner

Jeremy is an experienced journalist, skilled communicator, and constant learner with a passion for storytelling and a track record of crafting compelling narratives. He has a diverse background in broadcast journalism, AI, public relations, data science, and social media management.
Posted on 10/06/2023
In News

UPDATE – JANUARY 2026:

In July 2025, the UK Information Commissioner’s Office (ICO) issued its final decision on Snapchat’s “My AI” chatbot. The ICO confirmed that Snap had submitted a compliant Data Protection Impact Assessment and had implemented appropriate mitigations, including new safeguards designed specifically for teen users.

As a result, no enforcement action will be taken. However, the regulator stressed that the case represents a “precedent-setting warning shot” to the wider industry: any generative AI product launched without robust, child-focused risk assessments can expect heightened scrutiny. Although the investigation is formally closed, Snap will remain under ongoing monitoring through 2026 and must provide periodic compliance reports to demonstrate continued adherence to UK data protection law.


ORIGINAL NEWS STORY:

UK’s ICO Issues Notice Against Snapchat’s AI Chatbot

On October 6, 2023, the UK’s Information Commissioner’s Office (ICO) issued a preliminary enforcement notice against Snap Inc. and Snap Group Limited, the companies behind the social media platform Snapchat. The ICO cited concerns that Snap had failed to conduct an adequate privacy risk assessment before launching its “My AI” chatbot, particularly with respect to users aged 13–17 in the UK.

Although the UK currently lacks a dedicated AI law, companies deploying AI technologies are still subject to the UK GDPR and the Data Protection Act 2018.

ICO Findings in Snap My AI Investigation: Teen Privacy Risks

Snapchat’s “My AI,” introduced in February 2023, is a generative AI chatbot embedded within the app. The investigation found that Snap’s original risk assessment did not adequately consider the data risks posed to children and teens.

The preliminary notice indicated that Snap could be required to stop processing data connected with the “My AI” chatbot for UK users. However, this measure would only be enforced if Snap failed to submit a compliant and comprehensive risk assessment that addresses the ICO’s concerns.

Snap My AI ICO Compliance Requirements and Risk Mitigations

The ICO clarified that it is not pursuing a permanent ban on Snapchat’s “My AI.” Instead, it is requiring Snap to carry out a thorough risk assessment before continuing to offer the service.

Snap has been given the opportunity to respond before the ICO issues any final decision. The case is part of the ICO’s broader effort to hold companies accountable under existing data protection laws when they deploy AI.

ICO Warning to Generative AI Companies and Privacy Risk Obligations

The ICO emphasized that the notice against Snapchat’s “My AI” should serve as a warning to the entire tech sector, underscoring that AI tools must be properly assessed for privacy risks, especially when used by minors. The enforcement action follows the ICO’s previous statement outlining how UK data protection law applies to generative AI systems.

The ICO’s intervention shows that AI deployments are already subject to regulation in the UK, even in the absence of a standalone AI statute.

Need Help Navigating UK AI and Data Privacy Law?

If your organization is deploying AI tools, especially those involving automated decision-making or user interaction, BABL AI can help. Our Audit Experts provide independent risk assessments, privacy reviews, and compliance strategies aligned with UK and international data protection laws.
