UPDATE – JUNE 2025: The UK Information Commissioner’s Office (ICO) has concluded its investigation into Snapchat’s “My AI” chatbot. The ICO is satisfied that Snap has now conducted a compliant data protection risk assessment and implemented appropriate mitigations. While no enforcement action will be taken at this time, the ICO emphasized that this should serve as a “warning shot” to the industry. A final Commissioner’s decision will be published in the coming weeks.
ORIGINAL STORY:
UK Data Regulator Targets Snap Over “My AI” Privacy Concerns
On October 6, 2023, the UK’s Information Commissioner’s Office (ICO) issued a preliminary enforcement notice against Snap Inc. and Snap Group Limited, the companies behind the social media platform Snapchat. The ICO cited concerns that Snap failed to conduct a sufficient privacy risk assessment before launching its “My AI” chatbot, particularly in relation to users aged 13–17 in the UK.
Although the UK currently lacks a dedicated AI law, companies deploying AI technologies are still subject to the UK GDPR and the Data Protection Act 2018.
ICO: Risk Assessment Was Inadequate for Teen Users
Snapchat’s “My AI,” introduced in February 2023, is a generative AI chatbot embedded within the app. The investigation found that Snap’s original risk assessment did not adequately consider the data risks posed to children and teens.
The preliminary notice indicated that Snap could be required to temporarily suspend processing of UK users’ data in connection with “My AI.” However, this step would only be enforced if Snap failed to submit a compliant and comprehensive risk assessment addressing the ICO’s concerns.
Not a Ban—But a Compliance Mandate
The ICO clarified that it is not pursuing a permanent ban on Snapchat’s “My AI.” Instead, it is requiring Snap to complete a thorough risk assessment before continuing to offer the service.
Snap has been given the opportunity to respond before the ICO issues a final decision. The process reflects the ICO’s broader effort to hold companies deploying AI accountable under existing data protection laws.
Industry-Wide Implications
The ICO emphasized that the notice concerning Snapchat’s “My AI” should serve as a warning to the entire tech sector, underscoring that AI tools must be properly assessed for privacy risks, especially when used by minors. The enforcement action follows the ICO’s previous statement outlining how UK data protection law applies to generative AI systems.
The ICO’s intervention shows that AI deployments are already subject to regulation in the UK, even in the absence of a standalone AI statute.
Need Help Navigating UK AI and Data Privacy Law?
If your organization is deploying AI tools, especially those involving automated decision-making or user interaction, BABL AI can help. Our audit experts provide independent risk assessments, privacy reviews, and compliance strategies aligned with UK and international data protection laws.