Snapchat My AI analyzes everything you share, even "private" snaps. In April 2023, Snapchat rolled out "My AI" to all users. By 2024, regulators on both sides of the Atlantic were investigating.

## The Investigations

**UK ICO (2023-2024):**

- Opened an investigation after Snapchat launched My AI without assessing its data protection risks
- Issued a preliminary enforcement notice threatening to shut down My AI in the UK
- Snapchat submitted five revisions of its privacy risk assessment before reaching compliance

**US FTC:**

- Complaint alleging risks to young users
- Investigation into potential violations of minors' data privacy laws

## What My AI Knows

The chatbot doesn't just read your messages. It analyzes:

- Background images in photos you send
- Location data from your snaps
- Conversation patterns and emotional states
- Friend connections and relationship dynamics
- Purchase intent from discussions

One researcher reported that My AI knew the names of their cats, despite never having mentioned them in chat. Another teen discovered the AI had inferred their state by analyzing a photo they sent.

## The Teen Problem

My AI launched without appropriate safeguards:

- Chatted with minors about covering up drug use
- Discussed intimate topics with underage users
- Offered no meaningful age-gating at launch

In January 2024, Snapchat finally added parental controls allowing restrictions on My AI access.

## Data Training

Snapchat's privacy policy states that it may use:

- Public Spotlight posts
- Public Stories
- Snap Map content

...to train its generative AI models. Your "private" account may be training the next version.

## Staying Safe

**For Teens:**

- Never share sensitive info with My AI
- Use the "Clear from Chat Feed" option
- Ask parents to restrict My AI access

**For Parents:**

- Enable Family Center parental controls
- Restrict My AI interactions
- Discuss AI privacy with your children

**The Bottom Line:** My AI isn't your friend. It's a data harvester wearing a smiley face.