Snapchat, a popular social media platform among young users, is facing potential legal action and a substantial fine in the UK. The Information Commissioner's Office (ICO) has issued a preliminary enforcement notice, alleging that Snapchat failed to properly evaluate the privacy risks associated with its AI chatbot, My AI, especially concerning minors.
The ICO provisionally found that Snapchat's owner, Snap, did not sufficiently identify and assess the privacy risks posed by My AI, which serves several million UK users, including 13- to 17-year-olds.
The ICO's focus is on safeguarding children's privacy, given Snapchat's popularity among younger demographics. Approximately 18% of UK users fall within the 12 to 17 age group.
Snapchat has until October 27 to respond to the ICO's preliminary findings. A final decision regarding potential enforcement action will be made after considering Snap's representations.
If a final enforcement notice is issued, Snap may be compelled to cease processing data related to My AI until an "adequate risk assessment" is conducted.
The ICO has the authority to impose fines of up to £17.5 million or 4% of a company's global annual turnover, whichever is higher. A penalty of this scale underscores the importance of addressing privacy risks in AI applications.
My AI relies on OpenAI's GPT technology, a prominent player in the global AI landscape.
Snap says My AI underwent a rigorous legal and privacy review before launch, and it emphasizes its commitment to user privacy.
The case highlights the growing weight of privacy considerations in AI development, particularly for services aimed at younger users. As platforms integrate AI-powered features, assessing privacy risks up front is essential to comply with data protection law.