ByteDance-Owned App Fined for Violating Children’s Privacy
TikTok will pay Irish data privacy regulators 345 million euros to settle allegations it violated the privacy of underage users.
The Irish Data Protection Commission determined that TikTok nudged young users into setting up accounts that were visible to the public by default and allowed child user accounts to be paired with unverified non-child users. The agency opened an investigation into the short-form video app's privacy practices on its own authority in 2021, covering the second half of 2020.
The decision gives TikTok three months to bring its privacy features into compliance with Europe's General Data Protection Regulation. The company's European headquarters is in Dublin. The company has already racked up fines over children's privacy allegations in the United Kingdom and cookie consent GDPR violations in France. Independent analysis has concluded that, at least for adults, TikTok's data collection is typical of social media platforms. The firm's Chinese ownership – its parent company is Beijing-based ByteDance – has provoked backlash across Europe and North America and led governments to ban the app on official devices (see: TikTok Says US Threatens Ban Unless Chinese Owners Divest).
A TikTok spokesperson said the company disagreed with the Irish Data Protection Commission's decision. "The DPC's criticisms are focused on features and settings that were in place three years ago, and that we made changes to well before the investigation even began, such as setting all under 16 accounts to private by default."
Friday's decision from the agency comes after the European Data Protection Board in August ordered the Irish DPC to publish a final decision within a month. Disagreements among European data protection authorities over the Irish agency's September 2022 draft decision led Dublin in May to invoke an intra-European dispute resolution mechanism centered in Brussels (see: Irish DPC Will Conclude TikTok Privacy Probe Within Weeks).
One disagreement was whether TikTok violated the GDPR's principle of "fairness" in processing personal data by creating a dark pattern – a user interface designed to steer users toward a company's preferred outcome. The final decision finds that TikTok did violate the GDPR's requirement that personal data be processed fairly.
Another disagreement was over whether TikTok violated the GDPR by failing to implement by default technical controls for age verification under Article 25 of the regulation. The final decision does find that TikTok violated its responsibilities by not considering the risks posed to users under the age of 13 when it set the default account setting to public, allowing anyone to see underage users' content. But the final decision says that is a violation of Article 24, which governs the responsibilities of data controllers, rather than a violation of the GDPR's language on data protection by design.