Regulatory

Global: TikTok Faces €345 Million Fine for Violating EU Data Regulations on Children’s Accounts

TikTok has been hit with a €345 million (£296 million) fine for breaching EU data regulations on the handling of children’s accounts, including failing to protect underage users’ content from public visibility.

The Irish data regulator, responsible for overseeing TikTok’s operations across the EU, has determined that the Chinese-owned video platform committed multiple infractions of GDPR rules.

Among its findings, the regulator concluded that TikTok violated GDPR by setting child users’ accounts to public view by default, failing to provide clear information to child users, allowing an adult linked to a child’s account through the “family pairing” feature to enable direct messaging for child users aged 16 and over, and inadequately assessing the risks faced by under-13s whose accounts were placed on a public setting.

The Irish Data Protection Commission (DPC) observed that users aged 13 to 17 were directed through the sign-up process in a manner that resulted in their accounts being set to public view by default, making their content accessible to anyone. Additionally, the “family pairing” scheme, which grants an adult control over a child’s account settings, did not verify whether the adult “paired” with the child user was a parent or guardian.

The DPC ruled that TikTok, which has a minimum user age of 13, failed to adequately consider the risks posed to underage users who accessed the platform. The default public setting allowed anyone to view content posted by these users.

The Duet and Stitch features, which permit users to combine their content with that of others on TikTok, were enabled by default for users under 17. However, the DPC did not find any GDPR violations in the methods TikTok used to verify users’ ages.

This decision by the DPC follows TikTok being fined £12.7 million in April by the UK data regulator for unlawfully processing the data of 1.4 million children under 13 who used the platform without parental consent. The Information Commissioner stated that TikTok had made minimal efforts, if any, to verify the identity of platform users.

TikTok responded by pointing out that the investigation examined the company’s privacy practices between July 31 and December 31, 2020, and noted that it had already addressed the issues raised by the inquiry. Since 2021, all existing and new TikTok accounts for users aged 13 to 15 have been automatically set to private, meaning only individuals approved by the user can view their content.

TikTok also expressed disagreement with the DPC’s decision, particularly regarding the magnitude of the fine imposed. The company argued that the criticisms primarily pertained to features and settings in place three years ago, which were altered before the investigation commenced, such as setting all under-16 accounts to private by default.

Additionally, the DPC acknowledged that it had been overruled on certain aspects of its decision by the European Data Protection Board, a body composed of data and privacy regulators from EU member states. This forced the DPC to incorporate a proposed finding from the German regulator, which asserted that TikTok’s use of “dark patterns” (deceptive website and app designs that influence user behavior or choices) violated a GDPR provision related to the fair processing of personal data.
