EU panel fines TikTok €345 million for child settings
The Irish Data Protection Commission (DPC) fined social media giant TikTok €345 million ($368 million) on Friday for violations of European Union privacy regulations related to how the platform dealt with its child users.
Investigators looked at TikTok’s policies from July 31 to December 31, 2020, finding that the platform made the accounts of children public by default. TikTok also allowed parents or guardians to make other changes through linked accounts, but without verifying the relationship between the adult and child, investigators found.
The regulators define child users as those between the ages of 13 and 17. The investigation began in September 2021 but focused specifically on that six-month stretch of 2020; a DPC spokesperson did not respond to questions about why that period was chosen.
In addition to a reprimand and the fine, TikTok also will be given three months “to bring its processing into compliance” with the EU’s General Data Protection Regulation (GDPR).
A TikTok spokesperson told Recorded Future News that the company disagrees with the decision, “particularly the level of the fine imposed.” The spokesperson said the company is still weighing its next steps.
“The DPC's criticisms are focused on features and settings that were in place three years ago, and that we made changes to well before the investigation even began, such as setting all under 16 accounts to private by default,” the spokesperson said.
European regulators now have a history of levying hefty fines against tech giants, slapping Meta with a $1.3 billion fine in May and Amazon with an $800 million fine in 2021. The EU has previously fined Instagram, WhatsApp and Facebook — all of which are owned by Meta — as well as Google.
The investigators found that when children signed up for TikTok, their accounts and the comments on their videos were public by default, alongside other settings that exposed them to adults.
A feature called “Family Pairing,” which allowed the accounts of children to be linked to the accounts of parents or guardians, was also at odds with the GDPR because TikTok did not check whether the linked adult was actually the child’s parent or guardian. The adult user could also enable direct messaging for child users above the age of 16.
TikTok also violated other GDPR rules by failing to verify a child’s age during the registration process, potentially exposing the personal data of children under the age of 13.
Changes in place
“We believe our settings have always given users control over whether to choose a public or private account, but in January 2021 (eight months before the DPC launched its investigation), we became the first major platform to make all existing and new accounts for 13 to 15 year olds private by default,” said Elaine Fox, TikTok’s head of privacy for Europe, in a post responding to the decision.
Fox said TikTok now limits who can comment on videos made by anyone between the ages of 13 and 15 — only offering a choice between “Friends” or “No One.” They have also made the “Suggest your account to others” setting off by default for 13- to 15-year-olds.
TikTok also plans to roll out a “redesigned” account registration process for 16- and 17-year-old users in which “private account” will be pre-selected.
The Family Pairing feature no longer allows adult users to enable direct messaging for 16- and 17-year-olds if those users have already disabled it. Fox did not address the issue raised by regulators, namely that there is still no way to verify whether someone is actually a parent or guardian, but said the linked adult accounts now allow users to set daily screen time limits, mute notifications and filter content.
Fox said the platform now has a team that monitors accounts suspected of being run by underage users. TikTok reportedly removed almost 17 million accounts believed to be run by underage users in the first three months of 2023.
Ireland’s commission is the lead agency for privacy regulation in the EU. The DPC reached a draft decision on September 1, 2022 and submitted it to other European data privacy regulators on September 13, 2022. There was “broad consensus” about its findings, but GDPR rules allow regulators from other EU nations to raise objections about potential errors or overlooked violations.
Officials in Italy and Germany felt the inquiry had ignored other violations related to “dark patterns” — manipulative online designs that trick users into signing up for services they don’t want — and the age verification issue.
The European Data Protection Board mediated the dispute and decided on August 3, 2023 to include a reference to the age verification issue while leaving out the concern about dark patterns. The decision was officially adopted on September 1, 2023.
The platform is used by about 134 million people across the European Union.
Jonathan Greig is a Breaking News Reporter at Recorded Future News. Jonathan has worked across the globe as a journalist since 2014. Before moving back to New York City, he worked for news outlets in South Africa, Jordan and Cambodia. He previously covered cybersecurity at ZDNet and TechRepublic.