TikTok could face £27 million fine for failing to protect UK children’s privacy

Social media platform TikTok could face a fine of £27 million after an investigation by the UK’s data protection watchdog found it may have failed to protect children’s privacy.

The Information Commissioner’s Office (ICO) announced on Monday that it had issued TikTok with a “notice of intent”, a legal document that the company can respond to before a potential fine is imposed.

According to the notice, TikTok may have breached British data protection law between May 2018 and July 2020 by processing the data of children under the age of 13 without appropriate parental consent.

The notice also alleges that the social media platform processed special category data — the most sensitive kind of user data, covering political and religious beliefs, as well as ethnicity and sexuality — without the legal grounds to do so.

The investigation is preliminary and the notice’s findings are provisional, the ICO said, adding it would “carefully consider any representations from TikTok before taking a final decision.”

The UK’s Information Commissioner, John Edwards, said that children should be able to experience the digital world with proper privacy protections.

“Companies providing digital services have a legal duty to put those protections in place, but our provisional view is that TikTok fell short of meeting that requirement,” he said in a statement.

A spokesperson for TikTok stressed the provisional nature of the notice of intent to The Record and added: “While we respect the ICO’s role in safeguarding privacy in the UK, we disagree with the preliminary views expressed and intend to formally respond to the ICO in due course.”

Although TikTok requires users to be over the age of 13, a report published by the UK’s communications regulator Ofcom earlier this year suggested that its age restrictions were not being followed.

Ofcom’s report, which was based on a survey of parents, suggested that children as young as three were being allowed to watch videos on the platform.

At the time, TikTok said it strictly enforced its minimum age requirements and was committed to protecting young people.

The ICO said it has six ongoing investigations into companies providing digital services that it suspects have not taken their responsibilities around child safety online seriously.

John Edwards added: “I’ve been clear that our work to better protect children online involves working with organizations but will also involve enforcement action where necessary.”

The regulator has developed an Age Appropriate Design Code to inform online services, from apps to social media sites, about their data protection obligations if their services are likely to be accessed by children.

The code came into force in the UK in 2021 and appeared to prompt several global changes from social media companies, including Meta’s Instagram, which began requiring all users to enter their birthdate to log in, even if they hadn’t registered one when signing up.

Alexander Martin is the UK Editor for Recorded Future News. He was previously a technology reporter for Sky News and is also a fellow at the European Cyber Conflict Research Initiative.