UK regulators demand social media platforms make it harder for kids under 13 to access sites
Two U.K. regulators on Thursday published warnings demanding that Facebook, Instagram, Snapchat, TikTok, YouTube and other large platforms used by children “take urgent steps” to integrate robust age assurance tools into their sites.
The Information Commissioner’s Office (ICO) and Ofcom stressed that they expect immediate action, with Ofcom saying that firms have until the end of April to report back on their plans. The ICO said that it has “started direct engagement with some of the highest risk services and expect them to work directly with us to strengthen their age assurance measures over the next two months.”
The regulators’ public call to action comes as countries across Europe consider or implement social media bans for children and focus intently on child safety online.
In January, the British government announced it is considering a social media ban for children under age 16 and said it is consulting with Australia to learn about the impact and efficacy of Australia’s own ban, which took effect in December. On Monday, members of Parliament voted down a proposed ban, but one could still take effect after the British government finishes an ongoing “consultation” process.
In its open letter, the ICO said it is considering “further regulatory action” if platforms do not do more to ensure that children under age 13 cannot access their platforms.
The ICO said it has found that many platforms set a minimum age of 13 but rely on children to honestly report their ages as their sole enforcement mechanism.
“As self-declaration is easily circumvented, this means underage children can easily access services that have not been designed for them,” the ICO letter said. “This puts under-13s at risk by allowing their information to be collected and used unlawfully, without the protections they are entitled to.”
The regulator emphasized that age assurance technologies have become much more effective in recent years but that many services have failed to begin using the technology.
Ofcom’s warning said that social media platforms and Roblox have privately assured it that they are committed to creating safe online ecosystems for kids. The regulator said it plans to make the companies’ responses to its demand for action public in May and will then “announce any next steps for regulatory action.”
“These online services are household names, but they’re failing to put children’s safety at the heart of their products,” Dame Melanie Dawes, Ofcom’s chief executive, said in a statement. “There is a gap between what tech companies promise in private, and what they’re doing publicly to keep children safe on their platforms.”
Ofcom’s four demands are that platforms implement effective age assurance protocols, “failsafe” grooming protections, safer feeds and an end to product testing on children.
The regulator said its research shows that 72% of children aged 8 to 12 are accessing the platforms’ sites and apps.
Suzanne Smalley is a reporter covering digital privacy, surveillance technologies and cybersecurity policy for The Record. She was previously a cybersecurity reporter at CyberScoop. Earlier in her career, Suzanne covered the Boston Police Department for the Boston Globe and two presidential campaign cycles for Newsweek. She lives in Washington with her husband and three children.