UK regulator to probe Telegram, teen chat sites for potential child safety violations
The UK’s online safety regulator said Tuesday that it has opened an investigation into Telegram after receiving evidence that the platform allegedly facilitated the sharing of child sexual abuse material (CSAM).
The Telegram probe was launched after the Canadian Centre for Child Protection shared information that allegedly showed CSAM is present and shared on the platform, according to a press release from the regulator, Ofcom. After the Canadian group shared the evidence, Ofcom conducted its own assessment of the messaging service and determined that an investigation was warranted.
The investigation is specifically probing whether Telegram has violated the UK’s Online Safety Act. The law requires providers of what Ofcom calls user-to-user services to monitor and address the risks of child sexual abuse and exploitation activities on their platforms.
Ofcom also announced a probe of two teen chat sites, Chat Avenue and Teen Chat, to determine whether they are doing enough to protect children from being groomed by child predators. The Chat Avenue investigation also will examine whether the service is taking sufficient steps to prevent children from seeing harmful content like pornography, Ofcom said.
A spokesperson for Telegram said in a statement that the platform has “virtually eliminated the public spread of CSAM on its platform through world-class detection algorithms and cooperation with NGOs.”
“Telegram categorically denies Ofcom’s accusations,” the statement said. “We are surprised by this investigation and concerned that it may be part of a broader attack on online platforms that defend freedom of speech and the right to privacy.”
Teen Chat and Chat Avenue did not immediately respond to requests for comment.
Ofcom said it launched its probe of the chat sites based on feedback from child protection agencies it works with to pinpoint services that may be facilitating grooming.
Teen Chat and Chat Avenue both have open chatrooms and offer private messaging and media sharing options.
Ofcom said it has spoken to the chat sites about their practices and remains unsatisfied with their responses about how they fight grooming.
“Child sexual exploitation and abuse causes devastating harm to victims, and making sure sites and apps tackle this is one of our highest priorities,” Suzanne Cater, director of enforcement at Ofcom, said in a statement.
“It’s why we work so closely with partners in law enforcement and child protection organizations to identify where these harms are occurring and hold providers to account where they’re failing to meet their obligations.”
If Ofcom’s investigation determines that the firms have violated the law, it will issue a provisional decision, to which the companies will have a chance to respond before a final determination is made.
When companies are found to have broken the Online Safety Act, Ofcom has the power to force platforms to make changes and can levy fines of up to £18 million ($24.3 million) or 10% of qualifying worldwide revenue.
Suzanne Smalley
is a reporter covering digital privacy, surveillance technologies and cybersecurity policy for The Record. She was previously a cybersecurity reporter at CyberScoop. Earlier in her career Suzanne covered the Boston Police Department for the Boston Globe and two presidential campaign cycles for Newsweek. She lives in Washington with her husband and three children.