TikTok fined nearly $11 million by Italian regulator
Italian authorities fined TikTok $10.9 million on Thursday for fueling the spread of videos likely to harm the “psycho-physical safety” of users, according to a press release from the country’s Competition Authority (AGCM).
The fine, the result of a year-long investigation, came a day after the U.S. House of Representatives voted to effectively ban the platform, with members of Congress demanding that its Chinese owner, ByteDance, divest it or see it barred from operating in the U.S.
AGCM said its probe of the platform’s algorithm, which focused particularly on how TikTok affects minors and other vulnerable users, came in part as a response to the so-called “French scar” challenge that went viral on the app. The challenge required users to share videos of facial scarring, leading many to injure their skin in order to participate.
“TikTok has not taken adequate measures to prevent the dissemination of such content, not fully complying with the guidelines it has adopted and which it has made known to consumers by reassuring them that the platform is a 'safe' space,” the release said.
Moreover, the platform’s guidelines are insufficient, AGCM said, citing the fact that they are applied “without adequately accounting for the specific vulnerability of adolescents, characterized by peculiar cognitive mechanisms from which derive, for instance, the difficulty in distinguishing reality from fiction and the tendency to emulate group behavior.”
The Italian authority also said that TikTok’s spread of content through its recommendation system is problematic because it is based on what the regulator called “algorithmic user profiling.” The press release said that such profiling causes the platform to target users with videos meant to increase their engagement and drive advertising revenue.
“This causes undue conditioning of users who are stimulated to increasingly use the platform,” the press release said.
The European Commission announced last month that it had launched a probe to determine whether TikTok is violating the bloc’s Digital Services Act (DSA) through its failure to verify users’ ages, protect their privacy and prevent addiction to the app. That investigation is also focused on whether the platform violated the DSA through nontransparent advertising practices and failing to protect minors.
TikTok’s algorithms seem to drive “rabbit hole effects” for users, the commission, which is the executive arm of the EU, said at the time. The regulator said it is also focused on the app’s “assessment and mitigation of systemic risks,” and how it handles “actual or foreseeable negative effects.”
Suzanne Smalley is a reporter covering privacy, disinformation and cybersecurity policy for The Record. She was previously a cybersecurity reporter at CyberScoop and Reuters. Earlier in her career Suzanne covered the Boston Police Department for the Boston Globe and two presidential campaign cycles for Newsweek. She lives in Washington with her husband and three children.