EU asks Meta, TikTok to account for their response to Israel-Hamas disinformation
The European Commission sent Meta and TikTok letters Thursday, requesting information on the platforms’ efforts to rein in disinformation relating to the Israel-Hamas war.
The letter cited the platforms’ obligations under Europe’s landmark Digital Services Act (DSA), which regulates the tech behemoths and went into effect in November.
Under the DSA, the companies could be fined up to 6% of their global annual revenue if they are found to be out of compliance. In extreme circumstances, the commission also could ban the platforms from operating in the EU.
The commission said in a press release that the European Union wants to audit the actions the companies have taken to comply with the DSA. It made a similar request of Elon Musk’s X social media platform, previously known as Twitter, last week.
According to the press release, the commission is requesting that Meta provide information about measures it has taken to perform risk assessments and propose “mitigation measures” to stop misinformation on the platform as required by the DSA.
The EC said it expects the tech giant to respond with details about how it has met its DSA obligations “following the terrorist attacks across Israel by Hamas, in particular with regard to the dissemination and amplification of illegal content and disinformation.”
The commission also asked Meta for details on how it plans to protect the integrity of elections.
Meta has until Oct. 25 to respond to the EC on its actions relating to the Israel-Hamas war and until Nov. 8 to account for its election work.
Because they are designated as “very large online platforms” by the commission, Meta subsidiaries Facebook and Instagram are required to comply with all DSA provisions, including by measuring and mitigating risks related to spreading illegal content, disinformation, and “any negative effects on the exercise of fundamental rights.”
Meta did not immediately respond to a request for comment.
The letter to TikTok sought information on its efforts to stop the spread of illegal content, particularly “terrorist and violent content and hate speech,” in addition to disinformation.
The commission also requested details on TikTok’s compliance with other elements of the DSA, specifically related to how it protects minors online.
The company must respond to the EC by Oct. 25.
TikTok did not immediately respond to a request for comment.
A disinformation researcher interviewed by Recorded Future News said the commission’s action shows it is serious about enforcing the DSA — a signal that he called very significant. The European Commission is the EU’s politically independent executive arm. It is responsible for proposing new European legislation, among other things.
“The world no longer has to take platforms’ word for it that they’re doing their best to stop the spread of harmful content — the EU is requesting proof,” said Joseph Bodnar, a research analyst at the Alliance for Securing Democracy at the German Marshall Fund. “That’s a big deal.”
He called the action “a big step towards holding these companies accountable for their roles in the online information environment, which right now is flooded with disinformation and scenes of violence.”
Meta issued a blog post on its website last week saying that within three days of the war beginning it had removed or marked as disturbing 795,000 pieces of content.
The platform also said it established a “special operations center” with fluent Hebrew and Arabic speakers to “closely monitor and respond to this rapidly evolving situation in real time.”
The tech giant updated its post on Wednesday, saying it has also begun changing default settings for comments and restricting who can comment on posts from the region.
TikTok also posted details of its response on its website, saying it too has launched a command center. The platform also said it has evolved its “proactive automated detection systems in real-time” and added more Arabic- and Hebrew-speaking content moderators.
Suzanne Smalley
is a reporter covering privacy, disinformation and cybersecurity policy for The Record. She was previously a cybersecurity reporter at CyberScoop and Reuters. Earlier in her career Suzanne covered the Boston Police Department for the Boston Globe and two presidential campaign cycles for Newsweek. She lives in Washington with her husband and three children.