Image: Alain Rolland / European Parliament

Big tech vows to continue CSAM scanning in Europe despite expiration of law allowing it

A European Union law allowing tech companies to scan communications for child sexual abuse materials (CSAM) expired Saturday, but several tech giants have vowed to continue the scanning despite the potential legal risk they now face.

Microsoft, Google, Meta and Snapchat released a statement on Friday saying they “reaffirm their continued commitment to protecting children and preserving privacy, and will continue to take voluntary action” to carry on the scans.

The tech giants’ statement linked to a letter signed by 247 child safety organizations decrying lawmakers’ decision to let the law allowing scanning expire.

“Europe risks leaving children across the globe less protected from the most abhorrent harm,” the tech firms’ statement said.

European officials have cautioned that the scanning now violates European Union law. 

“Without a legal basis, companies are no longer allowed to proactively detect child sexual abuse in private communications,” Commission spokesperson Guillaume Mercier said in a statement provided to Politico. Mercier did not immediately respond to a request for comment.

The decision to let the law allowing scanning expire was hotly contested. Critics of the law said the scanning allowed indiscriminate surveillance and represented a huge privacy violation.

But law enforcement officials, several European commissioners and German Chancellor Friedrich Merz all strongly supported maintaining legal protections for continuing the scans.

Catherine De Bolle, the executive director of Europol, published a statement saying that CSAM has been on the rise and that law enforcement will now be hobbled as it tries to combat its spread.

The two sides are dug in, and a compromise has proved elusive. Lawmakers have been negotiating a permanent solution since November 2023 but have been unable to agree on terms.

The tech giants have previously expressed alarm about the expiration of the law allowing them to scan.

“Failure to act will reduce the legal clarity that has enabled companies for nearly 20 years to voluntarily detect and report known child sexual abuse material (CSAM) in interpersonal communication services, leaving children across Europe and around the world with fewer protections than they had before,” a March 19 statement from Google, Snapchat, Microsoft, Meta and TikTok said.

Critics have said that the tools used for scanning have led to false accusations of abuse, but the tech companies assert that their detection tools are highly accurate. The detection regime relies on hash matching, which compares content against unique digital fingerprints, or hashes, of previously identified CSAM stored in a database.

“The system ensures high-precision detection while adhering to privacy principles,” the tech giants said in the March 19 statement.
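In essence, hash matching checks whether a piece of content’s fingerprint appears in a database of fingerprints of previously identified material. The sketch below illustrates the idea with a cryptographic hash; note that production systems typically use perceptual hashes (such as Microsoft’s PhotoDNA), which can match visually similar images rather than only byte-identical files, and the database entry here is a hypothetical example.

```python
import hashlib

# Hypothetical database of hashes of previously identified material.
# (This entry is the SHA-256 of the bytes b"foo", used purely for
# illustration; real databases hold hashes of confirmed CSAM.)
KNOWN_HASHES = {
    "2c26b46b68ffc68ff99b453c1d30413413422d706483bfa0f98a5e886266e7ae",
}

def matches_known_material(data: bytes) -> bool:
    """Return True if the content's hash appears in the database.

    A cryptographic hash like SHA-256 only matches byte-identical
    files; perceptual hashing tolerates resizing and re-encoding.
    """
    return hashlib.sha256(data).hexdigest() in KNOWN_HASHES
```

Because the scan compares fingerprints rather than inspecting content directly, proponents argue it preserves privacy for non-matching material; critics counter that any matching error still produces a false accusation.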

Suzanne Smalley is a reporter covering digital privacy, surveillance technologies and cybersecurity policy for The Record. She was previously a cybersecurity reporter at CyberScoop. Earlier in her career Suzanne covered the Boston Police Department for the Boston Globe and two presidential campaign cycles for Newsweek. She lives in Washington with her husband and three children.