Police warn partnership with tech industry ‘at risk’ over end-to-end encryption
The partnership between law enforcement and the technology industry is “at risk” due to end-to-end encryption, European police chiefs warned in a joint declaration on Sunday.
Although the company was not named in the statement, it follows social media giant Meta’s announcement in December that it had begun rolling out the technology by default across “all personal chats and calls on Messenger and Facebook.”
End-to-end encryption is a way of designing communications so that encryption is applied and removed only by the end devices, making it impossible for the service or network provider, or any hackers accessing the network, to read the contents of messages.
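In practice, the keys that lock and unlock a message exist only on the participants’ devices, and the service in the middle relays ciphertext it cannot read. The sketch below is a minimal illustration of that principle rather than a description of Meta’s actual protocol; it uses the PyNaCl library, and the party names and message are hypothetical.

```python
# Minimal end-to-end encryption sketch using PyNaCl (libsodium bindings).
# Illustrative only: real messengers layer full protocols (e.g. the Signal
# protocol) on top of primitives like these.
from nacl.public import PrivateKey, Box

# Each device generates its own key pair; private keys never leave the device.
alice_key = PrivateKey.generate()
bob_key = PrivateKey.generate()

# Alice encrypts with her private key and Bob's public key.
sender_box = Box(alice_key, bob_key.public_key)
ciphertext = sender_box.encrypt(b"Meet at noon")

# The provider relays only `ciphertext` and, holding no private key,
# cannot recover the plaintext.

# Bob decrypts on his own device with his private key and Alice's public key.
receiver_box = Box(bob_key, alice_key.public_key)
assert receiver_box.decrypt(ciphertext) == b"Meet at noon"
```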
The move by Meta was welcomed by privacy campaigners, but law enforcement agencies had long warned that this loss of access would undermine their work tackling online harms, particularly child sexual exploitation.
At a meeting in London last week, the 32 most senior officers from a range of policing agencies across Europe agreed on a statement repeating their concerns.
The statement, published Sunday, repeats the argument that the way end-to-end encryption is being implemented will hamper police’s ability to investigate serious crimes “such as child sexual abuse, human trafficking, drug smuggling, murder, economic crime and terrorism offences.”
The technology also undermined “the ability of technology companies proactively to identify illegal and harmful activity on their platforms,” the European police chiefs stated.
Police are objecting to the way end-to-end encryption is being implemented rather than to end-to-end encryption in general, arguing that additional technologies such as client-side scanning, or adding an extra police “end” to chats, could help address their concerns.
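Client-side scanning, for example, would check content on the user’s device before encryption is applied. The sketch below is a simplified, hypothetical illustration of that idea using Python’s standard hashlib; deployed systems such as Microsoft’s PhotoDNA rely on perceptual hashes that tolerate re-encoding, not the exact hashes shown here.

```python
# Hypothetical client-side scanning sketch: content is hashed on the device
# and checked against a list of hashes of known abuse imagery *before*
# end-to-end encryption is applied, so the provider never reads the message.
import hashlib

# Placeholder hash database distributed to the device (illustrative values).
KNOWN_ABUSE_HASHES = {"0" * 64}  # placeholder entry, not a real hash

def flag_before_encryption(media: bytes) -> bool:
    """Return True if the attachment matches a known hash and should be reported."""
    return hashlib.sha256(media).hexdigest() in KNOWN_ABUSE_HASHES

# A client would run this just before encrypting an outgoing attachment.
if flag_before_encryption(b"example attachment bytes"):
    print("Match found: generate a report rather than sending silently.")
```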
“Our societies have not previously tolerated spaces that are beyond the reach of law enforcement, where criminals can communicate safely and child abuse can flourish. They should not now. We cannot let ourselves be blinded to crime. We know from the protections afforded by the darkweb how rapidly and extensively criminals exploit such anonymity,” the declaration stated.
Meta: We make a promise to users
In a detailed post on LinkedIn, the global policy director for Meta’s Messenger app, Gail Kent, explained the company’s approach to end-to-end encryption: “Whilst scanning and reporting from a device may not break the technical encryption protocol, it does break the promise that companies like Meta make to our users.
“This may seem unimportant, but it is vital. Most users - including the most vulnerable - do not understand the cryptography, but they do understand the promise, and expect us to keep it.”
Meta has submitted thousands of reports annually to the U.S. National Center for Missing & Exploited Children (NCMEC) when it detects child predators on its platforms attempting to contact children, alongside millions of reports when users upload media containing child sexual exploitation.
In her LinkedIn article, Kent argued that Meta was unique in submitting reports on this scale only because it, unlike other companies, scanned its users’ messages for child abuse content.
NCMEC warned that 70% of these reports, and as many as 85% to 92% according to Britain’s National Crime Agency (NCA), could be lost due to the implementation of end-to-end encryption, which they argue will blind Meta’s monitoring teams to content that reveals abusive behaviors.
The company in turn claims that it is able to use other signals such as the metadata of chats and messages to detect predators, similar to the tools it uses with WhatsApp. “It is incorrect to say that these reports will be made up of only metadata - we will still have significant amounts of content, from our public surfaces, as well as from user-reported messaging content,” wrote Kent.
But security officials in Britain have argued that these signals are insufficient, noting that the bar is much lower for Meta to ban users based on suspect signals than it is for law enforcement to prosecute offenders and safeguard children.
According to the British government, Meta’s reports to NCMEC led in 2017 to more than 2,500 arrests and almost 3,000 children being safeguarded in the United Kingdom alone.
“That is in only one country. That is in only one year. That is based on referrals from only one company. That is what we stand to lose,” said Chloe Squires, then the director for national security at the Home Office.
The NCA estimates there are between 680,000 and 830,000 adults in the United Kingdom “that pose some degree of sexual risk to children.”
Meta has introduced a number of safety features to secure children’s accounts, and says that alongside developing new technologies to protect children, it works with industry and nongovernmental organizations to keep these protections up to date.
Alexander Martin is the UK Editor for Recorded Future News. He was previously a technology reporter for Sky News and is also a fellow at the European Cyber Conflict Research Initiative.