UK passes the Online Safety Bill — and no, it doesn’t ban end-to-end encryption
The British government’s controversial Online Safety Bill finally completed its passage through parliament on Tuesday, marking the end of one of the most (if not the most) fraught episodes in the United Kingdom’s recent legislative history.
The origin and intention of the new law — which began, three prime ministers and seven secretaries of state ago, as a strategy paper on internet safety — are easily lost amid the more vituperative arguments it has provoked in recent months.
But the idea was that certain behaviors online were harmful (child sexual abuse, terrorist propaganda, revenge pornography, fraud, etc.) and that while methods existed to tackle those behaviors in the analog world, the government needed to legislate to get technology companies to help address them online.
In recent months, companies providing end-to-end encrypted messaging platforms have loudly opposed a provision of the bill that they claimed would nullify the protections they give users and expose their messages to interception; some even threatened to pull their services out of the country rather than compromise the feature.
There were reports the provision was being scaled back, and then there were criticisms of those reports. But now that the House of Lords has finally finished its business on the bill — and the accompanying cacophony of lobbyists has departed to the pub — here is what the Online Safety Act (as it will be known once it receives Royal Assent) actually says.
Does it ban end-to-end encryption?
No. The law as passed contains a provision that could require messaging platforms to use “accredited technology” to identify specific kinds of content, particularly terrorism content and child sexual abuse material (CSAM), if they receive a notice to do so from the communications regulator, Ofcom.
But in order to issue such a notice, Ofcom has to consider it “necessary and proportionate,” and the technology itself must be accredited. No accredited technology currently exists, and Ofcom has yet to set out how it would go about accrediting one.
These caveats set a high bar the regulatory regime must clear before it could justify a legal demand to implement some kind of scanning technology, such as client-side scanning (CSS), which checks content on a user's device before it is sent.
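The act does not define what such a technology would look like, but the approach most discussed in the debate is on-device matching against a database of known illegal material, performed before a message is encrypted. The following Python sketch is purely illustrative: the function name, the hash list, and the use of exact SHA-256 digests are assumptions made here for brevity, and real proposals favor perceptual hashes (such as PhotoDNA) that still match resized or re-encoded copies.

```python
import hashlib

# Hypothetical on-device list of digests of known illegal images.
# Real systems would ship perceptual hashes rather than exact
# SHA-256 digests, so re-encoded copies would still match.
KNOWN_BAD_DIGESTS = {
    "e3b0c44298fc1c149afbf4c8996fb92427ae41e4649b934ca495991b7852b855",
}

def scan_before_send(attachment: bytes) -> bool:
    """Return True if the attachment may be encrypted and sent.

    This runs on the sender's device, before end-to-end encryption
    is applied, which is precisely why platforms argue such scanning
    undermines the guarantees encryption provides.
    """
    digest = hashlib.sha256(attachment).hexdigest()
    return digest not in KNOWN_BAD_DIGESTS

# At send time the messaging client would gate on the result:
if scan_before_send(b"holiday photo bytes"):
    pass  # proceed to encrypt and send
else:
    pass  # block the send, and possibly report, depending on policy
```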
Back in 2022, researchers from GCHQ, Britain’s cyber and signals intelligence agency, attempted to defend the deployment of CSS-style content checks in a research paper proposing several ways to ensure the security of such a system, although this did not reflect government policy.
Officials from the British government have neither formally endorsed the GCHQ paper nor suggested that its protections would be adopted for any such system. Even if officials were to do so, it is not clear that the messaging companies would agree with its findings.
A similar power, the Technical Capability Notice, already exists under the U.K.'s Investigatory Powers Act, with similar if not identical caveats limiting how such a notice can be served on technology companies.
As Sky News reported back in 2020, despite numerous and extensive outreach efforts by the British government to the U.S. Congress, British attempts to win Washington's buy-in for these proposals face major challenges; that buy-in would be needed because the majority of these companies are based in the U.S.
What else does the Online Safety Bill do?
Despite the focus on the provisions regarding end-to-end encryption — hopefully addressed in full above — the 225-page law covers an enormous range of online activities.
It introduces a number of safety and security requirements for platforms, including a legal duty to verify users' ages in order to prevent children from accessing inappropriate content such as pornography.
Platforms are also required to protect content that is of democratic importance or is journalistic, as defined in the legislation, and companies must tackle fraudulent advertising and report child sexual abuse material to the National Crime Agency.
Businesses that fail to comply could be fined up to £18 million ($22.3 million) or 10% of their global annual turnover, whichever is greater.
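In concrete terms, the cap scales with company size. A minimal sketch of the calculation, assuming the "whichever is greater" reading above:

```python
# Illustrative only: the cap is the greater of a fixed sum and a
# share of global annual turnover (figures as cited in the article).
def max_fine_gbp(global_turnover_gbp: float) -> float:
    return max(18_000_000, 0.10 * global_turnover_gbp)

print(max_fine_gbp(1_000_000_000))  # 100000000.0, i.e. £100 million for a £1bn-turnover firm
```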
Alexander Martin
is the UK Editor for Recorded Future News. He was previously a technology reporter for Sky News and is also a fellow at the European Cyber Conflict Research Initiative.