Tech firms cite risk to end-to-end encryption as Online Safety Bill gets royal assent
Tech firms continue to be concerned that the Online Safety Bill could undermine end-to-end encryption despite government reassurances
The government’s controversial Online Safety Bill has become law amid continued concerns from tech companies that it could damage the privacy of encrypted communications.
The Online Safety Act, which aims to make the internet safer for children, received royal assent in Parliament on 26 October 2023.
The act places legal duties on technology companies to prevent and rapidly remove illegal content, such as terrorist material and revenge pornography.
It also requires technology companies to protect children from seeing legal but harmful material, including pornography, bullying, and content promoting self-harm and eating disorders.
The communications regulator, Ofcom, will have new powers to fine technology companies that fail to comply with the act up to £18m or 10% of their turnover, whichever is greater. For the biggest tech companies, that formula could mean fines running into billions of pounds.
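As a rough illustration of how that greater-of formula scales, the following minimal Python sketch computes the maximum possible fine for a range of hypothetical turnover figures (the figures are illustrative assumptions, not drawn from the act or any real company):

```python
# Illustrative sketch: the act's maximum penalty is the greater of
# £18m or 10% of turnover. Turnover figures below are hypothetical.

def max_fine(turnover_gbp: float) -> float:
    """Return the maximum fine: the greater of £18m or 10% of turnover."""
    return max(18_000_000, 0.10 * turnover_gbp)

for turnover in (50_000_000, 2_000_000_000, 100_000_000_000):
    print(f"Turnover £{turnover:,.0f} -> maximum fine £{max_fine(turnover):,.0f}")
```

For a small service, the £18m floor applies; for a company with £100bn in turnover, the cap reaches £10bn, which is why the largest platforms face potential ten-figure penalties.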
The government has estimated that 100,000 online services will come under the Online Safety Act, with the most stringent obligations reserved for “Category 1” services that have the highest reach and pose the highest risk.
Technology secretary Michelle Donelan said the Online Safety Act would ensure online safety for decades to come. “The bill protects free speech, empowers adults and will ensure that platforms remove illegal content,” she said.
End-to-end encryption
But the Online Safety Act, which has taken four years to reach the statute books, continues to raise concerns for technology companies over provisions that could undermine encrypted communications.
Encrypted messaging and email services, including WhatsApp, Signal and Element, have threatened to pull out of the UK if Ofcom requires them to install “accredited technology” to monitor encrypted communications for illegal content.
Section 122 of the act gives Ofcom powers to require technology companies to install systems that scan the content of every message and email for child sexual abuse material (CSAM), a step the companies argue would undermine the security and privacy of encrypted services.
‘Catastrophic impact’ on privacy
Matthew Hodgson, CEO of Element, a secure communications company that supplies services to the Ministry of Defence, the US Navy, Ukraine and Nato, said its customers were demanding guarantees that the company would not implement message scanning if required to do so under the Online Safety Act.
“Some of our larger customers are contractually requiring us to commit to not putting any scanning technology into our apps because it would undermine their privacy, and we are talking about big reputable technology companies here. We are also seeing international companies doubting whether they can trust us as a UK-based tech supplier anymore,” he said.
Speaking on BBC Radio 4, Hodgson said the intentions of the bill were obviously good and that social media companies such as Instagram and Pinterest should be filtering posts for child abuse material.
However, giving Ofcom the power to require blanket surveillance in private messaging apps would “catastrophically reduce safety and privacy for everyone”, he said.
Hodgson said enforcing Section 122 of the Online Safety Act against technology companies would introduce new vulnerabilities and weaknesses into encrypted communications systems that attackers would exploit.
“It is like asking every restaurant owner in the country to bug their restaurant tables – in case criminals eat at the restaurants – and then holding the restaurant owners responsible and liable for monitoring those bugs,” he said.
The CEO of encrypted mail service Proton, Andy Yen, said that without safeguards to protect end-to-end encryption, the Online Safety Act posed a real threat to privacy.
“The bill gives the government the power to access, collect and read anyone’s private conversations any time they want. No one would tolerate this in the physical world, so why do we in the digital world?” he said.
Writing in a blog post published today (27 October 2023), Yen said while he was reasonably confident that Ofcom would not use its powers to require Proton to monitor the contents of its customers’ emails, he was concerned that the act had been passed with a clause that gives the British government powers to access, collect and read anyone’s private communications.
“The Online Safety Act empowers Ofcom to order encrypted services to use ‘accredited technology’ to look for and take down illegal content. Unfortunately, no such technology currently exists that also protects people’s privacy through encryption. Companies would therefore have to break their own encryption, destroying the security of their own services,” he wrote.
“The criminals would seek out alternative methods to share illegal materials, while the vast majority of law-abiding citizens would suffer the consequences of an internet without privacy and personal data vulnerable to hackers,” he added.
Meredith Whittaker, president of encrypted messaging service Signal, posted on X, formerly known as Twitter, restating the organisation’s position that it would withdraw from the UK if it was forced to compromise its encryption.
“Signal will never undermine our privacy promises and the encryption they rely on. Our position remains firm: we will continue to do whatever we can to ensure people in the UK can use Signal. But if the choice came down to being forced to build a backdoor, or leaving, we’d leave,” she wrote.
Zero-tolerance approach
The Online Safety Act takes what the government describes as a “zero-tolerance approach” to protecting children.
It requires tech companies to introduce age-checking measures on platforms that publish content harmful to children, and to publish risk assessments of the dangers their sites pose to children.
Tech companies will also be required to provide children and parents with clear ways to report problems, and to offer users options to filter out content they do not want to see.
Ofcom plans phased introduction
The communications regulator plans to implement the act in phases, starting with a consultation on tackling illegal content, which opens on 9 November 2023.
Phase two will address child safety, pornography, and the protection of women and girls, with Ofcom due to publish draft guidance on age verification in December 2023. Draft guidelines on protecting children will follow in spring 2024, with draft guidelines on protecting women and girls following in spring 2025.
Phase three will focus on categorised online services, which will face additional obligations, including producing transparency reports, providing tools for users to control the content they see, and preventing fraudulent advertising. Ofcom aims to produce draft guidance in early 2024.
Ofcom’s chief executive, Melanie Dawes, said it would not act as a censor, but would tackle the root causes of online harm. “We will set new standards online, making sure sites and apps are safer by design,” she added.
Advice to tech companies
Lawyer Hayley Brady, partner at UK law firm Herbert Smith Freehills, said technology companies should engage with Ofcom to shape the codes of practice and guidance.
“Companies will have the choice to follow Ofcom’s Codes of Practice or decide upon their own ways of dealing with content. Unless a company has rigorous controls in place, the safe option will be to adhere to Ofcom’s advice,” she said.
Ria Moody, managing associate at law firm Linklaters, said the Online Safety Act tackles the same underlying issues as the European Union’s Digital Services Act (DSA), but in a very different way.
“Many online services are now thinking about how to adapt their DSA compliance processes to meet the requirements of the OSA,” she said.
John Brunning, a partner at law firm Fieldfisher, said the broad scope of the act meant many more businesses would be caught by its provisions than people expected.
“Expect plenty of questions when it comes to trying to implement solutions in practice,” he said.
These include how to judge how likely a service is to be accessed by children, whether companies will need to geo-block users to prevent access to sites not targeted at the UK, and where technology companies should draw the line on harmful content.
Franke Everitt, director at Fieldfisher, said online platforms and businesses would not need to take steps to comply immediately. “This is just the beginning of a long process. Government and regulators will need to fill in the detail of what is just a roughly sketched outline of legislation,” she said.
Read more about the debate over end-to-end encryption
- Parliament passes sweeping Online Safety Bill but tech companies still concerned over encryption.
- Technology companies say reassurances by government ministers that they have no intention of weakening end-to-end encrypted communication services do not go far enough.
- BCS, The Chartered Institute for IT, argues the government is seeking a technical fix to terrorism and child abuse without understanding the risks and implications.
- Government boosts protection for encryption in Online Safety Bill but civil society groups remain concerned.
- CEO of encrypted messaging service Element says Online Safety Bill could pose a risk to the encrypted comms systems used by Ukraine.
- Tech companies and NGOs urge rewrite of Online Safety Bill to protect encrypted comms.
- Protecting children by scanning encrypted messages is ‘magical thinking’, says Cambridge professor.
- Proposals for scanning encrypted messages should be cut from Online Safety Bill, say researchers.
- GCHQ experts back scanning of encrypted phone messages to fight child abuse.
- Tech companies face pressure over end-to-end encryption in Online Safety Bill.
- EU plans to police child abuse raise fresh fears over encryption and privacy rights.
- IT professionals wary of government campaign to limit end-to-end encryption.
- John Carr, a child safety campaigner backing a government-funded campaign on the dangers of end-to-end encryption to children, says tech companies have no choice but to act.
- Information commissioner criticises government-backed campaign to delay end-to-end encryption.
- Government puts Facebook under pressure to stop end-to-end encryption over child abuse risk.
- Former UK cyber security chief says UK government must explain how it can access encrypted communications without damaging cyber security and weakening privacy.
- Barnardo’s and other charities begin a government-backed PR campaign to warn of dangers end-to-end encryption poses to child safety. The campaign has been criticised as ‘one-sided’.
- Apple’s plan to automatically scan photos to detect child abuse would unduly risk the privacy and security of law-abiding citizens and could open up the way to surveillance, say cryptographic experts.
- Firms working on UK government’s Safety Tech Challenge suggest scanning content before encryption will help prevent the spread of child sexual abuse material – but privacy concerns remain.
- Private messaging is the front line of abuse, yet E2EE in its current form risks engineering away the ability of firms to detect and disrupt it where it is most prevalent, claims NSPCC.
- Proposals by European Commission to search for illegal material could mean the end of private messaging and emails, says MEP.