Online Safety Bill returns to Parliament
MPs and online safety experts have expressed concern about encryption-breaking measures contained in the Online Safety Bill as it returns to Parliament for the first time since its passage was paused in July
The Online Safety Bill has returned to Parliament with a number of amendments, but MPs and online safety experts are still concerned about the impact of encryption-breaking measures on people’s privacy.
Nearly six months after the government delayed its passage over legislative timetabling issues, the Bill returned to the House of Commons on 5 December with a number of changes for MPs to debate.
These include: new criminal offences for assisting or encouraging self-harm online, as well as controlling or coercive behaviour towards women; amendments forcing social media platforms to publish risk assessments on the dangers their services pose to children; further powers for online harms regulator Ofcom to compel greater transparency from companies; and the removal of the controversial “legal but harmful” provision.
The “legal but harmful” aspect of the Bill has attracted significant criticism – from parliamentary committees, campaign groups and tech professionals – over the potential threat it presents to freedom of speech, and the lack of consensus over what constitutes harm online.
Despite the changes to the Bill, however, tech companies could still be required to use software to bulk-scan messages on encrypted services such as WhatsApp before they are encrypted – a measure the government justifies as a way of dealing with child sexual abuse material and violent crime.
Speaking in the Commons on 5 December, Conservative MP and long-time critic of the Bill’s measures, David Davis, said: “It will create a pressure to undermine the end-to-end encryption that is not only desirable but crucial to our telecommunications.”
Davis added that although the language used “sounds innocuous and legalistic”, clause 104 creates that pressure by requiring real-time decryption. “The only way to do that is by either having it unencrypted on the server, having it weakly encrypted or creating a backdoor,” he said.
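Davis’s point rests on how end-to-end encryption works: the service provider relays only ciphertext and never holds the keys. The minimal sketch below (written in Python with the PyNaCl library; the names and message are illustrative, and real messengers use the more elaborate Signal protocol) shows why server-side scanning of such traffic is impossible without changing the design.

```python
# Minimal end-to-end encryption sketch using PyNaCl (pip install pynacl).
# Illustrative only: real messengers such as WhatsApp use the Signal
# protocol, but the property shown here is the same.
from nacl.public import PrivateKey, Box

alice_key = PrivateKey.generate()  # never leaves Alice's device
bob_key = PrivateKey.generate()    # never leaves Bob's device

# Alice encrypts to Bob with her private key and his public key
sending_box = Box(alice_key, bob_key.public_key)
ciphertext = sending_box.encrypt(b"meet at 6pm")

# The server relays `ciphertext` but holds neither private key, so it
# cannot read the message - scanning it in transit would require a weaker
# design, exactly the pressure Davis describes.
receiving_box = Box(bob_key, alice_key.public_key)
assert receiving_box.decrypt(ciphertext) == b"meet at 6pm"
```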
Similar sentiments were expressed by other MPs, including Conservative Adam Afriyie, who said: “We have to be careful about getting rid of all the benefits of secure end-to-end encryption for democracy, safety and protection from domestic abuse – all the good things that we want in society – on the basis of a tiny minority of very bad people who need to be caught.”
Davis and three other MPs tabled an amendment to the Bill in July 2022, asking for the language to be adjusted in a way that “removes the ability to monitor encrypted communications”.
Bill ‘would not be lawful under UK common law’
In an independent legal opinion published on 29 November, Matthew Ryder KC and barrister Aidan Wills, both of Matrix Chambers, found that the powers conceived of in the Bill would not be lawful under UK common law and the existing human rights legal framework.
They wrote: “The Bill, as currently drafted, gives Ofcom the powers to impose Section 104 notices on the operators of private messaging apps and other online services. These notices give Ofcom the power to impose specific technologies (eg algorithmic content detection) that provide for the surveillance of the private correspondence of UK citizens. The powers allow the technology to be imposed with limited legal safeguards.
“It means the UK would be one of the first democracies to place a de facto ban on end-to-end encryption for private messaging apps. No communications in the UK – whether between MPs, between whistleblowers and journalists, or between a victim and a victims support charity – would be secure or private.”
Responding to the concerns of Davis and others, digital minister Paul Scully said: “We are not talking about banning end-to-end encryption or about breaking encryption.” He added that Davis’s amendment “would leave Ofcom powerless to protect thousands of children and could leave unregulated spaces online for offenders to act, and we cannot therefore accept that”.
Former home secretary Priti Patel, who in July 2022 tabled the amendments to the Bill that Davis was referring to, said: “While there is great justification for encryption… the right measures and powers [need to be] in place so that we act to prevent child sexual abuse and exploitation, prevent terrorist content from being shielded behind the platforms of encryption.”
During the same session, Labour MP Sarah Champion raised the use of virtual private networks (VPNs) – tools that encrypt a user’s internet connection and route it via servers elsewhere in the world, masking the user’s location and identity from the websites they visit – arguing that they could help people bypass the Bill’s measures, such as age verification.
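As a rough illustration of the masking Champion was describing, a connection routed through a remote server presents that server’s address to the destination site, not the user’s. A minimal Python sketch (the proxy endpoint is a hypothetical placeholder, and a real VPN works at the network layer rather than per-application, though the observable effect is similar):

```python
# Compare the IP address a website sees with and without a proxied route.
# Requires: pip install "requests[socks]". The proxy endpoint below is a
# hypothetical placeholder.
import requests

IP_ECHO = "https://api.ipify.org"  # returns the caller's public IP as plain text

direct_ip = requests.get(IP_ECHO, timeout=10).text

proxies = {"https": "socks5://vpn.example.net:1080"}  # hypothetical endpoint
routed_ip = requests.get(IP_ECHO, proxies=proxies, timeout=10).text

# The destination now sees the exit server's address, not the user's, which
# is why site-level controls such as age verification can be sidestepped.
print(direct_ip, routed_ip)
```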
“If companies use age assurance tools, as listed in the safety duties of this Bill, there is no guarantee that they will provide the protections that are needed,” she said. “I am also concerned that the use of VPNs could act as a barrier to removing indecent or illegal material from the internet.
“It also concerns me that a VPN could be used in court to circumnavigate this legislation, which is very much based in the UK. If VPNs cause significant issues, the government must identify those issues and find solutions, rather than avoiding difficult problems.”
Computer Weekly contacted the Labour leadership about whether it would support measures to limit the use of VPNs.
A Labour spokesperson said: “VPNs were a small part of the discussion at Report Stage, and the issue is not likely to be revisited during the Bill’s passage. Sarah Champion was not proposing to review VPNs in their entirety. She was raising a specific issue with the government about whether VPNs could be used to access, even by accident, child sexual abuse imagery which would otherwise be automatically blocked.
“Labour agreed that if there is a risk of this happening, Ofcom should look into it. However, there was no vote on her amendment and its purpose was to make the government aware of a potential loophole.”
The spokesperson added that Labour is opposed to the removal of the “legal but harmful” clause, which, it argues, goes “against the very essence” of the Bill.
“The Online Safety Bill was created to address the particular power of social media – to share, spread and broadcast around the world very quickly,” said the spokesperson. “Disinformation, abuse, incel gangs, body-shaming, Covid and Holocaust denial, scammers, the list goes on – are all actively encouraged by unregulated engagement algorithms and business models which reward sensational, extreme, controversial and abusive behaviour.”
Following the reintroduction of the Bill to Parliament, the House of Lords Communications and Digital Committee held a special evidence session about its measures on 6 December.
The attending experts raised concerns about various aspects of the Bill, including the risks of allowing private companies to determine or infer what is illegal, the removal of transparency obligations around risk assessments for adults’ safety online, and the lack of minimum requirements for platforms’ terms of service. Edina Harbinja, a senior lecturer in media and privacy law at Aston Law School, emphasised the threat to encryption in particular.
Noting that about 40 million people in the UK use the encrypted messaging service WhatsApp, Harbinja said that compromising those communications by, for example, mandating client-side scanning of pre-encrypted content “is not a proportionate step”.
She added that, as currently drafted, the Bill poses an “unacceptable threat to encryption and the security of the internet, and the networks that we all rely on in our day-to-day activities, our communication, our banking, etc”.
Speaking at TechUK’s digital ethics summit on 7 December during a session on the Bill, Arnav Joshi, a senior associate at Clifford Chance’s tech group, said that although there is a balance to be struck between privacy and, for example, preventing terrorism, “adding things like exceptions and backdoors” would essentially break encryption for internet users. “I’m not sure that baking something like that into law is the right approach,” he added.
Alternatives ‘haven’t been fully explored’
Joshi said alternative solutions for how organisations can figure out who is viewing and sharing certain content “haven’t been fully explored”, and that any backdoors on encryption would make it “unlikely” that a reasonable balance would be struck between competing rights.
But despite ongoing concerns about the future of encryption, the government has already begun putting resources into technologies designed to work around it.
In November 2021, for example, it announced the five winners of its Safety Tech Challenge Fund, who each received £85,000 to help them advance their technical proposals for new digital tools and applications to stop the spread of child sexual abuse material (CSAM) in encrypted environments.
Speaking with Computer Weekly at the time, then digital minister Chris Philp said the government would not mandate any scanning that goes beyond the scope of uncovering child abuse material, and further claimed the systems developed would only be capable of scanning for that particular kind of content.
“These technologies are CSAM-specific,” he said. “I met with the companies two days ago and with all of these technologies, it’s about scanning images and identifying them as either being previously identified CSAM images or first-generation created new ones – that is the only capability inherent in these technologies.”
Asked whether there was any capability to scan for any other types of image or content in messages, Philp said: “They’re not designed to do that. They’d need to be repurposed for that, as that’s not how they’ve been designed or set up. They’re specific CSAM scanning technologies.”
This sentiment was echoed by Scully in the Commons on 5 December. “The Bill is very specific with regard to encryption – this provision will cover solely CSAM and terrorism. It is important that we do not encroach on privacy.”
Three of the companies working on the project told Computer Weekly in January 2022 that pre-encryption scans for such content – also known as client-side scanning – can be carried out without compromising privacy.
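In outline, client-side scanning works by checking content on the device before it enters the encrypted channel. The sketch below (Python; the blocklist and helper names are hypothetical, and a plain SHA-256 digest stands in for the perceptual hashes deployed systems use) shows the order of operations the companies describe: scan first, encrypt after.

```python
# Client-side scanning sketch: content is checked against a blocklist of
# known digests *before* encryption, so the encrypted channel itself is
# untouched. SHA-256 stands in for the perceptual hashes real systems use;
# the blocklist here is an empty, hypothetical placeholder.
import hashlib
from nacl.public import PrivateKey, Box

KNOWN_BAD_DIGESTS: set[str] = set()  # hypothetical database of known material

def scan_then_encrypt(attachment: bytes, box: Box) -> bytes:
    digest = hashlib.sha256(attachment).hexdigest()
    if digest in KNOWN_BAD_DIGESTS:
        raise ValueError("match against known material: message blocked")
    return box.encrypt(attachment)  # only unmatched content is sent

sender, recipient = PrivateKey.generate(), PrivateKey.generate()
ciphertext = scan_then_encrypt(b"holiday photo bytes",
                               Box(sender, recipient.public_key))
```

The objection raised by Harbinja and others is precisely that the scanning step sees the plaintext of every message, whatever happens to it afterwards.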
Apple attempted to introduce client-side scanning technology – known as NeuralHash – to detect known child sexual abuse images on iPhones in 2021, but the plans were put on indefinite hold after an outcry from tech experts.
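NeuralHash is a perceptual hashing scheme: unlike a cryptographic hash, it is designed so that visually similar images produce similar fingerprints, allowing re-encoded copies of a known image to be matched. A toy illustration of the principle in Python with the Pillow library (this uses a simple “average hash” rather than Apple’s learned model, and the file paths are placeholders):

```python
# Toy perceptual "average hash" using Pillow (pip install Pillow).
# NeuralHash derives its fingerprint from a neural network, but the matching
# idea is the same: similar images land within a small Hamming distance.
from PIL import Image

def average_hash(path: str, size: int = 8) -> str:
    img = Image.open(path).convert("L").resize((size, size))  # tiny greyscale
    pixels = list(img.getdata())
    avg = sum(pixels) / len(pixels)
    return "".join("1" if p > avg else "0" for p in pixels)  # 64-bit string

def hamming(a: str, b: str) -> int:
    return sum(x != y for x, y in zip(a, b))

# A re-saved or resized copy of the same photo (placeholder paths) should
# differ by only a few bits, so re-encoding alone does not defeat the match.
# print(hamming(average_hash("photo.jpg"), average_hash("photo_resaved.jpg")))
```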
A report by 15 leading computer scientists, Bugs in our pockets: the risks of client-side scanning, published by Columbia University, identified multiple ways that states, malicious actors and abusers could turn the technology around to cause harm to others or society.