‘Legal but harmful’ clause dropped from Online Safety Bill

Online Safety Bill’s ‘legal but harmful’ provision will be dropped by the UK government in favour of public risk assessments, tools to help users control the content they consume, and new criminal offences around self-harm

The UK government has proposed replacing the Online Safety Bill’s controversial “legal but harmful” provision with new duties to increase the accountability of the tech giants.

Under the Bill’s previous duty of care, tech platforms that host user-generated content or allow people to communicate would have been legally obliged to proactively identify, remove and limit the spread of both illegal and legal but harmful content – such as child sexual abuse, terrorism and suicide material – or risk being fined up to 10% of their turnover by the online harms regulator, confirmed to be Ofcom.

The “legal but harmful” aspect of the Bill has, however, attracted significant criticism – from Parliamentary committees, campaign groups and tech professionals – over the potential threat it presents to freedom of speech, and the lack of consensus over what constitutes harm online.  

While firms will still need to protect children and remove content that is illegal or prohibited in their terms of service, the Bill will no longer seek to define specific types of legal content that companies must address.

As a result, companies will no longer be able to remove or restrict legal content, or suspend users for posting or sharing it, unless that content breaches their terms of service.

Instead, organisations within Category 1 – the services with the highest-risk functionalities and the widest user-to-user reach – will only be legally required to remove illegal content, take down material that breaches their own terms of service, and provide adults with tools that allow them to exercise greater choice over the content they see and engage with.

The government has also confirmed that a new criminal offence of assisting or encouraging self-harm online will be included in the Bill, as well as further amendments intended to strengthen protection for children.

These include an amendment forcing social media platforms to publish the risk assessments they have conducted on the dangers their services pose to children (the previous version of the Bill required firms to carry out these assessments, but not to publish them proactively), and another requiring platforms to set out explicitly how they will enforce age-appropriate protections for children.

“Unregulated social media has damaged our children for too long and it must end,” said digital secretary Michelle Donelan.

“I will bring a strengthened Online Safety Bill back to Parliament which will allow parents to see and act on the dangers sites pose to young people. It is also freed from any threat that tech firms or future governments could use the laws as a licence to censor legitimate views.

“Young people will be safeguarded, criminality stamped out and adults given control over what they see and engage with online. We now have a binary choice: to get these measures into law and improve things, or squabble in the status quo and leave more young lives at risk.”

Other new measures intended to boost transparency and accountability include giving Ofcom the power to require companies to publish details of enforcement action it has taken against them, and the addition of the offence of controlling or coercive behaviour to the list of “priority offences”, a move intended to strengthen protections for women and girls.

Offences already contained in the priority list include terrorism, child sexual abuse, revenge porn, hate crime, fraud, the sale of illegal drugs or weapons, the promotion or facilitation of suicide, people smuggling, and sexual exploitation.

The Victims’ Commissioner, the Domestic Abuse Commissioner and the Children’s Commissioner will also be added as statutory consultees in the Bill, which means Ofcom must consult each of them when drafting compliance codes for tech firms.

The Bill is due to return to the House of Commons on 5 December 2022, after its passage was paused by the government in July 2022 following legislative timetabling issues.

The House of Lords Communications and Digital Committee is set to hold a special evidence session for the Bill on 6 December.

“Getting online safety right is essential and complex,” said committee chair Baroness Stowell ahead of the session. “Today’s announcement that the provisions on legal but harmful content will be removed from the Bill marks a major shift.

“Our evidence session will examine whether the government’s new approach to the Online Safety Bill delivers what is needed and strikes the right balance between protecting children online and freedom of speech.”

Commenting on the changes to the Bill, Lucy Powell, Labour’s shadow digital secretary, said: “The government has bowed to vested interests, rather than keeping users and consumers safe.

“These changes are a major weakening of this Bill, undermining its core purpose. The government is giving abusers a free pass and taking the public for a ride.”

Speaking at the UK Internet Governance Forum on 1 November 2022, Powell said that reining in the harmful business models of tech firms is a matter of urgency, and the legislation’s focus should be on controlling the algorithmic distribution of harmful content, rather than policing individual posts on social media.

“The challenge is the engagement algorithms and how that works,” she said. “I am less interested in the person who’s put a post on social media about suicide – if no one really sees it, then it’s not causing harm. I’m more interested in the fact that people are encouraged to see it.”

Despite the changes to the Bill, tech companies could still be required to use software to bulk-scan messages on end-to-end encrypted services such as WhatsApp before they are encrypted – an approach the government justifies as a way to tackle child sexual abuse material and violent crime.

In July 2021, a number of MPs – including Conservative David Davis, a vocal critic of the Bill’s measures – proposed amendments that would remove the ability to monitor encrypted communications.