Lords question ‘extensive’ government online safety powers

Digital minister Paul Scully defends the government's Online Safety Bill plans to give the secretary of state powers to direct Ofcom

Safeguards will be written into the Online Safety Bill to ensure the secretary of state does not unduly interfere with the work or independence of online harms regulator Ofcom, digital minister Paul Scully has told a Lords committee.

Under the draft Online Safety Bill, which was first published in May 2021 but has since undergone numerous changes, the secretary of state for the Department for Digital, Culture, Media and Sport (DCMS) has the power to direct Ofcom’s regulatory priorities and to modify its codes of practice for tech firms for reasons of public safety, national security and “public policy”.

The proposed powers have attracted criticism from both civil society and lawmakers, who have argued they would undermine Ofcom’s independence as a regulator; allow the secretary of state to avoid parliamentary scrutiny; and generally leave the process vulnerable to the whims of whatever government is in power.

Responding to such criticisms in July 2022, then-digital secretary Nadine Dorries said in a written statement that the government recognised concerns about the degree of executive control the powers would allow, and had therefore built in a “number of safeguards” to ensure Ofcom’s independence.

“We will make two substantive changes to this power: first, we will make it clear that this power would only be used ‘in exceptional circumstances’; and secondly, we will replace the ‘public policy’ wording with a more clearly defined list of reasons for which the Secretary of State could issue a direction,” she said.

“This list will comprise national security, public safety, public health, the UK’s international relations and obligations, economic policy and burden to business.”

Addressing the Lords Communications and Digital Committee on 25 January 2023 – where the consensus was that such powers are “unnecessarily extensive” – Scully said they were necessary because there are certain instances (such as during an unfolding national security incident) where the government has access to information the regulator simply does not.

Orla McRae, deputy director of online harms regulation at DCMS, said the proposed powers would provide flexibility, because the fast-moving nature of the technology sector makes it difficult to predict every future harm or challenge, and reflect a recognition that there are situations where the government has more information and is better placed to deal with them.

“It’s not about interfering with Ofcom’s independence but simply providing checks and balances to ensure the implementation of the regulation is as Parliament intended,” she said.

Asked whether the government considered adopting a more conventional approach to its relationship with Ofcom (for example, by limiting the powers to clear emergency situations or engaging in open and transparent dialogue with the regulator over specific concerns), Scully said there are certain situations “like national security, where that may not be an appropriate approach”, adding that Parliament would still be able to have its say.

“We want to make sure that … the reason for the modification has to be laid before Parliament to make sure it is as transparent as we can possibly make it, and that we can only use the power at the point that the code is submitted to be laid in Parliament, rather than any time in the process,” he said, reiterating that the powers would only be used under “exceptional circumstances”.

However, multiple Lords questioned the necessity of the government’s proposed powers, and challenged its definition of what constitutes an exceptional circumstance, particularly the secretary of state’s ability to direct Ofcom for “economic policy and burden to business” reasons.

Lord Lipsey, for example, expressed concerns that, in practice, a secretary of state could be influenced behind closed doors by their associates in the private sector to change Ofcom’s codes in a way that benefits their business.

“Parliament, which is supposed to govern this nation, doesn’t get a look-in – it’s entirely private business between Ofcom and the secretary of state,” he said, adding that the proposal “cannot possibly survive” because many in Parliament will simply vote against it. “We won’t accept this,” said Lipsey. “It isn’t a runner … you won’t get this through as it is.”

Open consultation process

In response, McRae said it was important to note that any direction to Ofcom to modify its codes of practice would come at the end of an extensive open consultation process, “including with a specified list of external people”, meaning everyone would be aware of what was in the draft codes before the secretary of state could get involved.

“Once that direction is issued, there is a process of transparency to ensure that it is clear the reasons for that direction and clear what Ofcom has done in response to it,” she said. “And there is the provision that prohibits secretaries of state from requiring specific measures be inserted into the code.”

Lord Lipsey further added that because the secretary of state gets to decide what constitutes an exceptional circumstance, important decisions will be made before they ever see the light of day in Parliament. “If Parliament … ever wants to question whether it’s exceptional, yes, we have a say then, but by then it’s water under the bridge,” he said.

Both McRae and Scully said the government would listen to concerns about the powers before tabling amendments to formally define “exceptional circumstances”, but that the intention is still to press forward with changes in line with what Dorries set out in her written statement of July 2022.

On 17 January 2023, digital secretary Michelle Donelan announced new amendments to the Bill, one introducing criminal liability for tech bosses over failures to protect children online, and another linking existing immigration offences to the Bill’s list of “priority offences” – those which represent the most serious and prevalent illegal content or activity online, and which tech firms will be obliged to proactively prevent people from being exposed to.

The latter amendment means technology companies could be forced to remove videos of people crossing the English Channel “which show that activity in a positive light”.

Donelan confirmed in a statement that “the result of this amendment would therefore be that platforms would have to proactively remove that content” related to English Channel crossings.