Brits say social media must do more to block harmful content
UK citizens want social media companies to do more to prevent harmful content appearing on their platforms
More than two-thirds (68%) of UK adults say social media companies should do more to prevent racism, homophobia and misogyny appearing on their platforms, according to a UK government survey.
The survey of more than 1,000 adults also revealed that 38% had seen such content in the past month. The majority (84%) of adults questioned said they were concerned about it.
A government bill to regulate social media companies and to protect people from harmful content is currently going through Parliament.
Introduced in March and now at its report stage, the Online Safety Bill sets out in law how online platforms must behave to better protect their users. It will introduce criminal sanctions for tech company executives and senior managers, alongside further criminal offences.
Digital secretary Nadine Dorries said the survey showed that people support tighter control of social media.
“It is clear people across the UK are worried about this issue, and as our landmark Online Safety Bill reaches the next crucial stage in Parliament, we’re a big step closer to holding tech giants to account and making the internet safer for everyone in our country.”
The survey found that 78% of respondents want social media companies to be clear about what sort of content is and isn’t allowed on their platform.
Almost half (45%) said they would stop using or reduce their use of social media if they saw no action from social media giants such as Facebook, Twitter and TikTok.
The Department for Digital, Culture, Media and Sport (DCMS) said that the safety of women and girls across the country is a top priority: “The measures we’re introducing through the Online Safety Bill will mean tech companies have to tackle illegal content and activity on their services, women will have more control over who can communicate with them and what kind of content they see on major platforms, and they will be better able to report abuse.”
The government said the new laws will protect children, tackle illegal content and safeguard free speech, and will force social media platforms to uphold their stated terms and conditions.
If they fail to do so, the regulator Ofcom will work with platforms to ensure they comply, and will have the power to fine companies up to 10% of their annual global turnover (potentially billions of pounds) or even block non-compliant sites.
Read more about online safety in the UK
- Firms working on the UK government’s Safety Tech Challenge have suggested that scanning content before encryption will help prevent the spread of child sexual abuse material – but privacy concerns remain.
- Many of the regulatory bodies overseeing algorithmic systems and the use of data in the UK economy will need to build up their digital skills, capacity and expertise as the influence of artificial intelligence and data increases, MPs have been told.
- Fact-checking experts tell House of Lords inquiry that upcoming Online Safety Bill should force internet companies to provide real-time information on suspected disinformation, and warn against over-reliance on AI-powered algorithms to moderate content.