
Why we need to reset the debate on end-to-end encryption to protect children

Private messaging is the front line of abuse, yet E2EE in its current form risks engineering away the ability of firms to detect and disrupt it where it is most prevalent

Last week, the National Society for the Prevention of Cruelty to Children (NSPCC) released a report in a bid to raise understanding of the impact of end-to-end encryption (E2EE) on children’s safety from online sexual abuse.

It aimed to reset a debate that has framed children’s safety against the privacy of users, with heated arguments doing little to shine a light on a solution that serves both of these important interests.

We will always unapologetically campaign for children to be recognised in this debate and for their safety and privacy rights to be considered when platforms roll out E2EE. Children make up one in five UK internet users, so it is legitimate that they have a voice in the decisions that affect them.

This is necessary because private messaging is the front line of abuse, yet E2EE in its current form risks engineering away the ability of firms to detect and disrupt it where it is most prevalent.

While E2EE comes with privacy benefits, there is one group of users whose privacy rights are put at risk – children who have suffered or are at risk of sexual abuse.

These children have the right to have images of their abuse removed by tech firms if they are shared on their platforms. They have the right not to be contacted by offenders who recognise their profiles from these pictures and videos. And they have the right to a safe online environment that minimises the chance of them being groomed to create these images in the first place.

Most major tech firms use tools, such as Microsoft’s PhotoDNA, to detect child sexual abuse images and grooming on their platforms. This allows child abuse images to be rapidly identified and removed when users upload them, including in private messaging.

PhotoDNA scans an image only to determine whether it matches known child abuse material, making it no more intrusive than a spam filter. Machine learning is also used, in a proportionate way, to identify new child abuse images and grooming.
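To make the comparison with spam filters concrete, the underlying pattern is signature matching: compute a signature for an uploaded file and check it against a list of signatures of known abuse images. PhotoDNA’s own perceptual hash is proprietary, so the minimal Python sketch below uses an ordinary cryptographic hash purely as a stand-in, and every name in it (KNOWN_ABUSE_HASHES, file_signature, is_known_abuse_image) is hypothetical rather than part of any real API.

    import hashlib
    from pathlib import Path

    # Hypothetical blocklist. In a real system these signatures would come
    # from bodies such as NCMEC or the IWF, not from a hard-coded set.
    KNOWN_ABUSE_HASHES: set[str] = set()

    def file_signature(path: Path) -> str:
        """Compute a signature for an uploaded file.

        PhotoDNA itself uses a proprietary perceptual hash that survives
        resizing and re-encoding; SHA-256 here is only a stand-in to show
        the matching pattern, and would miss any modified copy of an image.
        """
        return hashlib.sha256(path.read_bytes()).hexdigest()

    def is_known_abuse_image(path: Path) -> bool:
        """Return True only if the file's signature is on the blocklist.

        Non-matching files are never inspected beyond this comparison,
        which is the basis of the spam-filter analogy in the text above.
        """
        return file_signature(path) in KNOWN_ABUSE_HASHES

The design point the sketch illustrates is that nothing outside the matching step reads the content: a file either matches a known signature or passes through untouched.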

The rise in self-generated images, where children share images of themselves, often following grooming and coercion, makes this technology crucial for tackling abuse at an early stage and, ultimately, for protecting young users.

At the NSPCC, we have been clear from the start that we are not against E2EE. However, we do believe tech firms have a duty to protect all users and should only roll it out when they can guarantee these technological safeguards are not rendered useless.

The response to our report shows exactly why this debate needs to be reset, with absolutist arguments around privacy leading to accusations that are often confused or inaccurate.

One of these accusations is that we are calling for backdoor access to E2EE messages by law enforcement, which we are not.

While it is important that law enforcement can build evidence to prosecute child abuse, too often this debate emphasises only the investigation of abuse after it has taken place.

Social networks currently play a vital role in protecting children from abuse, and we are much more concerned with platforms’ responsibility to tackle abuse at an early stage, so that the firms themselves can continue to detect child abuse upstream.

This is why we want to see tech firms invest in engineering solutions that would allow tools similar to those currently used to detect abuse to work in E2EE environments.

Cyber security experts are clear that this should be possible if tech firms commit their engineering time to developing a range of solutions, including “on-device” and other technical mitigations.
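As a purely illustrative example of what one such “on-device” mitigation could look like, the sketch below places a check of the kind shown earlier on the sender’s device, before a message is encrypted. This is an assumption about architecture, not a description of any deployed or proposed system, and all of the function names are hypothetical.

    from pathlib import Path
    from typing import Callable

    def send_image(
        path: Path,
        is_known_abuse_image: Callable[[Path], bool],  # e.g. the checker sketched earlier
        encrypt_and_send: Callable[[bytes], None],     # the messenger's existing E2EE layer
        report: Callable[[Path], None],                # the platform's safety pipeline
    ) -> None:
        """Hypothetical send path for an E2EE messenger with an on-device check.

        The match runs on the sender's own device, against plaintext the
        device already holds and before encryption, so the encrypted
        channel itself is not weakened and no third party reads the
        message in transit.
        """
        if is_known_abuse_image(path):
            report(path)
            return
        encrypt_and_send(path.read_bytes())

The design choice illustrated is that the check operates only on content the sender’s device already holds, leaving the encryption protocol itself unchanged.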

Our polling suggests the UK public does not subscribe to the either-or argument of privacy versus children’s safety and that support for E2EE would almost double if platforms could demonstrate children’s safety would not be compromised.

Yet as long as this debate continues to be framed as a zero-sum issue, no one’s interests will be well served – and decisions could be taken that reinforce unhelpfully polarised viewpoints. 

It is in the interest of everyone engaged in this debate to achieve a balanced settlement for E2EE that protects the privacy and safety of all internet users, including children.

This must balance the range of fundamental rights at stake – recognising this is both a societal and technological issue.

This may be dismissed as mere rhetoric, but on such an incredibly complex issue, it is the truth.
