
Apple unveils plans to scan US iPhones for child sex abuse images

Apple will introduce child sexual abuse material detection for US users later this year, but some experts are worried that the technology could be repurposed to scan phones for other kinds of content

Apple will begin scanning its US customers’ devices for known child sexual abuse material (CSAM) later this year, but already faces resistance from privacy and security advocates.

The CSAM detection tool is one of three new child safety measures being introduced by Apple, alongside machine learning that monitors children’s communications for signs of nudity or other sexually explicit content, and updates to Search and Siri that intervene when users make CSAM-related queries.

In its announcement, Apple said the new detection tool will enable the company to report instances of CSAM to the National Center for Missing and Exploited Children (NCMEC), which works in collaboration with law enforcement across the US.

Apple said that instead of scanning images in the cloud, the system would perform on-device matching against a database of known CSAM image hashes provided by NCMEC and other child safety organisations, and it would transform this database into an “unreadable set of hashes” to be securely stored on users’ devices.

“Before an image is stored in iCloud Photos, an on-device matching process is performed for that image against the known CSAM hashes,” said the company. “This matching process is powered by a cryptographic technology called private set intersection, which determines if there is a match without revealing the result.

“The device creates a cryptographic safety voucher that encodes the match result along with additional encrypted data about the image. This voucher is uploaded to iCloud Photos along with the image.”
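As a rough illustration of the flow Apple describes, the device-side step could be sketched in Swift along the following lines. This is a simplification only: Apple’s system is described as using a perceptual image hash and private set intersection, whereas this sketch uses a plain SHA-256 digest and a readable hash set, and the type and function names in it are hypothetical.

```swift
import Foundation
import CryptoKit

// Illustrative sketch only: Apple's described pipeline uses a perceptual hash
// and private set intersection, neither of which is reproduced here.
struct SafetyVoucher {
    let imageID: UUID
    let matchedKnownHash: Bool   // in Apple's design this result is encrypted, not
                                 // readable on the device or, below a threshold, by Apple
}

// Stand-in for the "unreadable set of hashes" shipped to the device; the real
// database is supplied by NCMEC and other child safety organisations and is
// blinded so the device cannot read it.
func loadKnownHashDigests() -> Set<String> {
    return []
}

// Hypothetical helper: hash a photo before upload and record whether it
// matches the known set, mirroring the voucher step described above.
func makeVoucher(forImageAt url: URL, knownHashes: Set<String>) throws -> SafetyVoucher {
    let imageData = try Data(contentsOf: url)
    // Simplification: a cryptographic hash only matches byte-identical files;
    // a perceptual hash would also match resized or re-encoded copies.
    let digest = SHA256.hash(data: imageData)
    let hex = digest.map { String(format: "%02x", $0) }.joined()
    return SafetyVoucher(imageID: UUID(), matchedKnownHash: knownHashes.contains(hex))
}
```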

If enough of a user’s photos match known images of child abuse to cross a threshold, Apple said it would manually review the report to confirm the matches, before disabling the user’s account and notifying NCMEC.
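That escalation step can be thought of as a simple threshold check over the match results, sketched below. The function name and example threshold are assumptions for illustration; in the scheme Apple describes, threshold secret sharing means the individual results cannot even be read until enough matches have accumulated.

```swift
// Hypothetical sketch of the review threshold: nothing is escalated to a human
// reviewer until enough matches against known CSAM hashes have accumulated.
func shouldEscalateForHumanReview(matchResults: [Bool], threshold: Int) -> Bool {
    let matchCount = matchResults.filter { $0 }.count
    return matchCount >= threshold
}

// Example: two matches against an assumed threshold of five stays below the bar.
print(shouldEscalateForHumanReview(matchResults: [true, false, true], threshold: 5))  // false
```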

“This innovative new technology allows Apple to provide valuable and actionable information to NCMEC and law enforcement regarding the proliferation of known CSAM,” it said. “And it does so while providing significant privacy benefits over existing techniques since Apple only learns about users’ photos if they have a collection of known CSAM in their iCloud Photos account. Even in these cases, Apple only learns about images that match known CSAM.”

John Clark, president and chief executive of NCMEC, said Apple’s expanded protections for children would be a “game-changer,” adding: “With so many people using Apple products, these new safety measures have life-saving potential for children.”

Although the new feature will initially be used only to scan photos destined for iCloud Photos, with the matching performed on the device itself, some security and privacy experts are concerned about how the technology could be used or repurposed.

Matthew Green, a cryptography researcher at Johns Hopkins University, tweeted: “Eventually it could be a key ingredient in adding surveillance to encrypted messaging systems. The ability to add scanning systems like this to E2E [end-to-end] messaging systems has been a major ‘ask’ by law enforcement the world over.”

He added: “The way Apple is doing this launch, they’re going to start with non-E2E photos that people have already shared with the cloud. So it doesn’t ‘hurt’ anyone’s privacy. But you have to ask why anyone would develop a system like this if scanning E2E photos wasn’t the goal.”

The Electronic Frontier Foundation (EFF) shared similar sentiments, saying: “Apple is planning to build a backdoor into its data storage system and its messaging system. But that choice will come at a high price for overall user privacy.

“Apple can explain at length how its technical implementation will preserve privacy and security in its proposed backdoor, but at the end of the day, even a thoroughly documented, carefully thought-out and narrowly scoped backdoor is still a backdoor.”

EFF added that the CSAM detection tool ultimately means all photos on a device would have to be scanned, thereby diminishing privacy.

On the monitoring of children’s communications for nudity or other sexually explicit content, it also said Apple is opening the door to broader abuses, because all it would take to look for other types of content is an expansion of the machine learning parameters or a tweak of the configuration flags.

“That’s not a slippery slope – that’s a fully built system just waiting for external pressure to make the slightest change,” said EFF.

Read more about end-to-end encryption and CSAM detection

  • Private messaging is the front line of abuse, yet E2EE in its current form risks engineering away the ability of firms to detect and disrupt it where it is most prevalent.
  • Proposals by the European Commission to search for illegal material could mean the end of private messaging and emails.
  • Concerned parents have sent hundreds of stuffed animals to the home of Amazon CEO Jeff Bezos in protest against the e-commerce and cloud computing giant’s alleged failure to report child sexual abuse material.

Adam Leon Smith, chairman of the software testing group at BCS, The Chartered Institute for IT, said that although Apple’s measures seem a good idea on the surface, maintaining privacy while detecting exploitation, it is impossible to build such a system that only works for child abuse images.

“It is easy to envisage Apple being forced to use the same technology to detect political memes or text messages,” said Smith.

“Fundamentally, this breaks the promise of end-to-end encryption, which is exactly what many governments want – except for their own messages, of course.

“It also will not be very difficult to create false positives. Imagine if someone sends you a seemingly innocuous image on the internet that ends up being downloaded and reviewed by Apple and flagged as child abuse. That’s not going to be a pleasant experience.

“As technology providers continue to degrade encryption for the masses, criminals and people with legitimately sensitive content will just stop using their services. It is trivial to encrypt your own data without relying on Apple, Google and other big technology providers.”
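Smith’s final point, that users can encrypt their own data, can be illustrated in a few lines of Swift using Apple’s CryptoKit framework. The sketch below is minimal and the names are illustrative, but the principle holds: as long as the symmetric key stays on the user’s own hardware, whoever stores the ciphertext cannot read the plaintext.

```swift
import Foundation
import CryptoKit

// Minimal sketch: AES-GCM with a locally held key means the storage provider
// only ever sees ciphertext.
func encryptLocally(_ message: String, with key: SymmetricKey) throws -> Data {
    let sealed = try AES.GCM.seal(Data(message.utf8), using: key)
    return sealed.combined!   // nonce + ciphertext + authentication tag
}

func decryptLocally(_ blob: Data, with key: SymmetricKey) throws -> String {
    let box = try AES.GCM.SealedBox(combined: blob)
    return String(decoding: try AES.GCM.open(box, using: key), as: UTF8.self)
}

let key = SymmetricKey(size: .bits256)   // never uploaded anywhere
let blob = try! encryptLocally("sensitive note", with: key)
assert(try! decryptLocally(blob, with: key) == "sensitive note")
```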

Others have also warned that, while preventing the spread of CSAM is a good thing, the technologies being introduced could be repurposed by governments down the line for more nefarious purposes.

Chris Hauk, a consumer privacy champion at Pixel Privacy, said: “Such technology could be abused if placed in government hands, leading to its use to detect images containing other types of content, such as photos taken at demonstrations and other types of gathering. This could lead to the government clamping down on users’ freedom of expression and used to suppress ‘unapproved’ opinions and activism.”

However, Paul Bischoff, a privacy advocate at Comparitech, took a different view, arguing that while there are privacy implications, Apple’s approach balances privacy with child safety.

“The hashing system allows Apple to scan a user’s device for any images matching those in a database of known child abuse materials,” he said. “It can do this without actually viewing or storing the user’s photos, which maintains their privacy except when a violating photo is found on the device.

“The hashing process takes a photo and encrypts it to create a unique string of numbers and digits, called a hash. Apple has hashed all the photos in the law enforcement child abuse database. On users’ iPhones and iPads, that same hashing process is applied to photos stored on the device. If any of the resulting hashes match, then Apple knows the device contains child pornography.”
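The hashing behaviour Bischoff describes can be seen in a short example: the same input always produces the same digest, while even a tiny change produces an unrelated one. Note that this uses a cryptographic hash (SHA-256) for simplicity, whereas Apple’s matching is reported to use a perceptual hash so that resized or re-encoded copies of a known image still match.

```swift
import Foundation
import CryptoKit

// The same bytes always hash to the same digest; a one-byte change yields a
// completely different digest.
func hexDigest(of data: Data) -> String {
    SHA256.hash(data: data).map { String(format: "%02x", $0) }.joined()
}

let original = Data("photo bytes".utf8)
let altered  = Data("photo bytes!".utf8)   // a one-byte difference

print(hexDigest(of: original) == hexDigest(of: Data("photo bytes".utf8)))  // true
print(hexDigest(of: original) == hexDigest(of: altered))                   // false
```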

But Bischoff said there are still dangers, and that the technology’s use must be “strictly limited in scope to protecting children” and not used to scan users’ devices for other photos.

“If authorities are searching for someone who posted a specific photo on social media, for example, Apple could conceivably scan all iPhone users’ photos for that specific image,” he added.
