Apple to report child sexual abuse images on iCloud to law enforcement

Apple will inform authorities of child exploitation images uploaded to iCloud in the United States, the company said Thursday.

The new system will detect images known as Child Sexual Abuse Material (CSAM) through a process called hashing, in which each image is transformed into a unique number that corresponds to it.
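
To make the hash-and-compare idea concrete, here is a minimal Swift sketch that fingerprints an image file and checks the fingerprint against a list of known hashes. It is only an illustration: it uses a cryptographic hash (SHA-256), whereas Apple's actual system uses a perceptual hashing algorithm it calls NeuralHash, and the sample hash list below is a hypothetical stand-in for the real database.

import CryptoKit
import Foundation

// Derive a hex fingerprint for an image file; only this number is compared,
// never the image content itself.
func fingerprint(ofImageAt url: URL) throws -> String {
    let data = try Data(contentsOf: url)
    return SHA256.hash(data: data).map { String(format: "%02x", $0) }.joined()
}

// Hypothetical known-image hash list standing in for the NCMEC-provided database.
let knownHashes: Set<String> = [
    "0000000000000000000000000000000000000000000000000000000000000000"
]

let photoURL = URL(fileURLWithPath: "/tmp/photo.jpg") // illustrative path
if let hash = try? fingerprint(ofImageAt: photoURL) {
    print(knownHashes.contains(hash) ? "match" : "no match")
}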

Apple began testing the system on Thursday, but most US iPhone users won’t be a part of it until an update to iOS 15 later this year, Apple said.

The move brings Apple in line with other cloud services that already scan user files, often using hashing systems, for content that violates their terms of service, including images of child exploitation.

It also represents a test for Apple, which says its system is more private for users than previous approaches to removing illegal child sexual abuse images, because it uses sophisticated cryptography on both Apple’s servers and users’ devices and does not scan actual images, just hashes.

But many privacy-sensitive users still reject software that notifies governments of content on a device or in the cloud, and may react negatively to this announcement, especially since Apple has been a strong advocate for device encryption and operates in countries with weaker speech protections than the US.

Law enforcement officials around the world have also lobbied Apple to weaken the encryption of iMessage and other software services like iCloud to investigate child exploitation or terrorism. Thursday’s announcement is a way for Apple to address some of those issues without abandoning its engineering principles around user privacy.

Before an image is stored in Apple’s iCloud, Apple compares the image’s hash against a hash database provided by the National Center for Missing and Exploited Children (NCMEC). That database will be distributed in iOS code starting with an update to iOS 15. The matching process is done on the user’s iPhone, not in the cloud, Apple said.
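
The on-device step described above can be sketched roughly as follows in Swift, assuming hypothetical names (SafetyVoucher, prepareForUpload) that are not Apple's actual API. In the real system the match result is wrapped in cryptography (private set intersection) so that neither the phone's software nor the server learns the outcome for any single photo; here the payload is a plain stand-in.

import Foundation

// Opaque result attached to an iCloud photo upload; in the real design this
// would be decryptable by the server only after a threshold of matches.
struct SafetyVoucher {
    let encryptedMatchPayload: Data
}

// The comparison runs on the iPhone, not in the cloud, before upload.
func prepareForUpload(imageHash: String, localHashDatabase: Set<String>) -> SafetyVoucher {
    let matched = localHashDatabase.contains(imageHash)
    // Stand-in payload; the actual voucher would be encrypted, not plaintext.
    let payload = Data((matched ? "match" : "no-match").utf8)
    return SafetyVoucher(encryptedMatchPayload: payload)
}

let voucher = prepareForUpload(imageHash: "abc123", localHashDatabase: ["abc123"])
print(voucher.encryptedMatchPayload.count)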

If Apple detects a certain number of matching files in an iCloud account, the system will upload a file that allows Apple to decrypt and view the images from that account. A person will then manually review the images to confirm whether they match.
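
A minimal sketch of that threshold rule, with illustrative names and arbitrary example values: in Apple's actual design the threshold is enforced cryptographically, since the server cannot decrypt any voucher until enough matches accumulate, rather than by a plain counter like this.

// Returns true once the number of matching vouchers reaches the threshold.
func accountNeedsHumanReview(matchFlags: [Bool], threshold: Int) -> Bool {
    return matchFlags.filter { $0 }.count >= threshold
}

// Arbitrary example values for illustration only; Apple has not published a number here.
print(accountNeedsHumanReview(matchFlags: [false, true, false], threshold: 3)) // false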

Apple will only be able to review images that match content that is already known and reported to these databases; for example, it will not be able to detect parents’ photos of their children in the bath, as those images will not be part of the NCMEC database.

If the manual reviewer confirms that the system did not make a mistake, Apple will disable the user’s iCloud account and submit a report to NCMEC or notify the police if necessary. Users can file an appeal with Apple if they believe their account was flagged in error, an Apple representative said.

The system only works with images uploaded to iCloud, which users can turn off, Apple said. Photos or other images on a device that have not been uploaded to Apple’s servers will not be part of the system.

Some security researchers have raised concerns that this technology could eventually be used to identify other types of images, such as photos of a political protest. Apple said its system is built to work, and can only work, with images cataloged by NCMEC or other child safety organizations, and that the way its cryptography is constructed prevents it from being used for other purposes.

Apple cannot add additional hashes to the database, the company said. Apple said it is presenting its system to cryptography experts to certify that it can detect illegal child exploitation images without compromising user privacy.

Apple unveiled the feature Thursday along with other features aimed at protecting children from predators. In a separate feature, Apple will use machine learning on a child’s iPhone with a family account to blur images that may contain nudity, and parents can choose to be alerted when a child under the age of 13 receives sexual content in iMessage. Apple also updated Siri with information on how to report child exploitation.
