Apple to reject demands to use CSAM system for surveillance

Apple defended its new system for scanning iCloud for illegal Child Sexual Abuse Material (CSAM) on Monday, amid an ongoing controversy over whether the system reduces the privacy of Apple users and could be used by governments to monitor citizens.

Last week, Apple announced that it had begun testing a system that uses sophisticated cryptography to identify when users upload collections of known child pornography to its cloud storage service. It says it can do this without knowing the content of a user’s photos stored on its servers.

Apple reiterated Monday that its system is more private than those used by companies like Google and Microsoft because it relies on both its servers and software that runs on iPhones.

Privacy advocates and technology commentators are concerned that Apple’s new system, which includes software to be installed on people’s iPhones through an iOS update, could be expanded in some countries through new laws to check for other types of images, such as photos with political content, rather than only child sexual abuse material.

Apple said in a document posted on its website Sunday that governments cannot force it to add non-CSAM images to the hash list, the file of numbers that correspond to known child abuse images that Apple will distribute to iPhones to enable the system.
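At a conceptual level, the hash list described above is just a set of numeric fingerprints, and matching amounts to computing a fingerprint for an uploaded photo and checking whether it appears in that set. The Swift sketch below illustrates only that basic idea; the names (`fingerprint`, `KnownImageHashList`) are hypothetical, the SHA-256 stand-in is not how Apple derives image fingerprints, and the “sophisticated cryptography” the company describes, which keeps non-matching photos unreadable to both sides, is not modeled here.

```swift
import Foundation
import CryptoKit

// Illustrative stand-in "fingerprint": SHA-256 of the raw image bytes.
// Apple's real system fingerprints image content and performs the comparison
// under cryptographic protection; this is only a toy model of list matching.
func fingerprint(of imageData: Data) -> String {
    let digest = SHA256.hash(data: imageData)
    return digest.map { String(format: "%02x", $0) }.joined()
}

// The distributed hash list, modeled here as a plain set of fingerprints
// (an assumption for illustration; the real list is not a readable set).
struct KnownImageHashList {
    let hashes: Set<String>

    // True if the photo's fingerprint appears on the distributed list.
    func matches(_ imageData: Data) -> Bool {
        hashes.contains(fingerprint(of: imageData))
    }
}

// Usage sketch: check a photo against the list as it is uploaded.
let hashList = KnownImageHashList(hashes: ["<fingerprint from the NCMEC-derived list>"])
let photo = Data() // placeholder for image bytes
if hashList.matches(photo) {
    // In the real design an individual match is not surfaced like this;
    // the print statement is purely for illustration.
    print("Photo matches an entry on the known-image hash list")
}
```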

“Apple will refuse any such demands. Apple’s CSAM detection capability is built solely to detect known CSAM images stored in iCloud Photos that have been identified by experts at NCMEC and other child safety groups,” Apple said in the document. “We have faced demands to build and deploy government-mandated changes that degrade the privacy of users before, and have steadfastly refused those demands. We will continue to refuse them in the future.”

The document continued: “Let us be clear, this technology is limited to detecting CSAM stored in iCloud and we will not accede to any government’s request to expand it.”

Some cryptographers are concerned about what could happen if a country like China passes a law saying the system must also include politically sensitive images. Apple CEO Tim Cook has previously said that the company follows the laws of every country in which it does business.

Businesses in the U.S. are required to report CSAM to the National Center for Missing & Exploited Children and face fines of up to $300,000 when they discover illegal images and fail to report them.

Apple has cultivated its reputation for defending privacy for years through its actions and marketing. In 2016, it took on the FBI in court to protect the integrity of its on-device encryption systems during the investigation into a mass shooter.

But Apple has also faced significant pressure from law enforcement officials over the possibility of criminals “going dark,” or using privacy and encryption tools to keep messages or other information out of the reach of law enforcement.

The controversy over Apple’s new system, and whether it amounts to policing users, threatens Apple’s public reputation for building secure and private devices, which the company has used to enter new markets in personal finance and healthcare.

Critics are concerned that the system works partially on an iPhone, rather than just scanning photos that have been uploaded to the company’s servers. Apple’s competitors typically only scan photos stored on their servers.

“It is truly disappointing that Apple became so obsessed with its particular vision of privacy that it ended up betraying the fulcrum of user control: being able to trust that your device is truly yours,” technology commentator Ben Thompson wrote in a newsletter Monday.

Apple continues to defend its system as a genuine improvement that protects children and will reduce the amount of CSAM being created, while preserving the privacy of iPhone users.

Apple said its system is significantly stronger and more private than previous systems by every privacy metric the company tracks, and that it did everything it could to build a better system for detecting these illegal images.

Unlike current systems, which run in the cloud and cannot be inspected by security researchers, Apple’s system can be inspected through its distribution in iOS, an Apple representative said. By moving some of the processing to the user’s device, the company gains stronger privacy properties, such as the ability to find CSAM matches without running software on Apple’s servers that verifies each photo.
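To make that architectural contrast concrete, here is a rough sketch using assumed, hypothetical names (`MatchArtifact`, `checkOnDeviceAndUpload`; none of these are Apple APIs): in a server-side design the server opens and scans every photo, while in an on-device design the comparison happens before upload and only the photo plus an opaque match record leave the phone. The cryptographic layer Apple describes, which hides individual match results, is deliberately left out.

```swift
import Foundation

// Hypothetical type for illustration only; not an Apple API.
struct MatchArtifact {
    let photoID: UUID
    let matched: Bool // in the real design this result would be cryptographically hidden
}

// Server-side approach (what the article says competitors typically do):
// the server receives the photo and inspects its content itself.
func serverSideScan(photo: Data,
                    knownHashes: Set<String>,
                    fingerprint: (Data) -> String) -> Bool {
    knownHashes.contains(fingerprint(photo))
}

// On-device approach (the design the article attributes to Apple): the device
// performs the comparison before upload, so no software on the server needs
// to verify each photo's content.
func checkOnDeviceAndUpload(photo: Data,
                            photoID: UUID,
                            knownHashes: Set<String>,
                            fingerprint: (Data) -> String,
                            upload: (Data, MatchArtifact) -> Void) {
    let artifact = MatchArtifact(photoID: photoID,
                                 matched: knownHashes.contains(fingerprint(photo)))
    upload(photo, artifact) // the photo itself is never scanned server-side
}
```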

Apple said Monday that its system doesn’t scan private photo libraries that haven’t been uploaded to iCloud.

Apple also confirmed that it will process photos that have already been uploaded to iCloud. The changes will roll out via an iPhone update later this year, after which users will be alerted that Apple is beginning to check iCloud photo libraries against a list of fingerprints that correspond to known CSAM, the company said.
