After Apple announced its photo analysis system, Will Cathcart, CEO of WhatsApp, warned that it poses a risk to people’s privacy
Last week Apple announced a new photo analysis system intended to combat child abuse. The tool can flag users whose devices contain images classified as child sexual abuse material.
Although the company says the system is safe, the tool has caused great controversy, as people from different sectors are concerned about the privacy of users.
WhatsApp speaks out against Apple
Such is the case of WhatsApp, which warned on Twitter about the risks of Apple’s photo analysis.
“I think this is the wrong approach and a setback for the privacy of people all over the world. People have asked if we will adopt this system for WhatsApp. The answer is no.”
Will Cathcart, CEO of WhatsApp, stressed that what Apple is doing invades the privacy of its users.
“Instead of focusing on making it easy for people to report content that is shared with them, Apple has built software that can scan all the private photos on your phone, even photos you haven’t shared with anyone. That is not privacy.”
In several messages, Cathcart said he is concerned about the Cupertino company’s plans, since the system could be used for surveillance by a government or any other company.
“A surveillance system built and operated by Apple could very easily be used to scan private content for anything they or a government decide they want to monitor.”
I’ve had independent confirmation from multiple people that Apple is releasing a client-side tool for CSAM scanning tomorrow. This is a really bad idea.
– Matthew Green (@matthew_d_green) August 4, 2021
The WhatsApp CEO stressed that the photo analysis could be exploited in “China or other countries, or abused by spyware companies.”
His Twitter posts generated considerable controversy on social networks, since in recent months WhatsApp itself has been criticized for sharing user information with Facebook.
Apple responds to WhatsApp
After Cathcart’s messages, Apple responded to his concerns by noting that users will have the option at all times to disable iCloud.
Apple also insisted that the system only matches photos against a database of “known” images provided by the National Center for Missing &amp; Exploited Children and other organizations.
In other words, the photo analysis cannot identify other types of images.
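The “known images” matching Apple describes can be illustrated with a toy sketch. This is not Apple’s actual implementation — the real system reportedly uses a perceptual hash (NeuralHash) and on-device private set intersection — so the function names, the SHA-256 stand-in, and the placeholder digest below are all illustrative assumptions.

```python
import hashlib

# Hypothetical database of fingerprints of "known" images. In the
# real system these would come from organizations such as NCMEC;
# this single entry is the SHA-256 digest of an empty byte string,
# used here purely as a placeholder.
KNOWN_HASHES = {
    "e3b0c44298fc1c149afbf4c8996fb92427ae41e4649b934ca495991b7852b855",
}

def fingerprint(image_bytes: bytes) -> str:
    """Return a hex digest standing in for an image fingerprint.

    A plain cryptographic hash is used for simplicity; a real system
    would use a perceptual hash so near-duplicates also match.
    """
    return hashlib.sha256(image_bytes).hexdigest()

def is_known(image_bytes: bytes) -> bool:
    """True only if the photo matches an entry in the database."""
    return fingerprint(image_bytes) in KNOWN_HASHES
```

The key property this sketch shows is the one Apple emphasized: a photo is flagged only if its fingerprint already appears in the database of known images; any other photo, however private, produces no match.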