CSAM: Apple finally explains the cessation of iCloud photo monitoring

  • At the end of 2022, Apple definitively abandoned its CSAM detection project
  • Until now, the company had never acknowledged the potential abuses the project could have enabled
  • Today, Apple admits it

In August 2021, the Cupertino company announced for the first time a measure to combat child sexual abuse material. Following this announcement, Apple received a storm of criticism from all sides, including from its own employees.

And for good reason: Apple has always affirmed that the security and privacy of its users are a priority. To prove the point, the company had already refused to cooperate with authorities in certain cases. We remember in particular the time when Apple engineers refused to help the FBI unlock a terrorist's iPhone.

However, the measure introduced at the time to detect child sexual abuse material involved scanning the photos stored in its customers' iCloud libraries. The purpose of this analysis was to compare users' photos against a database of known images of missing or exploited children.
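In practice, Apple's proposed system relied on a perceptual hash (NeuralHash) combined with cryptographic threshold techniques, not a plain lookup. The sketch below is only a simplified illustration of the matching idea described above; the database contents and SHA-256 digest are stand-ins, not Apple's actual implementation.

```python
import hashlib

# Hypothetical stand-in database of known hashes. Apple's real system used
# a perceptual hash (NeuralHash) so near-duplicates also match; SHA-256 is
# used here only to illustrate the comparison flow with byte-exact matches.
KNOWN_IMAGE_HASHES = {
    hashlib.sha256(b"known-flagged-image-bytes").hexdigest(),
}

def matches_database(image_bytes: bytes) -> bool:
    """Return True if the image's digest appears in the known-hash database."""
    digest = hashlib.sha256(image_bytes).hexdigest()
    return digest in KNOWN_IMAGE_HASHES
```

With a cryptographic hash, only a byte-identical file matches; a perceptual hash like NeuralHash is precisely what made critics fear both false positives and the quiet expansion of the database to other content.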

A fault confessed is half forgiven

Following Apple's announcement of its intention to analyze its customers' private photos, many voices were raised, and not minor ones: among them, the digital rights NGO Electronic Frontier Foundation, WhatsApp head Will Cathcart, and whistleblower Edward Snowden.

The latter had also published a scathing tweet against Apple at the time:


— Edward Snowden (@Snowden) August 6, 2021

Apart from a few statements countering the arguments of the various opposing voices, particularly on the question of false positives, Apple never actually acknowledged that this feature could be misused by authoritarian governments. That was probably a mistake, as pressure kept mounting from all sides until it forced the Cupertino company to definitively end the project at the end of 2022.

Two years to admit

After more than 24 months of contradicting its opponents, Apple finally changed its position last week. It fell to Erik Neuenschwander, Apple's director of user privacy and child safety, to say so publicly:

“This would result in […] a slippery slope with unintended consequences. Searching for a type of content, for example, opens the door to massive surveillance […]”

So, it's never too late to admit your mistakes.


By: Keleops AG