Child protection and CSAM: Apple pauses the project

In a press release sent to AppleInsider, Apple made a significant announcement: the Californian firm has declared it is pausing the deployment of the child protection tools it announced a few weeks ago.

As detailed here, the American giant had planned to launch, with iOS 15, several new features designed to protect children from sexual content, but also to report any iCloud account sharing child pornography material. It was this last point in particular that raised concerns. Even though the analysis system underlying this detection appeared, on paper, to be fairly secure in terms of privacy, it was a source of great fear for many. The tension was palpable among analysts and IT security specialists, as well as among privacy advocates, including Edward Snowden. It even spread directly into the ranks of the Californian firm itself.

They will be able to take a breather, then, since Apple has paused the project until further notice. The firm explains to AppleInsider that it wants to take time to analyze the situation and its tools, including the much-debated CSAM detection system (iCloud image analysis).

What do you think of this decision?


By: Keleops AG

Editor-in-chief for iPhon.fr. Pierre is like Indiana Jones, looking for the lost iOS trick. Also a long-time Mac user, Apple devices hold no secrets for him. Contact: pierre[a]iphon.fr.