CSAM detection is the new system in development at Apple that is making a lot of noise at the moment, since its aim is to detect child pornography content among the photos of iCloud Photos users. The Californian firm has also published an FAQ to try to calm privacy advocates (detailed FAQ here).
To help users better understand how Apple intends to both respect the private nature of photos and identify the illegal viewing of child pornography content, Erik Neuenschwander, Head of Privacy at Apple, answered several questions from TechCrunch. Here are what we consider the important points from the interview.
Device-side analysis, unlike other cloud services
Erik Neuenschwander assures us that Apple's system is different from other cloud systems for detecting child pornography content, since the basic analysis is done on the user's device and not in the cloud. It is a technological innovation which, again according to the Apple executive, makes it possible to search for illegal images while maintaining the confidentiality of user data.
How does it work?
The system creates an identifier for each sensitive photo, an identifier that is then compared with those of photos in recognized child pornography image databases, such as that of the National Center for Missing and Exploited Children (NCMEC).
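To make that principle concrete (and only the principle: Apple's real identifier is its perceptual NeuralHash, matched against blinded versions of the database hashes, and the function names below are invented for illustration), a minimal Swift sketch of deriving an identifier from image data and checking it against a set of known identifiers could look like this:

```swift
import Foundation
import CryptoKit

// Purely illustrative: Apple's real system uses a perceptual "NeuralHash" that
// survives resizing and recompression, and compares it against blinded
// (encrypted) database hashes. A plain cryptographic digest is used here only
// as a stand-in identifier.
func identifier(for imageData: Data) -> String {
    SHA256.hash(data: imageData)
        .map { String(format: "%02x", $0) }
        .joined()
}

// Checks whether a photo's identifier appears in a set of known identifiers,
// standing in for the image databases mentioned above.
func matchesKnownDatabase(_ imageData: Data, knownIdentifiers: Set<String>) -> Bool {
    knownIdentifiers.contains(identifier(for: imageData))
}
```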
Erik Neuenschwander assures us that the system never gains access to the data on the iPhone, nor to the user's entire iCloud photo library.
In addition, the alert is only given once a certain number of sensitive images have been identified as identical to images in the databases used for comparison. There is therefore a threshold to surpass. Among other things, this serves as a first safeguard against false positives. The threshold also makes it impossible to search for a single specific image within an iCloud Photos account. Finally, it prevents searching for images other than those initially targeted in the fight against child pornography.
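As a simplified illustration of that threshold rule (Apple's actual design enforces it cryptographically, using threshold secret sharing so that nothing is learned about an account below the threshold; the type and the value used below are purely hypothetical), the logic boils down to something like this:

```swift
// Simplified sketch: individual matches are never reported on their own; only
// once the number of matched images reaches a preset threshold is the account
// flagged for human review.
struct MatchThresholdChecker {
    let threshold: Int   // hypothetical value, not a figure published by Apple
    var matchCount = 0

    // Called each time an uploaded photo's identifier matches the database.
    mutating func recordMatch() {
        matchCount += 1
    }

    // True only once the threshold has been crossed.
    var shouldTriggerReview: Bool {
        matchCount >= threshold
    }
}

var checker = MatchThresholdChecker(threshold: 10)   // purely illustrative threshold
checker.recordMatch()
print(checker.shouldTriggerReview)                    // false: a single match triggers nothing
```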
CSAM exclusive to iCloud and the USA
Initially, the system will be exclusive to US users and will only affect photos uploaded to iCloud. As reported by MacRumors, a rollout abroad at a later stage is not out of the question. But this will depend on each country concerned and the strictness of its legislation regarding privacy and child pornography.
Image detection will in any case begin after the release of the new operating systems iOS 15, macOS Monterey and iPadOS 15. Remember that Apple is not the first tech player to implement such a system to fight against child pornography. Google, via the CyberTipline tool, can be alerted if child pornography photos are shared by a Gmail user.
What do you think of this new CSAM detection system from Apple for detecting child pornography content on iCloud?