Apple abandons “CSAM” and will not search iCloud photos

Apple drew widespread criticism last year when it presented its "CSAM" plan. Long a champion of user confidentiality and privacy, the company appeared to turn its back on these principles by announcing a system for scanning photos stored on iCloud.

The objective announced by Apple was to fight child sexual abuse material and the spread of such images through its iCloud service. After meeting strong resistance from public opinion, Apple had "paused" the project.

Apple backtracks

Today Apple announced that it has completely abandoned its CSAM project. The Cupertino company even plans to do the opposite and let its users encrypt photos on iCloud so that they are not visible to anyone else.

This functionality is expected to arrive with the iOS 16.2 update for iPhone, which will probably be released in December. Apple has not yet given a specific date. In addition to photos, Apple says the new option will let users encrypt other data such as notes and device backups.

Apple, however, is not turning its back on child protection. In a statement to Wired, the company says it is deepening its investment in this area. Today, an iPhone can detect when a child (or a user designated as such) receives or sends sexually explicit photos.

Protecting children and privacy

This feature, built into Apple's native Messages application, warns the user about sensitive content that is received or sent. The process runs entirely on the device, so no one else knows what a child does with their phone. Their privacy is preserved, while Apple filters incoming messages in a protective way.

According to Statista, in France in 2018, 23% of 11-14 year olds had already encountered pornographic images against their will. According to the same study, only 50% of children had never come across offensive content while using digital devices.
