Quite discreetly, apparently between December 10 and 13, Apple removed all mention of the CSAM detection feature announced earlier this year from its site. As we recall, the feature sparked controversy because it essentially involved scanning the contents of your iPhone for child pornography images.
In an interview with the American magazine The Verge, however, it appears that Cupertino does not really intend to abandon this initiative, but has simply postponed it, as is also the case with Universal Control on macOS Monterey and ID cards scanned into iOS 15. At least, that is what Shane Bauer, an Apple employee interviewed by the publication, says.
There is still work to do
In fact, Apple seems to have become aware of the collective criticism generated by its announcement, which some do not hesitate to compare to a real risk to the privacy of their personal data. These concerns were recently confirmed by the no less serious New York Times, which reports that Europe is already considering developing a similar program whose objective would be to monitor its citizens under the guise of the fight against terrorism.
It is impossible to know precisely when iCloud image scanning will be deployed, but with only 5 GB of free storage, each user potentially has just a few months to sort their photo library and avoid any danger. As for those who opted for a paid plan with even more data, we can only wish them good luck.
The abuses are real
While the very principle of this child protection solution is certainly laudable, it is what institutional actors can do with it that poses a problem. Cupertino is, moreover, completely transparent on the subject and regularly publishes figures detailing government requests to unlock suspicious iPhones.
However, as the misuse of Pegasus spyware has shown, it is enough for officials to shift their strategy to other targets (identification of activists and political opponents, etc.) for iOS to indirectly, and in the blink of an eye, abandon the famous promise of constant security renewed by its publisher.
*CSAM (child sexual abuse material) = child pornography files