Scanning iCloud photos is already causing problems

With iOS 15, Apple introduced two new features for its Photos app. The first improves face recognition so that photos of your friends can be grouped into automatic folders. The option cannot be deactivated, which is a serious concern for anyone worried about their privacy: there is no shortage of cases where facial recognition has been put to malicious use, sometimes even at the developer's expense.

The other change concerns children. Apple has designed a system reminiscent of an antivirus, in which an automated process analyzes images saved on an iPhone to detect files that have already been flagged as child sexual abuse material. Offenders can then be reported to the authorities after manual review. For now, however, iCloud Photos must be enabled for this scanning, which runs in the background, to apply.
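Apple has not detailed its implementation here; by most accounts it relies on a perceptual fingerprint ("NeuralHash") and cryptographic matching rather than anything this simple. The sketch below is only a minimal illustration of the general idea of checking each photo's fingerprint against a database of already-flagged images; the type names and the SHA-256 digest used as a stand-in fingerprint are assumptions, not Apple's actual method.

```swift
import Foundation
import CryptoKit

// Hypothetical sketch: a plain SHA-256 digest stands in for the image
// fingerprint, purely to illustrate the "match against known flagged images" step.
struct KnownImageDatabase {
    // Fingerprints of images already reported and flagged (assumed input).
    let flaggedHashes: Set<String>

    // Returns true when a photo's fingerprint matches a known, flagged image.
    func matches(photoData: Data) -> Bool {
        let digest = SHA256.hash(data: photoData)
        let fingerprint = digest.map { String(format: "%02x", $0) }.joined()
        return flaggedHashes.contains(fingerprint)
    }
}

// Usage: scan a batch of photos and count the matches that would be queued
// for manual review before any report is made.
let database = KnownImageDatabase(flaggedHashes: ["example-fingerprint"])
let photos: [Data] = []  // in a real flow, image data pulled from the device library
let matchCount = photos.filter { database.matches(photoData: $0) }.count
print("Photos queued for manual review: \(matchCount)")
```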

No matter how well-intentioned, @Apple is rolling out mass surveillance to the entire world with this. Make no mistake: if they can scan for kiddie porn today, they can scan for anything tomorrow.

They turned a trillion dollars of devices into iNarcs—*without asking.* https://t.co/wIMWijIjJk

— Edward Snowden (@Snowden) August 6, 2021

Potential for abuse

As is often the case with such a sensitive subject, the first objections were raised loudly as soon as these changes were announced. The cryptographer Matthew Green, for example, who had already voiced sharp criticism after the Pegasus affair, believes that this new practice will first of all increase the number of false positives (teenagers lying about their age, health professionals, parents, and so on). But in his view, the worst lies in where Apple's ambitions could lead.

In fact, the Cupertino company goes much further by implementing its controversial image analysis directly in its Messages platform, a service renowned precisely for its end-to-end encrypted messaging… How can users be promised genuine security amid such a contradiction?

These are bad things. I don’t particularly want to be on the side of child porn and I’m not a terrorist. But the problem is that encryption is a powerful tool that provides privacy, and you can’t really have strong privacy while also surveilling every image anyone sends.

— Matthew Green (@matthew_d_green) August 5, 2021

Governments in the crosshairs

At this time, it is unclear whether Apple plans to sell its technology to other, potentially malicious actors, though this seems unlikely. It would nevertheless be deeply worrying if specialized firms such as the NSO Group took inspiration from it and duplicated the system in order to market it to states already singled out for their authoritarian policies, especially at a time when the protection of personal data has become a major selling point for the Apple brand.
