New opponents of Apple's CSAM plans join forces in an international coalition

This Thursday, August 19, an "international coalition" published an open letter urging Apple to abandon its plan to "create surveillance capabilities in iPhones, iPads and other products". The coalition is made up of nearly 90 groups, including associations such as the EFF.

As the letter explains, Apple's CSAM tools could be exploited by authoritarian governments to censor speech: "We fear they could be used to censor protected speech, threaten the privacy and security of people around the world, and lead to disastrous consequences for many children."

The authors of the letter go on to warn that governments could force Apple to detect images that are objectionable for reasons other than being sexually explicit.

iMessage changes also called into question

The letter also calls on Apple to abandon its planned changes to iMessage. These would identify and blur nudity in messages sent to children's accounts, allowing the child to view the image only once parents have been informed. The signatories respond that this feature could endanger children from intolerant homes, as well as those looking for educational material. More seriously, they argue, it would break end-to-end encryption, one of the basic security principles of iMessage.

Apple's plan to detect CSAM images stored in iCloud Photos has proven particularly controversial, sparking concerns among security researchers, academics, and privacy groups alike.

Critics worry that this new capability could be misused by governments as a form of mass surveillance. The Cupertino company has tried to address these concerns by publishing an FAQ on the subject, assuring users that it would refuse any request to expand the system.

But as Reuters points out, it is hard to imagine Apple leaving a country like China or India if the firm were faced with a court order forcing it to either change its CSAM tools or withdraw from the market.


By: Keleops AG