Siri too…

After Google and Amazon, it is Apple's turn to make headlines over its voice assistant and its respect for privacy. A source speaking to the newspaper The Guardian has revealed a good deal of information about how certain Siri requests are handled.

Siri, not so different from Google Assistant?

The informant explains that he works for a company subcontracted by Apple. His task is to listen to certain audio recordings captured by Siri from users, analyze them, and grade them, with the goal of improving the technology underlying Apple's AI. There are several such companies around the world, all working toward the same goal.

To evaluate the voice assistant, the source says, employees of these companies must also report any accidental activations, or false positives. There are reportedly many such requests to process, the majority coming from the Apple Watch and the HomePod. The Apple Watch is said to be particularly prone to false positives, because Siri can be triggered on it simply by raising the watch to your mouth and speaking.

Accidental activations are thus reportedly heard regularly by the source and their colleagues. They have allegedly had to listen to recordings of illegal drug deals, sexual encounters, and conversations between doctors and patients, among other things.

In addition, according to The Guardian and its informant, the recordings come with various pieces of information attached, such as the user's geographic location and details about the contacts and apps mentioned.


Apple's responses

Unlike with Amazon and Google, this type of revelation takes on another dimension when it concerns Apple and Siri, because the Cupertino company has positioned itself as a defender of privacy for several years, as evidenced by one of its famous slogans: "what happens on your iPhone, stays on your iPhone". To protect its image, the Californian company therefore wasted no time in responding to The Guardian's article, issuing several clarifications.

For one thing, it says, the Siri recordings reviewed by humans represent only a tiny fraction of all the queries made on millions of Apple devices, less than 1%. It adds that at no point is it possible to link an audio recording to a user's Apple ID. Finally, it points out that users are free to disable location services for Siri in the settings, which prevents their Siri requests from being geolocated.

Notably, Apple at no point refutes the various claims in the Guardian article, which is somewhat worrying: it suggests that, in the end, Siri works much like competing voice assistants, particularly when it comes to privacy.


By: Keleops AG

Editor-in-chief for iPhon.fr. Pierre is like Indiana Jones, looking for the lost iOS trick. Also a long-time Mac user, Apple devices hold no secrets for him. Contact: pierre[a]iphon.fr.