Apple’s Siri is Sending Recordings to Human Reviewers for Analysis

  • Apple’s Siri listens to you and sends recordings to human reviewers without your knowledge.
  • The reviewers say it is relatively easy to figure out the identity of the person behind a recording.
  • Apple denies this, saying that everything is encrypted and that the reviewed recordings last only a few seconds.

Only two weeks after the Google revelations, when we learned that its Assistant was silently recording what we say and pushing it to language reviewers, a similar story is surfacing about Apple’s Siri. This time, it is the Guardian making the revelation, citing unnamed Apple contractors who work on reviewing Siri’s recordings. As the contractors point out, they have heard couples having sex, people involved in drug deals, and patients discussing private matters at the doctor’s office.

All of the above are samples of extremely sensitive information that iPhone owners would never think of sharing with someone they don’t know, but allegedly, this is exactly what is going on. According to the reporters, not everything recorded by Siri goes to Apple’s contractors, only a small portion of it. The purpose of the reviews is to identify false positives in Siri’s activation, so it is a form of quality control aimed at improving the voice assistant’s response mechanism. Moreover, these recordings are not associated with the user’s Apple ID, so the reviewers don’t know who they are coming from.

However, if the recordings contain mentions of names or addresses, then the users’ privacy is undermined, and the whole thing seems to run directly against Apple’s supposedly strict privacy policies. As the whistleblowers told the Guardian, these recordings are accompanied by location data, contact details, and app data. That is more than enough to make the compromised person feel uncomfortable. As reported, the Apple Watch features very high rates of false triggering, sending 30-second sound recordings that are often very revealing. Finally, the reporters claim that Apple isn’t strict enough about who works at the reviewing firms, who gets to listen to the recordings, and what they might choose to do with them later (scamming, phishing, or extortion). As they say, there is absolutely no oversight.

Apple responded to all this by saying that false triggering accounts for a tiny portion of Siri’s activations, and that only 1% of them are sent to the recording reviewers for analysis. Moreover, the company adds that all reviewers have signed Apple’s strict confidentiality agreements, so they cannot engage in malicious activities. Finally, it claims that your name, contacts, the music you listen to, and the searches you make reach the reviewing center encrypted, so there is no way for anyone to identify you.

Do you feel comfortable with this level of pervasiveness from Apple in your private life, and do you trust the company in general? Let us know in the comments section beneath this post, or join the discussion on our socials, on Facebook and Twitter.
