Apple contractors hear Siri recordings of drug deals, people having sex: report

Apple Watch and HomePod are the worst offenders for accidental Siri recordings

Despite its emphasis on privacy, Apple’s digital assistant, Siri, relies on human quality-control contractors who regularly hear sensitive information.

In a report from The Guardian, a whistleblower contractor revealed that Apple contracts out Siri quality control, much as Amazon and Google do with Alexa and Assistant. While contractors listen to less than one percent of daily Siri activations, often for just a few seconds each, they can still hear a variety of sensitive information.

According to The Guardian, contractors have heard medical information, criminal activities and even sexual encounters.

Apple uses the contractors to test and gauge how well its voice assistant fulfills requests. In a statement to Engadget, Apple said it has multiple protections in place to safeguard users’ privacy. For example, it doesn’t attach Apple IDs to recordings, contractors review recordings in “secure facilities” and they are bound by “strict confidentiality requirements.” Additionally, Apple says it doesn’t know whose device made a request and can’t connect recordings to other information about the user.

However, The Guardian’s source said that recordings come with user data showing location, contact details and app data. The information is meant to help verify whether a request was handled successfully.

Apple Watch, HomePod biggest culprits in accidental recordings

Further, The Guardian says that accidental recordings produce some of the most sensitive data, and that the Apple Watch and the HomePod are responsible for most of them. On the Watch, that’s because users can raise the device and begin speaking to activate Siri. On the HomePod, Siri often activates when someone says something similar to the ‘Hey Siri’ wake phrase.

The whistleblower told The Guardian that Watch recordings can be as long as 30 seconds, which can be enough to give a good sense of what’s going on.

“You can definitely hear a doctor and patient, talking about the medical history of the patient,” they said. “Or you’d hear someone, maybe with car engine background noise – you can’t say definitely, but it’s a drug deal… you can definitely hear it happening. And you’d hear, like, people engaging in sexual acts that are accidentally recorded on the Pod or the Watch.”

The other concern with Apple’s contractors is who gets access to the data. Reportedly, the contractors have a high turnover rate, and there is allegedly little vetting of new hires. In theory, a malicious employee could abuse the data.

Worse, Apple doesn’t appear to have procedures for dealing with sensitive recordings. Instead, it encourages staff to report accidental activations as technical problems, with no process for flagging the content of the recordings themselves.

Engadget points out that avoiding these recordings will be difficult as long as Apple and other companies rely on humans to check the quality of their voice assistants. However, Amazon and Google at least offer options to opt out of having recordings reviewed. Siri doesn’t; the only options are to disable the ‘Hey Siri’ hotword or turn off Siri entirely.

None of this necessarily means using Siri compromises your privacy. Rather, it shows that Apple has room to improve in some areas, especially as a company that champions the importance of privacy.

Source: The Guardian

Via: Engadget

