In the wake of a report that revealed how Apple uses Siri recordings for a quality control process called ‘grading,’ the Cupertino, California-based company is suspending the program.
Along with shutting down the grading process worldwide, Apple said it would review the program. Grading was used to help determine if Siri heard queries correctly or if users activated the digital assistant accidentally.
On top of that, Apple plans to issue a future software update to give Siri users a choice to participate in grading.
Concerns stemmed from a damning report in which a contractor at a firm hired by Apple to perform Siri grading revealed how that process works. Essentially, contractors review snippets of Siri recordings and assess if Siri is accurately hearing them or activating by mistake.
Apple claims less than one percent of daily Siri requests get sent for grading, and also says recordings aren’t matched with names or Apple IDs. However, the contractor revealed that the recordings often contain sensitive or identifiable information.
Specifically, the contractor said that recordings include medical information, financial information, people having sex, drug deals and more. Even if Apple anonymizes the source of the recordings, contractors could still identify people in the audio snippets. Worse, a malicious employee could potentially abuse the data.
Opting out of sending Siri data to Apple
The report also highlighted that Apple doesn’t offer a way for users to opt out. Using humans to assess audio snippets and perform quality control isn’t new in the digital assistant business: both Amazon and Google do the same thing. However, both companies give users the choice not to participate in the process.
While Apple said it plans to provide this option, currently the only way to opt out of Siri grading is to disable Siri altogether — an unnecessarily complicated process.
According to a guide posted by The Verge, you need to navigate to the Settings app, then ‘Siri & Search.’ From there, turn off both methods for activating Siri: ‘Listen for “Hey Siri”’ and ‘Press Side Button for Siri.’
Once you turn those off, you’ll get a warning that there’s one more step to take to delete your data from Apple’s servers. Go to Settings, then ‘General’ and ‘Keyboard,’ and turn off the Dictation function. You’ll get a final warning that if you ever want to use these features again, you’ll have to re-upload some data to Apple’s servers.
Complexity aside, the problem is that to opt out of data collection, you have to stop using the feature entirely. There are ways around this, but for users who aren’t tech-savvy, it’s more trouble than it’s worth.
It’s good that Apple will add a way to opt out of data collection for Siri, but it raises the question of why that wasn’t an option from the start. Apple routinely boasts about how private the iPhone is — what happens on your iPhone stays on your iPhone. But for a company making privacy the core of its business, it needs to do better.