
Google, Amazon make changes after digital assistant human review controversy

Google temporarily halted human reviews of recordings in the E.U. and Amazon offered new opt-out options

Amazon Alexa and Google Assistant apps

Google and Amazon are following Apple and adjusting how both companies handle human reviews of voice assistant quality.

Amazon and Google use humans to review short recordings from their digital assistant, smart speaker and smart display products. However, after a contractor at a firm hired by Apple to do the same for its Siri platform revealed the privacy risks of the practice, all three companies moved to improve their privacy policies or halt the practice entirely.

In Apple’s case, the company stressed that less than one percent of daily Siri activations were sent for ‘grading,’ the company’s term for its human review process, which is meant to assess whether Siri activations were accurate and answers were correct. Despite Apple’s efforts to anonymize where the recordings came from, contractors often heard sensitive and identifying information, especially when users invoked Siri by accident. Contractors reported hearing medical and financial information, people having sex, drug deals and more.

The Cupertino-based company responded to concerns by halting the grading process while it performed a review. Additionally, Apple said a future update would let users opt out of grading. Considering that, aside from a complicated workaround, the only way to stop recordings from going to Apple was to disable Siri on an iOS device altogether, the new controls will be a welcome addition.

Google and Amazon follow suit

Shortly after, Google agreed to stop listening to and transcribing Google Assistant recordings for three months in Europe.

The news came via German regulators, after the country’s data protection commissioner said his office would investigate reports that contractors listened to audio captured by Google Assistant.

Amazon has also moved to address the issue of human reviews, but instead of pausing the process, the Seattle, Washington-based company chose to improve its policies and opt-out settings.

Amazon’s Alexa platform already provided a more transparent set of privacy policies than either Google or Apple. Now, the company has updated the language on its privacy portal and clarified what different settings would do.

Alexa users can open the smartphone app, go to the settings menu, then open ‘Alexa Privacy’ and ‘Manage How Your Data Improves Alexa’ to access the portal. It now says that “your voice recordings may be used to develop new features and manually reviewed to help improve our services.” Alternatively, users can access the same settings in their web browser.

The most significant change is that the previous setting only stopped recordings from being used to develop new features. Now, checking the box means recordings won’t be reviewed by humans at all.

However, recordings are still uploaded to Amazon’s servers. Users can delete them at any time from their Alexa settings, but the new privacy options don’t include a setting to turn off the storage of recordings altogether.

While each company has taken a different approach to handling the issue, hopefully all three arrive at the same result: clearly defined, transparent explanations of how and why data is collected, along with options to opt out if you don’t want your voice stored by these companies.

Source: The Verge, (2)

