Apple and Google recently revealed that they have temporarily stopped letting contractors listen to recordings from their voice assistants. The announcements come after reports that some of these hired hands heard Siri users doing things like having sex or discussing private medical information. Amazon has not announced a pause in its process of letting humans listen to Alexa recordings, and that’s not a big surprise. Humans are bound to be a part of this process.
We’ve known for months that third-party contractors have been listening to voice assistant recordings. Bloomberg first reported on how Amazon employs thousands of humans worldwide to transcribe and review Alexa recordings in order to improve the technology. The same report revealed that Apple and Google had similar teams doing similar things. We learned more details about those companies’ human review process after a Google contractor leaked scores of Assistant recordings to the press, and we learned last week that a similar process happens with Apple’s Siri recordings.
So props to Apple and Google for responding to the collective outrage over countless violations of their users’ privacy, right? Not so fast. Apple has so far taken the strongest position by temporarily pausing human review of all Siri recordings worldwide. “While we conduct a thorough review, we are suspending Siri grading globally,” an Apple spokesperson said in a statement. “Additionally, as part of a future software update, users will have the ability to choose to participate in grading.”
It’s worth highlighting that this opt-in strategy is unique to Apple and its approach to voice assistants. Google and Amazon have historically made heavy data collection and transcription review the default for Assistant and Alexa. To opt out, users have had to go through an annoying, if not downright difficult, process of digging through privacy settings. And that also required users to realize that their data was being collected and that humans might be reviewing their voice assistant recordings, since Google and Amazon aren’t exactly transparent about what happens to all the things Assistant and Alexa record and store.
It’s also essential to highlight how imperfect these technologies still are. Voice assistants aren’t supposed to record users without the user intentionally initiating recording through a wake word. However, anyone who’s used a voice assistant knows that it’s not uncommon for the computer to screw up and think it should be listening. This is how Siri accidentally recorded a couple having sex. The human review process is actually supposed to improve these technologies so that they screw up less. However, Apple, Google, and Amazon kept secret the possibility that a human could listen to users’ recordings until whistleblowers spoke out about the practice earlier this year. All of these companies say that only a very small number of recordings are reviewed by humans, but even the slightest chance that some random stranger will hear you having sex is unnerving, to say the least.
There’s so far no strong indication that humans reviewing voice assistant recordings will cease permanently. This week, Google also said it paused its human review process, but it hasn’t offered any details about potential changes. Notably, Google only revealed the specific details of pausing the practice after the recent backlash.
“Shortly after we learned about the leaking of confidential Dutch audio data, we paused language reviews of the Assistant to investigate,” a Google spokesperson told Gizmodo. “This paused reviews globally.”
Who knows if Google’s investigation will change how Assistant works, especially with regard to privacy. You can currently opt out of storing audio recordings in your Google account settings, or choose to have your recordings automatically deleted every three months or every 18 months. There is not an option to delete your recordings more frequently. And since we’re on the topic, you can actually delete a lot of the data Google has collected about you. Here’s a handy guide.
Amazon is a different story. So far, the company has not announced any changes to how it handles Alexa recordings, which means we’re left to assume that Amazon contractors are still listening. Amazon has also historically given users the least amount of privacy protection. Although you can opt out of letting Amazon use your voice recordings to develop new features and improve transcriptions, you cannot completely opt out of letting Amazon retain your voice recordings for other purposes.
We reached out to Amazon for comment on the latest voice assistant controversy, and we’ll update this post if the company responds. Heck, we’ll write a whole new post if Amazon announces meaningful changes to Alexa and its handling of user privacy.
For now, it’s hard to guess how these latest revelations will change how voice assistants work, but the reinstatement of human review does seem inevitable. As Gizmodo has previously reported, the artificially intelligent software that powers voice assistants just isn’t sophisticated enough to work well without some human intervention. Humans still need to review certain sets of voice recordings in order to improve the technology’s natural language processing and to reduce the algorithmic bias that lingers from the machine-learning models’ initial training data sets. Without proper training, voice assistants will be less useful.
Trading a little bit of privacy for a better product is an old but increasingly tough proposition. If you want to ask Alexa about the weather, Alexa needs to understand what you’re saying, and it needs to know where you are. You, the user, are also well within your rights to demand that Alexa only listen to you when you wake up the computer. It’s appalling that some voice assistants record people having sex and that random humans then listen to those recordings. Yet, here we are.
Update 6:40 pm: An Amazon spokesperson sent us the following statement:
“We take customer privacy seriously and continuously review our practices and procedures. For Alexa, we already offer customers the ability to opt-out of having their voice recordings used to help develop new Alexa features. The voice recordings from customers who use this opt-out are also excluded from our supervised learning workflows that involve manual review of an extremely small sample of Alexa requests. We’ll also be updating information we provide to customers to make our practices more clear.”