Apple announced today that it's conducting a review of its Siri grading program and, while it's doing so, it is suspending this quality control effort. The company also said that it plans to allow users to opt out of the grading program as part of a future update.
Here's Apple's full statement to TechCrunch:
"We are committed to delivering a great Siri experience while protecting user privacy. While we conduct a thorough review, we are suspending Siri grading globally. Additionally, as part of a future software update, users will have the ability to choose to participate in grading."
A recent report from The Guardian said that Apple contracts workers around the world to listen to Siri user recordings and grade the responses, including whether Siri was activated deliberately or by accident and whether its response was appropriate. The report said that these contractors regularly hear things like medical information, people having sex, and drug deals due to accidental activations of Siri, adding that they were concerned about a lack of disclosure.
Following that report, Apple confirmed that less than 1 percent of daily Siri activations are used for grading and that these clips are usually only a few seconds long. Apple also said that user requests aren't associated with a user's Apple ID and that reviewers are "under the obligation to adhere to Apple's strict confidentiality requirements."
We've heard of Amazon and Google workers listening to and analyzing voice assistant requests, so it's not a huge surprise to hear that Apple has people doing the same thing. What makes this a big deal is that some feel Apple didn't make it clear in its terms of service that people could be listening to Siri request recordings. Apple has also made privacy a focus of its advertising lately, making this news of humans listening to Siri requests an even bigger deal.