August 6, 2019
Apple will no longer allow human contractors to listen in on users’ Siri recordings to “grade” them, and will not resume the program until it has completed a “thorough review.” The company also stated that, as part of a future update, users will be able to opt out of the quality assurance program. Apple had told users that Siri data might be used to improve their experience, but not that humans would be listening to the recordings. Most of the compromised confidential information was captured when Siri was triggered accidentally.
The Guardian, which broke the original story, reports that “contractors working for Apple in Ireland … were sent home for the weekend after being told the system they used for the grading ‘was not working’ globally.” Among the conversations contractors heard were “in-progress drug deals, medical details and people having sex.” The Apple Watch, apparently, was “particularly susceptible to such accidental triggers,” recording up to 30 seconds of audio at a time.
“Too often we see that so-called ‘smart assistants’ are in fact eavesdropping,” said the U.K.’s Big Brother Watch director Silkie Carlo. “Apple’s record on privacy is really slipping. The current iOS does not allow users to opt out of face recognition on photos. And this revelation about Siri means our iPhones were listening to us without our knowledge.”
Amazon and Google also use human contractors for quality assurance of their digital assistants, according to Belgian TV channel VRT and Bloomberg. A leak of more than 1,000 audio clips from Google Home revealed that “at least one in 10 had been captured accidentally,” a fact that prompted Hamburg, Germany’s data protection commissioner to “ban Google from carrying out those checks for three months, citing a likely breach of the general data protection regulation (GDPR).” Google said it had already paused human review of recordings by the time the audio clip leak came to light.
The Hamburg commissioner did not ban Amazon or Apple “because those companies have their German head offices in Munich and are thus covered by a different commissioner under that country’s privacy laws,” but he urged other regulators to “quickly check for other providers of language assistance systems, such as Apple or Amazon, to implement appropriate measures”. In the EU, Amazon is “the only major provider of voice assistant technology still using humans to check recordings.”