Apple sends home workers who listened to intimate Siri recordings and apologises for privacy breach

Contractors often overheard confidential encounters, including drug deals and people having sex

Anthony Cuthbertson
Wednesday 28 August 2019 21:13 BST
Apple and Google workers reported 'regularly' hearing private recordings

Apple has sent home hundreds of workers across Europe who listened to Siri recordings, according to a report.

The move comes after Apple suspended a "grading" program that employed contractors to evaluate customers' audio recordings.

Previous reports revealed that the workers often overheard confidential encounters, including drug deals and people having sex.

More than 300 workers in Cork alone were fired as part of the cull, according to The Guardian, with each given just one week's notice. The Independent has reached out to Apple for comment.

In a blog post today, Apple said it was in the process of reviewing its Siri audio program and would reintroduce it later this year as part of a software update.

"We know that customers have been concerned by recent reports of people listening to audio Siri recordings as part of our Siri quality evaluation process - which we call grading," Apple said.

"We heard their concerns, immediately suspended human grading of Siri requests and began a thorough review of our practices and policies... As a result of our review, we realise we haven't been fully living up to our high ideals, and for that we apologise."

Apple was not alone in gathering and analysing recordings made to its virtual assistant, with Google and Amazon both admitting to the practice in recent months.

Google said it would stop contractors from listening to and transcribing conversations captured by its Google Assistant, while Amazon announced that customers could opt out of a similar program designed to improve the functionality of its Alexa smart assistant.

The assistants are built to allow users to make hands-free calls, send messages and search the internet using voice commands alone.

All can be voice activated, meaning their inbuilt microphones are constantly listening for keywords that will trigger the assistant, such as "hey Siri".

If the wake word is triggered accidentally, for example by a similar-sounding word like "Syria", then audio could be recorded and sent to workers to be analysed.

In some cases, this resulted in intimate moments being reviewed by Apple contractors. One whistleblower described "countless instances of recordings featuring private discussions between doctors and patients, business deals, seemingly criminal dealings, sexual encounters and so on."

All recordings were reportedly accompanied by user data showing location, contact details, and app data.

Changes to Apple's Siri privacy policy include allowing users to opt in to the grading process, being able to remove inadvertent recordings, and disassociating a user's device data from the recordings after six months.
