Apple said it's suspending its practice of allowing contractors to listen to conversations recorded by Siri.
A recent report from The Guardian revealed that contractors listen to a small fraction of anonymized conversations in order to quality-check Siri's ability to identify and answer questions. Apple's privacy agreement does disclose that it uses your data to improve Siri, but it doesn't say that humans listen to some recordings.
The report said Apple contractors regularly heard confidential or private information — including medical details, drug deals and couples having sex.
Google and Amazon also allow employees to sample conversations in order to quality-check their voice assistants. But both let you review and delete your recordings, and Amazon lets you opt out entirely.
Now, Apple says it's suspending the program while it "conduct[s] a thorough review." It also said users will be able to opt out of the quality checks in a future software update.