Siri—and the Humans Behind Her—Might Have Heard Some of Your Most Private Moments

SIphotography/iStock via Getty Images

An Apple contractor has come forward with alarming—yet not altogether surprising—information about just how much of your life Siri and her human referees may have heard. The anonymous whistleblower told The Guardian that “there have been countless instances of recordings featuring private discussions between doctors and patients, business deals, seemingly criminal dealings, sexual encounters, and so on.”

Here’s how that happens. Apple sends a small percentage (less than 1 percent) of Siri activations to outside contractors, who listen to the recordings to judge whether the activation was deliberate or accidental, whether the request was something Siri could fulfill, and whether the virtual assistant responded appropriately. This “grading” process is designed to help Apple developers with quality control and improvement, but according to The Guardian, the tech titan doesn’t actually disclose to consumers that humans might be listening to their interactions. Even some of Siri’s banter muddles the truth; for example, the system responds to the question “Are you always listening?” with “I only listen when you’re talking to me.”
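Neither Apple nor The Guardian describes the mechanics of that sampling, but conceptually it amounts to drawing a small random fraction of activations and attaching a human grade to each. Here is a minimal sketch in Python; the sampling rate, the GradedClip fields, and the function names are illustrative assumptions, not Apple’s actual pipeline.

```python
import random
from dataclasses import dataclass

# Illustrative sampling rate: "less than 1 percent," per the article.
SAMPLE_RATE = 0.01

@dataclass
class GradedClip:
    """Hypothetical record a human grader fills in for one sampled clip."""
    clip_id: str
    intentional: bool      # was the activation on purpose or accidental?
    fulfillable: bool      # was the request something Siri could handle?
    responded_well: bool   # did the assistant answer appropriately?

def sampled_for_grading() -> bool:
    """Randomly route a small fraction of activations to human review."""
    return random.random() < SAMPLE_RATE

# Roughly 1 in 100 activations would land in a contractor's queue.
queue = [f"clip-{n}" for n in range(10_000) if sampled_for_grading()]
print(f"{len(queue)} of 10,000 activations sampled")

# A grader then answers the three questions the article describes.
if queue:
    example = GradedClip(queue[0], intentional=False, fulfillable=False,
                         responded_well=False)
    print(example)
```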

An Apple representative told The Guardian that the recordings aren’t grouped with other recordings from the same user, and they’re not linked to your Apple ID. But The Guardian’s source explained that recordings can include addresses, names, and other personal information that would make it relatively simple to track down a user if you wanted to. And, since Apple uses independent contractors for this work, “there’s not much vetting of who works there, and the amount of data that we’re free to look through is quite broad … It’s not like people are being encouraged to have consideration for people’s privacy.”

So many sensitive conversations are captured by Siri in the first place because the virtual assistant is easily activated by accident. It starts recording whenever it registers the phrase “Hey Siri,” or anything that remotely resembles it, sometimes even just the sound of a zipper. The contractor singled out the Apple Watch and the HomePod smart speaker as the most frequent sources of accidental recordings, as the toy example below illustrates.
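That kind of false trigger is the classic failure mode of any wake-word detector: the device continuously scores incoming audio against the trigger phrase and starts recording whenever the score clears a threshold, so acoustically similar sounds can slip through. In this toy illustration, the detector, the scores, and the threshold are all invented for the example; real systems use trained acoustic models.

```python
# Toy wake-word gate: every value here is hypothetical.
THRESHOLD = 0.8

def similarity_to_wake_phrase(audio_label: str) -> float:
    """Stand-in for an acoustic model's confidence that the audio
    contained "Hey Siri". Scores are hard-coded for the demo."""
    scores = {
        "hey siri": 0.97,
        "hey seriously": 0.85,   # near-homophone sneaks past the gate
        "zipper sound": 0.81,    # acoustically similar noise
        "normal speech": 0.12,
    }
    return scores.get(audio_label, 0.0)

for clip in ["hey siri", "hey seriously", "zipper sound", "normal speech"]:
    triggered = similarity_to_wake_phrase(clip) >= THRESHOLD
    print(f"{clip!r}: {'recording starts' if triggered else 'ignored'}")
```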

The good news is that Apple now seems to understand how much this news has probably freaked you out. The Guardian reported today that Apple has suspended its grading program indefinitely while it conducts a review. “We are committed to delivering a great Siri experience while protecting user privacy,” the company said in a statement. “As part of a future software update, users will have the ability to choose to participate in grading.”