In a turn of events that, at this point, probably shocks no one, Microsoft has confessed that human contractors have been listening in on some people's private conversations via Skype and Cortana.
Vice first broke the news on the privacy breach earlier this month, after a contractor passed along documents, screenshots, and actual audio files of some conversations. “The fact that I can even share some of this with you shows how lax things are in terms of protecting user data,” the contractor, whose name was withheld (for obvious reasons), told Motherboard, Vice’s tech outlet. Unlike Apple’s recent Siri snafu, these conversations didn’t include potential criminal activity, but they did catch intimate exchanges about weight loss, love, and relationship problems.
Also unlike Apple: Microsoft is not suspending its practices. Instead, the tech monolith has updated its privacy policy to clarify that humans might, in fact, be eavesdropping on you.
“We realized, based on questions raised recently, that we could do a better job of clarifying that humans sometimes review this content,” a Microsoft representative told Vice. Before, the Skype website had mentioned that your content could be analyzed in order to improve the technology, but it never explicitly stated that humans would be listening to it.
Microsoft only records Skype conversations that use its translation features, in order to “help the translation and speech technology learn and grow,” according to the Skype FAQ section. If you’re not using translation features, your sweet nothings are reportedly as private as you want them to be. The updated FAQ section also now states that “Microsoft employees and vendors” may be transcribing the translated audio, and the procedures are “designed to protect users’ privacy, including taking steps to de-identify data, requiring non-disclosure agreements with vendors and their employees,” and more.
But Cortana’s data gathering isn’t limited to translation. According to its support page, Microsoft can collect your voice data literally any time you “use your voice to say something to Cortana or invoke skills.” If that worries you, we recommend spending some time adjusting the settings on your Microsoft Privacy Dashboard.