Apple cleans up Siri's privacy problems, screwing over workers in the process

Good news everybody: Apple's really sorry about recording our conversations with Siri. In a statement issued earlier today, the company's talking heads admitted that they '...haven’t been fully living up to our high ideals'. The letter goes on to say that, to make up for their eavesdropping shenanigans, Apple's going to be making a few changes to how Siri does its thing.

From Apple:

First, by default, we will no longer retain audio recordings of Siri interactions. We will continue to use computer-generated transcripts to help Siri improve.

Second, users will be able to opt in to help Siri improve by learning from the audio samples of their requests. We hope that many people will choose to help Siri get better, knowing that Apple respects their data and has strong privacy controls in place. Those who choose to participate will be able to opt out at any time.

Third, when customers opt in, only Apple employees will be allowed to listen to audio samples of the Siri interactions. Our team will work to delete any recording which is determined to be an inadvertent trigger of Siri.

This, of course, is great news for anyone who uses Apple's Siri voice assistant. Unfortunately, the fact that fewer people will be needed to snoop on the conversations between the company's customers and their tech likely means that some resources will need to be shifted around in order to accommoda—wait, what?

From The Guardian:

Hundreds of Apple workers across Europe who were employed to check Siri recordings for errors have lost their jobs after the company announced it was suspending the programme earlier this month.


Apple's contractors also listening to private conversations

Android apps are tracking your every move. Amazon is watching and listening. Google's watching you watch porn. Facebook is up in all of our shit, all of the time. Perhaps it shouldn't come as any surprise that Apple, a company that's been flogging user privacy as one of the greatest selling points of its mobile devices, is listening in on many of its customers as well.

From The Verge:

Apple is paying contractors to listen to recorded Siri conversations, according to a new report from The Guardian, with a former contractor revealing that workers have heard accidental recordings of users’ personal lives, including doctor’s appointments, addresses, and even possible drug deals.

According to that contractor, Siri interactions are sent to workers, who listen to the recording and are asked to grade it for a variety of factors, like whether the request was intentional or a false positive that accidentally triggered Siri, or if the response was helpful.

According to The Verge, Apple admitted to The Guardian (I'd love to quote this stuff directly, but European copyright laws yadda yadda) that a 'small number' of user interactions with Siri are analyzed to improve the virtual assistant and to buff up the dictation abilities of Apple's various operating systems. They also note that less than 1% of all user interactions are analyzed in this manner and claim that, when they do pick through our private conversations, the audio they're focusing on has no user information attached to it.