Linked by Thom Holwerda on Fri 19th Apr 2013 14:09 UTC
Apple "All of those questions, messages, and stern commands that people have been whispering to Siri are stored on Apple servers for up to two years, Wired can now report. Yesterday, we raised concerns about some fuzzy disclosures in Siri's privacy policy. After our story ran, Apple spokeswoman Trudy Muller called to explain Apple's policy, something privacy advocates have asking for." Apple cares about your privacy.
Thread beginning with comment 559366
RE[5]: caring
by MOS6510 on Sat 20th Apr 2013 18:59 UTC in reply to "RE[4]: caring"
MOS6510
Member since:
2011-05-12

Apple adjusts itself and its policies, I guess, like every other company.

Not being able to (easily) service a Mac was something Steve Jobs wanted. When the PowerMacs of the '90s were made he wasn't around.

One thing I do find strange is that Siri data is anonymized, i.e. not tied to a person, but if that person turns Siri off the data is deleted. So person and data were still linked.

Reply Parent Score: 4

RE[6]: caring
by flypig on Sun 21st Apr 2013 02:22 in reply to "RE[5]: caring"
flypig Member since:
2005-07-13

One thing I do find strange is that Siri data is anonymized, i.e. not tied to a person, but if that person turns Siri off the data is deleted. So person and data were still linked.


I think you're right. From the text of the article it doesn't make sense to assume the data is anonymised within the first six months (and Apple isn't claiming this either). Here's the statement:

"Apple generates a random numbers to represent the user and it associates the voice files with that number."

This makes no claim to anonymity. My reading of this would be that all of the voice files from your phone get associated with the same number. It may be that Siri doesn't care who the user is, but it follows straightforwardly that the phone and the data files are linkable if Apple cared to do so.

The use of a "random number" in this context is just a diversion. The number is still tied to your phone; it just doesn't happen to be your Apple ID.

If you turn Siri off, only the data that hasn't been anonymised is deleted. At least, this would be my reading of this:

"If a user turns Siri off, both identifiers are deleted immediately along with any associated data" (my emphasis). In other words, data that's no longer associated with the random number is retained (for up to two years).

Reply Parent Score: 4

RE[7]: caring
by MOS6510 on Sun 21st Apr 2013 07:09 in reply to "RE[6]: caring"
MOS6510 Member since:
2011-05-12

In that case I think the data is assigned a random ID, so anyone stealing the data wouldn't know who is who. However, Apple does keep a database that links users to these random numbers, so Apple (and anyone who can access that database) can link the random IDs back to real people.
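
A toy example of what I mean (entirely hypothetical names and data, nothing taken from the article):

```python
# Entirely hypothetical -- just to show why a "random ID" by itself isn't
# anonymity as long as a user -> ID mapping exists somewhere.
user_to_random_id = {
    "alice@example.com": "3f9c01",
    "bob@example.com": "a17d22",
}
clips_by_random_id = {
    "3f9c01": ["clip_0001.wav"],
    "a17d22": ["clip_0002.wav"],
}

# Someone who steals only clips_by_random_id sees opaque numbers.
# Anyone who also holds the mapping can join the two back together:
random_id_to_user = {rid: user for user, rid in user_to_random_id.items()}
for rid, clips in clips_by_random_id.items():
    print(random_id_to_user.get(rid, "unknown"), clips)
```

So "random" really just means "pseudonymous": the number hides who you are from an outsider, but not from whoever holds the mapping.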

Anyway, I don't think these voice clips would yield much interesting data for hackers and evil governments. Certainly not compared to many other sources of information.

If you send info via Siri using email or iMessage, those messages can be found outside Siri's voice vault anyway. Web searches done via Siri are also logged by the search provider and the ISP.

Why would you, if you're a privacy extremist, use voice to operate a device when that means other people could hear you?

Reply Parent Score: 2

RE[6]: caring
by StephenBeDoper on Tue 23rd Apr 2013 23:40 in reply to "RE[5]: caring"
StephenBeDoper Member since:
2005-07-06

Apple adjusts itself and its policies, I guess, like every other company.


What I think it comes down to is: the more emphatically committed you are to a position, the odder it looks when you reverse that position. And Apple appears to have a much greater emotional attachment to their positions than most corporations do; they're not mere business policies to Apple, they're often treated (or at least perceived) almost as matters of objective right and wrong.

E.g., if Apple were to back down from their stance on jailbreaking, it would be almost impossible to do so without being perceived as weak and/or hypocritical. Just the same as if Microsoft were to start releasing, say, Exchange & SharePoint server software for Linux.

Not being able to (easily) service a Mac was something Steve Jobs wanted. When the PowerMacs of the '90s were made he wasn't around.


He may have been the root of it, but that attitude clearly went further than just Jobs. Apple didn't immediately start making easily-serviced models after Jobs left (the first time), and they didn't immediately stop making them after he came back.

One thing I do find strange is that Siri data is anonymized, i.e. not tied to a person, but if that person turns Siri off the data is deleted. So person and data were still linked.


Indeed, that part does seem a bit vague. As others have suggested, it could mean that Apple only deletes the data that hasn't been "anonymized" yet.

Reply Parent Score: 2