Linked by Thom Holwerda on Fri 19th Apr 2013 14:09 UTC
Apple "All of those questions, messages, and stern commands that people have been whispering to Siri are stored on Apple servers for up to two years, Wired can now report. Yesterday, we raised concerns about some fuzzy disclosures in Siri's privacy policy. After our story ran, Apple spokeswoman Trudy Muller called to explain Apple's policy, something privacy advocates have asking for." Apple cares about your privacy.
RE[4]: caring
by StephenBeDoper on Sat 20th Apr 2013 18:19 UTC in reply to "RE[3]: caring"
StephenBeDoper
Member since:
2005-07-06

"If you care to give a few examples I may be able to agree."


Well, I was being glib (and how often do I get such a perfect opportunity to use my favourite Churchill quote?).

But off the top of my head, Apple does have a general history of starting from fairly hard-line "our way or the highway" positions, and then slowly opening up over time. E.g. the progression from the original "hermetically sealed" Macs to the very-easy-to-service PowerMacs of the mid-90s. Or the progression from the original iPhone's "pre-loaded and web-apps only" to allowing third-party development. Or relaxing some of the app store approval terms, particularly the ones related to cross-platform dev tools.

In this particular situation, not disclosing the retention policy for Siri data in Apple's privacy policies could arguably be a violation of privacy laws. At least in Canada, the privacy laws state that organizations who collect personally-identifiable information must provide a privacy policy that clearly spells out what information is collected, what the information will be used for, and how long it will be retained.

I say "arguably" because it's debatable whether or not the Siri data qualifies as personally-identifiable information.

Reply Parent Score: 3

RE[5]: caring
by MOS6510 on Sat 20th Apr 2013 18:59 in reply to "RE[4]: caring"
MOS6510 Member since:
2011-05-12

Apple adjusts itself and its policies, I guess, like every other company.

Not being able to (easily) service a Mac was something Steve Jobs wanted. When the PowerMacs of the '90s were made, he wasn't around.

One thing I do find strange is that Siri data is anonymized, i.e. not tied to a person, but if that person turns Siri off the data is deleted. So the person and the data must still be linked.

Reply Parent Score: 4

RE[6]: caring
by flypig on Sun 21st Apr 2013 02:22 in reply to "RE[5]: caring"
flypig Member since:
2005-07-13

"One thing I do find strange is that Siri data is anonymized, i.e. not tied to a person, but if that person turns Siri off the data is deleted. So the person and the data must still be linked."


I think you're right. From the text of the article it doesn't make sense to assume the data is anonymised within the first six months (and Apple isn't claiming this either). Here's the statement:

"Apple generates a random numbers to represent the user and it associates the voice files with that number."

This makes no claim of anonymity. My reading would be that all of the voice files from your phone get associated with the same number. It may be that Siri doesn't care who the user is, but it follows straightforwardly that the phone and the voice files remain linkable, should Apple care to link them.

The use of a "random number" in this context is just a diversion. The number is still tied to your phone; it just doesn't happen to be your Apple ID.

If you turn Siri off, only the data that hasn't yet been anonymised is deleted. At least, that would be my reading of the statement:

"If a user turns Siri off, both identifiers are deleted immediately along with any associated data" (my emphasis). In other words, data that's no longer associated with the random number is retained (for up to two years).

Reply Parent Score: 4

RE[6]: caring
by StephenBeDoper on Tue 23rd Apr 2013 23:40 in reply to "RE[5]: caring"
StephenBeDoper Member since:
2005-07-06

"Apple adjusts itself and its policies, I guess, like every other company."


What I think it comes down to is: the more emphatically committed you are to a position, the odder it looks when you reverse that position. And Apple appears to have a much greater emotional attachment to their positions than most corporations do; they're not mere business policies to Apple, they're often treated (or at least perceived) almost as matters of objective right and wrong.

E.g., if Apple were to back down from their stance on jailbreaking, it would be almost impossible to do so without being perceived as weak and/or hypocritical, much the same as if Microsoft were to start releasing, say, Exchange & SharePoint server software for Linux.

"Not being able to (easily) service a Mac was something Steve Jobs wanted. When the PowerMacs of the '90s were made, he wasn't around."


He may have been the root of it, but that attitude clearly went further than just Jobs. Apple didn't immediately start making easily-serviced models after Jobs left (the first time), and they didn't immediately stop making them after he came back.

"One thing I do find strange is that Siri data is anonymized, i.e. not tied to a person, but if that person turns Siri off the data is deleted. So the person and the data must still be linked."


Indeed, that part does seem a bit vague. As others have suggested, it could mean that Apple only deletes the data that hasn't been "anonymized" yet.

Reply Parent Score: 2