Linked by Thom Holwerda on Fri 19th Apr 2013 14:09 UTC
Apple "All of those questions, messages, and stern commands that people have been whispering to Siri are stored on Apple servers for up to two years, Wired can now report. Yesterday, we raised concerns about some fuzzy disclosures in Siri's privacy policy. After our story ran, Apple spokeswoman Trudy Muller called to explain Apple's policy, something privacy advocates have asking for." Apple cares about your privacy.
Thread beginning with comment 559386
RE[7]: caring
by MOS6510 on Sun 21st Apr 2013 07:09 UTC in reply to "RE[6]: caring"
MOS6510 Member since: 2011-05-12

Then I think the data is assigned a random ID, so anyone stealing the data wouldn't know who is who. However, Apple does keep a database that links users to these random numbers, so Apple (and anyone who can access that database) can link the random IDs back to real people.
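
A rough sketch of that "random ID plus linking database" pattern, purely illustrative (the names and structures are made up, not anything from Apple's actual systems):

import uuid

# Hypothetical pseudonymised storage: clips are filed under random IDs,
# but a separate directory still maps accounts to those IDs.
voice_clips = {}    # random_id -> list of transcripts
id_directory = {}   # user account -> random_id (the linking database)

def store_clip(account, transcript):
    """File a clip under the account's random ID, minting one if needed."""
    random_id = id_directory.setdefault(account, str(uuid.uuid4()))
    voice_clips.setdefault(random_id, []).append(transcript)
    return random_id

def re_identify(random_id):
    """Anyone holding the directory can walk it backwards to a real account."""
    for account, rid in id_directory.items():
        if rid == random_id:
            return account
    return None

store_clip("alice@example.com", "remind me to call mum at six")
leaked_id = id_directory["alice@example.com"]
print(re_identify(leaked_id))  # only possible with access to id_directory

Whoever steals only voice_clips sees random IDs; whoever also gets id_directory gets names.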

Anyway, I don't think these voice clips would yield much interesting data for hackers and evil governments. Certainly not compared to many other sources of information.

If you send info via Siri using email or iMessage, those messages can be found outside Siri's voice vault anyway. Web searches done via Siri are also logged by the search provider and the ISP.

Why would you, if you're a privacy extremist, use voice to operate a device when that means other people could hear you?

Reply Parent Score: 2

RE[8]: caring
by flypig on Sun 21st Apr 2013 09:10 in reply to "RE[7]: caring"
flypig Member since: 2005-07-13

Then I think the data is assigned a random ID, so anyone stealing the data wouldn't know who is who. However, Apple does keep a database that links users to these random numbers, so Apple (and anyone who can access that database) can link the random IDs back to real people.


Yes, I'd entirely agree with you about that. I guess, for privacy extremists, the important point is that law enforcement could do this, and in many countries probably wouldn't need a warrant to do so.

Anyway, I don't think these voice clips would yield much interesting data for hackers and evil governments. Certainly not compared to many other sources of information.


In the UK (and certainly many other countries too), companies are required by law to provide any personally identifiable information they hold about you if you request it. It would be a really interesting exercise to make such a request and see what someone could infer from the result. Unfortunately I don't use Siri, so I can't test this myself.

Why would you, if you're a privacy extremist, use voice to operate a device when that means other people could hear you?


I guess privacy extremists want good functionality too? At the moment Siri is optional, but in the future this might be how all interaction is performed?

As a privacy extremist, the question I'd be asking is "why is Apple collecting this data in an identifiable form, given there's no technical benefit to them doing so?". I can understand the technical benefit of storing unidentifiable voice recordings for future testing (although how you completely anonymise what might normally be considered a biometric identifier is an interesting question), but why not remove the linkability from the outset? Personally I'd argue (and maybe this does make me a privacy extremist!) that any benefit to me from them doing this is worth less than the privacy I lose as a result.
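
To make the "remove the linkability from the outset" point concrete, here is a toy contrast with the earlier sketch. It's again hypothetical, and it sidesteps the harder problem that the recording itself is effectively a biometric identifier:

import uuid

retained_clips = []  # what the provider keeps for quality/testing work

def store_unlinkably(transcript):
    """Mint a fresh throwaway ID per clip and never record which account it
    came from, so no linking database can exist in the first place."""
    retained_clips.append({"clip_id": str(uuid.uuid4()), "transcript": transcript})

store_unlinkably("remind me to call mum at six")
store_unlinkably("what's the weather in Leeds")
# Even with full access to retained_clips there is no table to join back to
# users, though the audio itself may still identify the speaker.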

The same is true of Google, Microsoft, Canonical, etc., incidentally.

Reply Parent Score: 3

RE[9]: caring
by MOS6510 on Sun 21st Apr 2013 10:10 in reply to "RE[8]: caring"
MOS6510 Member since: 2011-05-12

The data is collected by Apple in the first place because Siri commands are processed on Apple servers, not locally on your iPhone.

The only reason I can think of for them holding on to this data is to see which commands failed, why they failed, and how to improve Siri. A better-working Siri sells more iPhones.
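
As a toy illustration of the sort of improvement work that kind of retention would enable (hypothetical data and fields, nothing Apple has documented):

from collections import Counter

# Hypothetical retained Siri requests, tagged with whether they were handled.
retained_requests = [
    {"text": "set a timer for ten minutes", "handled": True},
    {"text": "remind me when I'm near the chemist", "handled": False},
    {"text": "remind me when I'm near the chemist", "handled": False},
]

failures = Counter(r["text"] for r in retained_requests if not r["handled"])
for phrase, count in failures.most_common():
    print(f"{count}x failed: {phrase!r}")  # what to teach Siri next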

Apple sells Apple products and sells media/apps via the iTunes Store. Siri info can't sell more apps or music/movies, but it can improve Siri.

Google tries to get as much info about you as it can and link it all up. If Siri were owned by Google, it would be far more serious.

Even if Siri is not a serious privacy risk, it seems Apple protects the data well enough. Besides, Siri is optional and you can easily turn it off.

I think voice control will improve in the future, but you'll still mostly use your fingers. Talking to a computer or phone all day is not pleasant and is very annoying for people nearby.

Reply Parent Score: 2