Linked by Thom Holwerda on Fri 19th Apr 2013 14:09 UTC
Apple "All of those questions, messages, and stern commands that people have been whispering to Siri are stored on Apple servers for up to two years, Wired can now report. Yesterday, we raised concerns about some fuzzy disclosures in Siri's privacy policy. After our story ran, Apple spokeswoman Trudy Muller called to explain Apple's policy, something privacy advocates have been asking for." Apple cares about your privacy.
Permalink for comment 559387
RE[8]: caring
by flypig on Sun 21st Apr 2013 09:10 UTC in reply to "RE[7]: caring"

Then I think the data is assigned a random ID, thus anyone stealing the data wouldn't know who is who. However Apple does keep a database that links users to these random numbers. So Apple (and anyone who can access this database) can link random IDs back to real people.

Yes, I'd entirely agree with you about that. I guess, for privacy extremists, the important point is that law enforcement could do this, and in many countries probably wouldn't need a warrant to do so.
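The scheme described above can be sketched in a few lines. This is purely an illustration of random-ID pseudonymisation with a separate linking table, under invented names; it is not Apple's actual implementation, which has never been published.

```python
import secrets

# Two separate stores, per the scheme described in the parent comment
# (names and structure are hypothetical):
id_to_user = {}   # the linking database the operator keeps
recordings = {}   # pseudonymised store: random ID -> voice clips

def store_clip(user, clip):
    """File a clip under a fresh random ID; record the link separately."""
    rand_id = secrets.token_hex(8)
    id_to_user[rand_id] = user
    recordings.setdefault(rand_id, []).append(clip)
    return rand_id

rand_id = store_clip("alice", "What's the weather?")

# Someone who steals only `recordings` sees opaque random IDs...
print(rand_id in recordings)          # True, but no name attached
# ...yet anyone holding the linking table can re-identify the speaker.
print(id_to_user[rand_id])            # "alice"
```

The point of the sketch is that the "anonymity" lives entirely in keeping `id_to_user` secret: anyone who can read that table, lawfully or otherwise, collapses the pseudonyms back to real people.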

Anyway, I don't think these voice clips would yield much interesting data for hackers and evil governments. Certainly not compared to many other sources of information.

In the UK (and certainly many other countries too) companies are required by law to provide any personally identifiable information held about you if you request it. It would be a really interesting exercise to request this to get an idea of what someone could infer from it. Unfortunately I don't use Siri, so I can't test this myself.

Why would you, if you're a privacy extremist, use voice to operate a device when that means other people could hear you?

I guess privacy extremists want to have good functionality too? At the moment Siri is optional, but in the future this might be how all interactions are performed?

As a privacy extremist, the question I'd be asking is "why is Apple collecting this data, given there's no technical benefit to them doing so?". I can understand the technical benefit to them storing unidentifiable voice recordings for future testing (although how you completely anonymise what might normally be considered a biometric identifier is an interesting question), but why not remove the linkability from the outset? Personally I'd argue (and maybe this does make me a privacy extremist!) that any benefit to me of them doing this is worth less than the privacy lost to me as a result.

The same is true of Google, Microsoft, Canonical, etc. incidentally.

Score: 3