Linked by Thom Holwerda on Wed 11th Sep 2013 22:16 UTC
Apple

Apple's new iPhone 5S, which comes with a fingerprint scanner, won't store actual images of users' fingerprints on the device, a company spokesman confirmed Wednesday, a decision that could ease concerns from privacy hawks.

Rather, Apple's new Touch ID system only stores "fingerprint data", which remains encrypted within the iPhone's processor, a company representative said Wednesday. The phone then uses the digital signature to unlock itself or make purchases in Apple's iTunes, iBooks or App stores.

In practice, this means that even if someone cracked an iPhone's encrypted chip, they likely wouldn't be able to reverse engineer someone's fingerprint.

This seems relatively safe - but then again, only if you trust that government agencies don't have some sort of backdoor access anyway. This used to be tinfoil hat stuff, but those days are long gone.

I dislike the characterisation of privacy "hawks", though. It reminds me of how warmongering politicians in Washington are referred to as "hawks", and at least in my view, it has a very negative connotation.

Permalink for comment 571933
RE[3]: Not an image. Ok...
by galvanash on Thu 12th Sep 2013 14:29 UTC in reply to "RE[2]: Not an image. Ok..."
galvanash
Member since:
2006-01-25

What you define as "doing it right" is actually "doing it absolutely wrong": if the system can't be used to determine the actual correct fingerprint (the owner's), then it is useless.


Whether it is a password, a fingerprint, a time-based key (Google Authenticator), etc. - it doesn't matter. The authentication system's job is not to know your credentials, and if it does actually know your credentials, it is simply not built responsibly. The authentication system only has to determine that you know your credentials, and there are very well established ways to do that without ever storing them.
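To illustrate the point about never storing credentials: a minimal Python sketch of the standard salted-hash approach (this is a generic pattern, not anything Apple has described - the function names and parameters here are my own):

```python
# Sketch only: verify that a user knows a credential without ever
# storing the credential itself. Only a random salt and a derived
# hash are persisted; the secret cannot be recovered from them.
import hashlib
import hmac
import os

def enroll(secret: bytes) -> tuple[bytes, bytes]:
    """Derive and store a salted hash of the secret, never the secret."""
    salt = os.urandom(16)
    digest = hashlib.pbkdf2_hmac("sha256", secret, salt, 100_000)
    return salt, digest

def verify(secret: bytes, salt: bytes, digest: bytes) -> bool:
    """Recompute the hash from the offered secret and compare in constant time."""
    candidate = hashlib.pbkdf2_hmac("sha256", secret, salt, 100_000)
    return hmac.compare_digest(candidate, digest)

salt, stored = enroll(b"my passcode")
assert verify(b"my passcode", salt, stored)
assert not verify(b"wrong guess", salt, stored)
```

Even an attacker who steals `salt` and `stored` still has to brute-force the secret; the system itself never "knew" it.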

I think both of you may be missing the point. If a 3rd party manages to get a hold of the fingerprint signature, they already have all the information they need about said fingerprint. There is no point in "reverse engineering" it.

The point of a database of fingerprints is not to reverse engineer the print, but rather to match the signature of an unknown fingerprint, probably gathered in the field, against a database of "known" signatures. If there is a positive, then you can easily figure out who that "unknown" signature belongs to, because the positive signature is associated with a specific phone/device and the owner of such is known.


Oh, I understand your point perfectly. What you are describing is a rainbow table ;) I'm not arguing that using biometrics is a good idea - I was just answering the specific points brought up. There are many reasons why this is a horrible idea:

1. Fingerprints can't be changed, so if someone figures out how to compromise the authentication system using "fake" fingerprints, you are pretty much screwed.

2. You leave them everywhere. It's kind of stupid to trust security to a piece of information that is in fact fairly trivial to acquire. It's like writing a post-it note with your password on it, except you do it virtually every time you touch anything...

3. They are unique enough to serve as compelling legal evidence for identification purposes. Knowing someone logged into a system with a password of "foo" is not going to be very useful in identifying a person, because lots of people could be using that password - but if you have the hash of a fingerprint and can generate that hash from the suspect's fingerprint... well, that is pretty much the opposite.

The first two points are certainly problems, but considering that this is replacing a system that, by default, uses a trivial 4-digit numeric passcode, it isn't all that much worse - and it does have some compelling advantages when it comes to simplicity for the user.

The third point (and your main concern) can be dealt with quite effectively - I just don't know if Apple did this responsibly or not. You can make the hash less effective for identification purposes by simply making sure that it has a fair number of collisions - i.e. the odds of two different fingerprints producing the same hash are, say, 1 in 10,000 or something like that - far too many collisions for the hash to be useful for identification all on its own.

That would make it pretty much useless for the purposes of "dragnetting": having the hash would be useless without other supporting evidence, because lots of people could share the same hash. It would also make it far less secure, of course. Considering the intended use case, I would argue that being less secure would actually be the right thing to do. I would really be interested to know what the collision rate actually is...
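The collision idea above can be sketched in a few lines of Python. To be clear, this is a toy illustration, not anything resembling Touch ID internals - real biometric matchers compare noisy feature templates rather than exact hashes, and the feature strings and bit width here are invented for the example:

```python
# Toy sketch: truncating a hash into a tiny output space forces
# many collisions, so the stored value is weak evidence for
# identifying any one person, while on-device matching still works.
import hashlib

def lossy_signature(fingerprint_features: bytes, bits: int = 14) -> int:
    """Map feature bytes into a space of only 2**bits (~16,384) values,
    so roughly 1-in-16,000 unrelated prints share a signature."""
    full = hashlib.sha256(fingerprint_features).digest()
    return int.from_bytes(full, "big") % (1 << bits)

# The enrolled finger reproduces its own signature deterministically...
enrolled = lossy_signature(b"alice-feature-vector")
assert lossy_signature(b"alice-feature-vector") == enrolled

# ...but a dragnet match is ambiguous: among 100,000 random "prints",
# a handful will collide with Alice's signature purely by chance.
collisions = sum(
    lossy_signature(str(i).encode()) == enrolled for i in range(100_000)
)
print(collisions)  # typically a handful, since 100,000 / 16,384 ≈ 6
```

The trade-off is exactly the one described: the same collisions that frustrate identification also mean more "wrong" fingers could unlock the device, which is why the chosen collision rate matters.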

But I would add that it might also be a moot point. I mean, if the NSA has your phone, and the phone is yours... well, they don't really need the fingerprint, do they? They have the phone; if they can get at the hash, they have already broken its security - and there is probably lots of other evidence on it identifying you...

All in all, I think the privacy concerns are a red herring. The problem is it's just a dumb way to do security. But seeing as it is for something most people don't bother securing effectively anyway, I don't really see what the big deal is.

Edited 2013-09-12 14:29 UTC

Reply Parent Score: 4