Linked by Hadrien Grasland on Wed 25th May 2011 15:09 UTC, submitted by sawboss
Forty minutes and physical access. That's all Russian company ElcomSoft is claiming to need in order to crack the 256-bit hardware encryption Apple uses to protect the data on iOS 4 devices. Full access to everything that's stored inside, including "historical information such as geolocation data, browsing history, call history, text messages and emails, usernames, [passwords, and even some] data deleted by the user", is obtained.
Order by: Score:
Comment by twitterfire
by twitterfire on Wed 25th May 2011 15:16 UTC
twitterfire
Member since:
2008-09-11

I think that Apple uses weak encryption schemes on purpose. Maybe they or the US govt will need the user data from time to time.

Edited 2011-05-25 15:16 UTC

Reply Score: 3

RE: Comment by twitterfire
by Neolander on Wed 25th May 2011 15:19 UTC in reply to "Comment by twitterfire"
Neolander Member since:
2010-03-08

Probably. If it's DES, 256-bit will still not be enough ;)

Reply Score: 1

Good...
by whartung on Wed 25th May 2011 15:57 UTC
whartung
Member since:
2005-07-06

Good enough is Good enough, until it isn't. Whatever Apple was doing for encryption before was Good Enough. And now it isn't. Now there's motivation on Apple's part for the next level of Good Enough.

Reply Score: 3

RE: Good...
by helf on Wed 25th May 2011 15:58 UTC in reply to "Good..."
helf Member since:
2005-07-06

yeah! 512bit DES! :p

Reply Score: 3

Uhm
by Thom_Holwerda on Wed 25th May 2011 16:03 UTC
Thom_Holwerda
Member since:
2005-06-29

Old news. This was already done by both Fraunhofer and Charlie Miller.

Reply Score: 1

RE: Uhm
by Neolander on Wed 25th May 2011 19:38 UTC in reply to "Uhm"
Neolander Member since:
2010-03-08

Really? I honestly didn't know about it...

Well, let's call this a draw ;) You did double-post Ballmer's speech, after all.

Edited 2011-05-25 19:40 UTC

Reply Score: 1

RE[2]: Uhm
by flanque on Thu 26th May 2011 08:14 UTC in reply to "RE: Uhm"
flanque Member since:
2005-12-15

Group hugs.

Reply Score: 2

Now, here comes the question
by twitterfire on Wed 25th May 2011 16:34 UTC
twitterfire
Member since:
2008-09-11

What do you do if you really care about your data and don't want someone else to snoop around, take a peek at your files, or watch your traffic? Besides giving up devices and computers entirely, staying offline, or writing your own encryption scheme?

Entities like the NSA have the capacity to decrypt some of the most complex algorithms, and OSes and devices have backdoors. Even in OpenBSD a backdoor was discovered some time ago.

Reply Score: 0

RE: Now, here comes the question
by Drumhellar on Wed 25th May 2011 17:07 UTC in reply to "Now, here comes the question"
Drumhellar Member since:
2005-07-12

Entities like the NSA have the capacity to decrypt some of the most complex algorithms, and OSes and devices have backdoors. Even in OpenBSD a backdoor was discovered some time ago.


Everybody always alleges backdoors in major operating systems and other pieces of software, but nobody provides proof. I tend to believe that the likes of Windows and Solaris do not have backdoors built in, for four reasons:

1. Were it discovered by the public, it would seriously damage the reputation of the company making the software.

2. For the larger targets, any backdoor would have been discovered by now; it's only a matter of time.

3. Again, I haven't seen evidence ever produced; only accusations and a general assumption that this is true.

4. A few years ago, the US Congress tried to mandate that certain types of software and encryption algorithms be built with backdoors. The software industry successfully lobbied against these requirements. It is not in their interest to build holes into their operating systems and encryption algorithms.

As for OpenBSD, no backdoor was discovered. A former developer merely accused another former developer of having been paid to insert backdoors into the IP stack years ago. Further code review showed this not to be the case. Again, accusations without evidence.

EDIT: proofreading helps.

Edited 2011-05-25 17:08 UTC

Reply Score: 7

Bill Shooter of Bul Member since:
2006-07-14

5) (For Windows) Why require a backdoor when the front door is wide open? Front-door access gives plausible deniability. No, that wasn't the NSA/CIA/FBI/DIA; it was a hacker who wrote that virus that stole all your files and smashed your centrifuges.

Reply Score: 2

Bill Shooter of Bul Member since:
2006-07-14

Windows security is better than it was, but it's not as if viruses are impossible now. Still a better route for an intelligence agency.

Reply Score: 2

Alfman Member since:
2011-01-28

Bill Shooter of Bul,

"5) (For Windows) Why require a backdoor when the front door is wide open? Front-door access gives plausible deniability. No, that wasn't the NSA/CIA/FBI/DIA; it was a hacker who wrote that virus that stole all your files and smashed your centrifuges."

I don't know why your post was downvoted... but an educated guess says secret agencies (or even corporate spies) are likely using both published and unpublished vulnerabilities. Whether those are intentional or not is rather irrelevant.

If they really want the information, they can always plant a keylogger, or mount a high-res camera where it can record people logging in.

Or, if those are infeasible, recording the sound of keystrokes by targeting windows with distant lasers is plausible (yet another vulnerability for owners of windows).

http://berkeley.edu/news/media/releases/2005/09/14_key.shtml

Reply Score: 2

JAlexoid Member since:
2009-05-19

The source code for Windows is available to countries that want to analyse it and use it in high-security environments. So, since it's being used in countries other than the US, I bet there is no backdoor.

Reply Score: 2

Nth_Man Member since:
2010-05-16

The source code for Windows is available to countries that want to analyse it and use it in high security environments.

Mmm... they won't know if what they see is... the source code of it... until they are able to compile it, modify it, compile it again, test the resulting compiled system, etc.
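This is essentially the reproducible-builds argument: holding the source only proves something if you can rebuild the system and compare the result, bit for bit, against what the vendor actually ships. A toy sketch of that check (the byte strings here are hypothetical stand-ins for real build artifacts, not anything from Windows):

```python
import hashlib

def digest(data: bytes) -> str:
    """SHA-256 hex digest of a build artifact's bytes."""
    return hashlib.sha256(data).hexdigest()

# Hypothetical stand-ins: the binary the vendor shipped, and a binary
# rebuilt locally from the source code the vendor handed over.
shipped_binary = b"MZ\x90\x00...pretend this is the shipped kernel..."
rebuilt_binary = b"MZ\x90\x00...pretend this is the shipped kernel..."

# Equal digests are only meaningful if the build is deterministic
# (same compiler, same flags, timestamps stripped, and so on).
print(digest(shipped_binary) == digest(rebuilt_binary))  # True
```

Without a deterministic toolchain the two digests will differ even for honest source, which is exactly why "here is the source, trust us" is weaker than "here is the source, rebuild it yourself".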

Reply Score: 3

Nth_Man Member since:
2010-05-16

I mean, I could say that I have the secret formula for Coca-Cola and give the recipe to a country. But until they follow the recipe and try the product, they won't know if the "recipe" was really true or not.

Reply Score: 2

JAlexoid Member since:
2009-05-19

Mmm... they won't know if what they see is... the source code of it... until they are able to compile it, modify it, compile it again, test the resulting compiled system, etc.

And what gives you the idea that the Russians, Chinese, Israelis and French didn't do it?

Reply Score: 2

Alfman Member since:
2011-01-28

"And what gives you the idea that the Russians, Chinese, Israelis and French didn't do it?"

I don't know if they can or cannot compile it themselves, but I somewhat doubt Microsoft would grant them that privilege.

The thing is, the notion that anyone holding the source therefore knows whether Windows is secure seems a little implausible.

Vulnerabilities can creep in through innocent-looking code, and that always offers plausible deniability.

When a security update comes out, countries in possession of the code could indeed locate the vulnerabilities in the source, but where is the evidence of whether they were accidental or deliberate? It's not as if MS would label a backdoor "NSAKey" or something.

Reply Score: 2

JAlexoid Member since:
2009-05-19

They kind of never had the option of not allowing it when bidding for government contracts.

Reply Score: 2

Nth_Man Member since:
2010-05-16

And what gives you the idea that the Russians, Chinese, Israelis and French didn't do it?

I did not say "they didn't do it". You are talking as if I had. I said that if someone tells you "hey, this is the source code of Windows for your country", you don't know whether it's true unless you are able to compile it, modify it, compile it again, test the resulting compiled system, etc.

You said "The source code for Windows is available to countries". You don't know that until you know they are able to compile it, etc.

Edited 2011-05-28 10:40 UTC

Reply Score: 2

RE: Now, here comes the question
by Soulbender on Wed 25th May 2011 17:39 UTC in reply to "Now, here comes the question"
Soulbender Member since:
2005-08-18

Even in OpenBSD was discovered a backdoor some time ago.


No, there wasn't.

Reply Score: 3

rr7.num7 Member since:
2010-04-30

No, there wasn't. There were accusations without any proof. Can you provide solid evidence for the existence of those backdoors (like someone who actually found them in the code)? Or are you just one of the thousands of users who believe every piece of gossip they read on the internet?

Edited 2011-05-25 18:01 UTC

Reply Score: 3

Soulbender Member since:
2005-08-18

No, there wasn't. Seriously, a backdoor in open source code, hidden in plain sight for years? Not just a bug, but a backdoor. Go ahead, the code is there; dig out the proof that no one who has audited the code has found.

If you believe this stuff you might as well consider TMZ a reliable news source.

Reply Score: 4

Defeats the point...
by bert64 on Wed 25th May 2011 19:02 UTC
bert64
Member since:
2007-04-23

The hardware encryption was "cracked" because the key is stored on the device, making it not so much encryption as obfuscation.

If the device is able to boot and operate without the user having to input a key into it, then the key for any encryption is clearly stored on the device just waiting for someone to work out how it can be obtained.
If you want encryption that actually works, you need to ensure the key is never stored with the device - and that means forcing the user to enter it every time the device boots.
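The standard way to get a key that is never stored with the device is to derive it from a passphrase the user types at every boot; only a random salt needs to rest on disk. A minimal sketch using PBKDF2 from Python's standard library (the passphrase and iteration count are arbitrary examples, not Apple's actual scheme):

```python
import hashlib
import os

# Stored on the device: a random salt, which is harmless on its own.
salt = os.urandom(16)

# Entered by the user at every boot; never written to storage.
passphrase = b"correct horse battery staple"

# Derive a 256-bit key with PBKDF2-HMAC-SHA256. The iteration count
# deliberately slows down offline guessing; 200_000 is an example value.
key = hashlib.pbkdf2_hmac("sha256", passphrase, salt, 200_000, dklen=32)

print(len(key) * 8)  # 256
```

An attacker with the device then holds only the salt and the ciphertext; recovering the key requires guessing the passphrase, which is exactly the property the on-device-key design gives up.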

Reply Score: 3

Nothing will be secure.
by Finchwizard on Wed 25th May 2011 22:41 UTC
Finchwizard
Member since:
2006-02-01

Nothing is secure, because you have the human factor in it.

People are idiots: they think they're being secure by using a complicated password, and then they use that password for everything. Not smart.

Even things like LastPass keep your stuff secure, and then they get hacked and attackers gain some info on accounts.

It happens every day; it's how they then deal with it that's the key issue.

Just think: at least we don't have phones made by Sony.

Reply Score: 2