Linked by davidiwharper on Tue 14th Jan 2014 09:03 UTC
Mozilla & Gecko clones

Mozilla plans to establish an automated process which would verify that binaries contain only the code found in the official source repositories, and not spyware secretly added during the build process at the behest of government intelligence agencies. In a blog post entitled Trust but Verify, CTO Brendan Eich and R&D VP Andreas Gal note that governments "may force service operators [such as Mozilla] to enable surveillance (something that seems to have happened in the Lavabit case)" and pledge to develop systems which will make Firefox resistant to this form of tampering.

Thread beginning with comment 580652
Commendable, but possibly flawed
by flypig on Tue 14th Jan 2014 10:36 UTC
flypig Member since:
2005-07-13

Even if this whole idea sounds paranoid (the threshold for paranoia seems to have moved since June last year!), it looks to me like a very commendable effort. Unless I've misunderstood, it won't offer any more assurance than compiling the code yourself, but that's more than most of us get now.

However, they'll need to take real care to avoid it being false reassurance. It's great that the blog post cites Ken Thompson's seminal Turing Award Lecture, but what's proposed won't tackle this. The whole point of the lecture was that checking the full source (of both the program *and* the compiler being used) isn't enough to guarantee there are no back doors.

It's a really interesting idea anyway, especially if they actually implement it. If they can get it right, so that end-users can have justified trust, it'll be impressive.

Edited 2014-01-14 10:39 UTC

Reply Score: 6

Alfman Member since:
2011-01-28

flypig,

"Unless I've misunderstood, it won't offer any more assurance than compiling the code yourself, but that's more than most of us get now."

Well, for most it's probably even more secure than compiling the code ourselves because most of us don't even conduct a cursory inspection of the code first. There's a very good chance that a source code based back door can be installed undetected even if it's not concealed. If no one looks at it, it might as well be commented in all caps "HERE BE A BACKDOOR".

The benefit of using the code verification network is that the entire network would need to be compromised simultaneously to avoid triggering a suspicious hash mismatch. If ANYONE on the network is doing a good job monitoring the code for backdoors, then a hash verified binary copy is probably more secure than a copy compiled from source by an end user.
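The cross-checking idea can be sketched in a few lines of Python. This is only an illustration of the principle, not Mozilla's actual design; the builder names and hashes below are made up:

```python
from collections import Counter

def consensus_check(reports):
    """reports: dict mapping verifier name -> the build hash it computed.
    Returns (majority_hash, dissenters). A non-empty dissenter list is
    exactly the 'suspicious hash mismatch' that should raise the alarm."""
    counts = Counter(reports.values())
    majority, _ = counts.most_common(1)[0]
    dissenters = [name for name, h in reports.items() if h != majority]
    return majority, dissenters

# Hypothetical verifier network: three independent builders agree,
# one received (or produced) a tampered binary.
reports = {
    "builder-a": "abc123",
    "builder-b": "abc123",
    "builder-c": "abc123",
    "builder-d": "deadbeef",  # mismatch -> grounds for suspicion
}
majority, dissenters = consensus_check(reports)
print(majority)    # abc123
print(dissenters)  # ['builder-d']
```

The point is that an attacker has to corrupt every builder at once to keep the dissenter list empty, which is much harder than corrupting a single download server.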


The only problem with this is that the hash signature *should* be verified out of band. If I download X and hash(X) from the same compromised channel, then I really cannot know for sure that they aren't both compromised. In theory, certificate authorities are the answer to man-in-the-middle attacks such as this; however, fraudulent certificates happen, and it's almost a sure thing that some of the CAs are a front for covert government operations anyway.
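The out-of-band check itself is simple; what matters is where the expected digest comes from. A minimal Python sketch (the file and digest here are stand-ins, not real Mozilla artifacts):

```python
import hashlib
import hmac
import os
import tempfile

def sha256_of(path):
    """Compute the SHA-256 digest of a file, reading in chunks."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(8192), b""):
            h.update(chunk)
    return h.hexdigest()

def verify(path, expected_hex):
    """Compare the local digest against one obtained out of band.
    hmac.compare_digest is constant-time; for a public hash a plain
    == comparison would also do."""
    return hmac.compare_digest(sha256_of(path), expected_hex)

# Demo with a stand-in file. In reality: the downloaded binary, plus a
# digest fetched over a separate, independently trusted channel.
with tempfile.NamedTemporaryFile(delete=False) as f:
    f.write(b"pretend this is firefox.tar.bz2")
    path = f.name

out_of_band = hashlib.sha256(b"pretend this is firefox.tar.bz2").hexdigest()
print(verify(path, out_of_band))  # True: digests agree
print(verify(path, "0" * 64))     # False: tampered or wrong file
os.unlink(path)
```

The weak link is the second channel: if the digest travels alongside the download, a man-in-the-middle can rewrite both, which is exactly the scenario described above.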

Edited 2014-01-14 17:23 UTC

Reply Parent Score: 4

flypig Member since:
2005-07-13

Well, for most it's probably even more secure than compiling the code ourselves because most of us don't even conduct a cursory inspection of the code first. There's a very good chance that a source code based back door can be installed undetected even if it's not concealed. If no one looks at it, it might as well be commented in all caps "HERE BE A BACKDOOR".

I see your point: the equivalent would be installing from source, where I have some certainty that the source I'm using is the same as the source other people are using (and presumably auditing).

If ANYONE on the network is doing a good job monitoring the code for backdoors, then a hash verified binary copy is probably more secure than a copy compiled from source by an end user.

Yes, but it does still rely on this happening. Whether I compile from integrity-checked source, or use a binary that has been verifiably generated from a given source tree, I still have to rely on the assumption that someone else audited it, the libraries it relies on, and all previous versions of the compiler (since there's no way I'm doing that myself!).

It's a good initiative and reducing the requirement to trust a single organisation makes a lot of sense. If only I could apply the same technique to all of the other technologies I use regularly.

Reply Parent Score: 4

Lennie Member since:
2007-09-22

The whole point of the lecture was that checking the full source (of both the program *and* the compiler being used) isn't enough to guarantee there are no back doors.


But I think it's probably the best solution we have.

I believe it is even possible these days to compare the output of multiple open-source compilers.

And obviously we can check the code history in git, which checksums everything.
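Git's checksumming is content-addressed: every object name is a hash of the object's contents, so history can't be altered without changing every later commit hash. The blob hash is easy to reproduce by hand; this sketch only covers blobs, not trees or commits:

```python
import hashlib

def git_blob_sha1(content: bytes) -> str:
    """Reproduce git's object name for a blob: SHA-1 over a
    'blob <size>\\0' header followed by the raw file contents."""
    header = b"blob %d\x00" % len(content)
    return hashlib.sha1(header + content).hexdigest()

# The well-known object name of an empty file in any git repository:
print(git_blob_sha1(b""))
# e69de29bb2d1d6434b8b29ae775ad8c2e48c5391
```

Commits then hash the tree and the parent commit, which is what chains the whole history together.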

Reply Parent Score: 3