Linked by Thomas Leonard on Tue 16th Jan 2007 00:32 UTC
General Development In the Free and Open Source communities we are proud of our 'bazaar' model, where anyone can join in by setting up a project and publishing their programs. Users are free to pick and choose whatever software they want... provided they're happy to compile from source, resolve dependencies manually and give up automatic security and feature updates. In this essay, I introduce 'decentralised' installation systems, such as Autopackage and Zero Install, which aim to provide these missing features.
Thread beginning with comment 202851
RE[3]: Great article, but...
by Tom5 on Wed 17th Jan 2007 21:33 UTC in reply to "RE[2]: Great article, but..."
Tom5 Member since:
2005-09-17

"Why can't the directory name be the program name (as opposed to the URL), e.g. gimp-2.3, so it can be installed off a CD? I don't see how naming the folder after a hash makes it any more secure than, say, storing the hash in a separate protected file within the program's folder."

OK, so Alice puts a CD in the drive with 'gimp-2.3.tgz' on it. How can the system know that it's genuine? Anyone could make such a CD. The system can't tell, so it can't share it with Bob.

But if the CD contains an archive called 'sha256=XYZ.tgz', then it can be put in the shared directory. The system can check that it's genuine simply by hashing the archive's contents and comparing the result with the name.
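Roughly, that check looks like this (a sketch only; the real system digests a manifest of the unpacked directory rather than the raw archive bytes, and the file layout here is invented for illustration):

import hashlib
import os
import sys

def check_archive(path):
    # Verify an archive named 'sha256=<digest>.tgz' against its own name.
    # Sketch: hashes the raw archive bytes, which is enough to show the idea.
    name = os.path.basename(path)
    if not (name.startswith('sha256=') and name.endswith('.tgz')):
        raise ValueError('not a digest-named archive: %s' % name)
    claimed = name[len('sha256='):-len('.tgz')]

    sha = hashlib.sha256()
    with open(path, 'rb') as stream:
        for block in iter(lambda: stream.read(65536), b''):
            sha.update(block)

    if sha.hexdigest() != claimed:
        raise ValueError('digest mismatch: corrupt or renamed archive')
    return claimed

if __name__ == '__main__':
    print('Verified:', check_archive(sys.argv[1]))

If the digest matches, the name itself is the proof of identity, so there is no need for a trusted metadata file inside the archive.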

Score: 1

RE[4]: Great article, but...
by Moochman on Thu 18th Jan 2007 08:03 in reply to "RE[3]: Great article, but..."
Moochman Member since:
2005-07-06

Hmm... while your solution does sound perhaps a bit more elegant, I still don't see why the system couldn't extract an identifier text file from the archive and then compare it to the archive's contents. Also, just to be clear: comparing the hash to the contents wouldn't do a thing to ensure security all by itself; it would also need to be compared to the hash at the project's website, right? Otherwise anyone could create malware and provide a hash to match it, but make it look like normal software. Furthermore, don't the archive contents have to be re-analyzed every time you want to verify their authenticity? So given that the website needs to be accessed and the hash needs to be recalculated in any case, couldn't we just skip the step with the local copy of the hash?

To rephrase: storing the hash in the local filesystem does very little by itself to ensure the integrity of the program. Only by checking the folder's contents against a hash published online can the program's security be ensured, which effectively renders the local copy of the hash useless.

Or am I missing something?

Score: 2

RE[5]: Great article, but...
by Tom5 on Thu 18th Jan 2007 17:52 in reply to "RE[4]: Great article, but..."
Tom5 Member since:
2005-09-17

"Otherwise anyone could create malware and provide a hash to match it, but make it look like normal software."

Yes: just because something is in the shared directory doesn't mean it's safe to run. One reason the unfriendly names are fine here is that you really don't want users browsing around and running things just because they look interesting!

"Furthermore, don't the archive contents have to be re-analyzed every time you want to verify their authenticity?"

No, that's why you have the privileged helper. It checks the digest once, and only then adds the unpacked files to the shared directory. So, if you see a directory called:

/shared-directory/sha256=XXXXXXX

then you don't have to calculate the XXXXXXX bit yourself. If it didn't match, it wouldn't have been allowed in.
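In outline, the helper could work something like this (my own sketch, with an invented store path and a simplified stand-in for the real manifest algorithm):

import hashlib
import os
import shutil

SHARED_DIR = '/shared-directory'   # invented location for this sketch

def digest_tree(root):
    # Hash a directory tree: relative paths plus file contents, in sorted order.
    # A simplified stand-in for the real manifest format.
    sha = hashlib.sha256()
    for dirpath, dirnames, filenames in sorted(os.walk(root)):
        for name in sorted(filenames):
            path = os.path.join(dirpath, name)
            sha.update(os.path.relpath(path, root).encode('utf-8'))
            with open(path, 'rb') as stream:
                sha.update(stream.read())
    return sha.hexdigest()

def add_to_shared(unpacked_dir, claimed_digest):
    # Privileged helper: verify the claimed digest once, then move the tree
    # into the shared directory under that name. After that, the name alone
    # vouches for the contents, so nobody has to re-hash it on every run.
    if digest_tree(unpacked_dir) != claimed_digest:
        raise ValueError('digest mismatch; refusing to add')
    target = os.path.join(SHARED_DIR, 'sha256=' + claimed_digest)
    if os.path.exists(target):
        return target   # already shared: identical contents, nothing to do
    shutil.move(unpacked_dir, target)
    return target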

BTW, you don't need to use the web to check the hash. It may be that Alice and Bob both trust the CD (in which case they get to share the copy on it). Denise doesn't trust the CD, so she checks with the web-site instead (and will share the copy only if it matches).
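The trust decision itself stays with each user. Something like this (the feed URL and its one-line format are made up for illustration; a real system would use a signed feed):

import urllib.request

def digest_from_website(feed_url):
    # Fetch the digest the project publishes. This sketch assumes the URL
    # simply returns a line like 'sha256=XYZ'.
    with urllib.request.urlopen(feed_url) as response:
        return response.read().decode('ascii').strip()

def can_use_shared_copy(dir_name, trusts_cd, feed_url=None):
    # Alice and Bob trust the CD, so the digest name alone is enough for them.
    # Denise doesn't, so she compares the name against the project's website.
    if trusts_cd:
        return True
    return dir_name == digest_from_website(feed_url)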

Score: 1