Linked by Thomas Leonard on Tue 16th Jan 2007 00:32 UTC
General Development
In the Free and Open Source communities we are proud of our 'bazaar' model, where anyone can join in by setting up a project and publishing their programs. Users are free to pick and choose whatever software they want... provided they're happy to compile from source, resolve dependencies manually and give up automatic security and feature updates. In this essay, I introduce 'decentralised' installation systems, such as Autopackage and Zero Install, which aim to provide these missing features.
yay!
by zhulien on Tue 16th Jan 2007 01:36 UTC
zhulien
Member since:
2006-12-06

I say this all the time on Digg and only get dugg down. Sadly, the greater Linux community doesn't like the idea of not having to install software. Installing software is a pathetic ritual which Windoze and Linux both seem to embrace. Luckily, on MacOS and AmigaOS it usually isn't necessary to run an installer of any kind - thank god these two OSes exist.

Reply Score: 1

RE: yay!
by cerbie on Tue 16th Jan 2007 04:17 UTC in reply to "yay!"
cerbie Member since:
2006-01-02

Hiding what it's doing doesn't mean it's not installed. Don't the files it saves its settings to count as installation, nor, in some cases, the virtual mounting thingie (yeah, I forgot the term)? It's nice packaging (I think still better than klik), but the software is no more "not installed" than on any other OS--the interface is just really simple, right down to the file level.

For ports and things not available in such nice packages, it's generally more difficult. Of course, the convention of spreading a piece of software across many directories off the root, rather than keeping everything within one directory, is partly to blame (though it can save space).

IMO, since a Linux distro has a finite amount of software available in the repository, you should have all of it available through the GUIs, with some indicator that it is not installed, then installing it upon first run. This would make things easy for the user, and still not annoyingly hide things if you want to do it some other way, or look inside.

Then, for third-party software, just have scripts that support RPM and/or similar formats from the file managers (as in, you click, and it installs, and if the info is in there, even runs--I'm not sure of all the metadata in them).

Reply Score: 2

RE: yay!
by butters on Tue 16th Jan 2007 04:17 UTC in reply to "yay!"
butters Member since:
2005-07-08

Huh? The only alternative to installing software is SaaS (software as a service). Even dragging and dropping a compressed archive falls under the category of software installation.

No, I don't think I'm being pedantic here. The first time I had to install software on a Mac, it took me a little while to figure out how to get it to install the software permanently rather than run it out of the disk image. Even then it didn't add the program to the Dock automatically. It made me feel stupid that I had mastered Gentoo yet had problems with the "intuitive" MacOS X. I'm not saying it's harder to install software on a Mac than on other systems, but it's not 100% intuitive for everybody. Very few things are.

Further, how do I keep my system up to date? Is there a single command or button? What if the upstream distributor moves to a different web address? Why should I trust a third party to deliver software that integrates nicely with the rest of my system?

I think that searching the web and downloading some compressed archive is a pathetic idea from "Windoze" and MacOS X. Package management takes the guesswork out of finding, installing, and updating software, while providing some protection against malicious packages, I might add.
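To make that concrete, keeping an APT-based system current really is a single command pair (a sketch; assumes sudo is configured):

$ sudo apt-get update    # refresh the package lists from the repositories
$ sudo apt-get upgrade   # download and install updates for every installed package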

To each his own... but I personally feel that the most challenging aspect of package management for newbies is that it's different.

I see why you get "dugg down" a lot. Notice how you make a strong assertion and then back it up with nothing? You could be right, but you're not going to change anyone's mind like this.

Edited 2007-01-16 04:19

Reply Score: 5

Good article
by flanque on Tue 16th Jan 2007 01:45 UTC
flanque
Member since:
2005-12-15

This is a good article and a great attempt to lay out the problems, use cases and possible resolutions of what I feel is a common weakness in the FOSS environment.

I guess there is the argument that this system just adds yet another installation choice, and with it another layer of complication, especially for less technical users. That is true, but only if a common installation method isn't adopted by the majority of Linux vendors.

Frankly, I think it's very much time to start settling on a single common installation method / package manager. It's certainly a hurdle that I feel is, to a degree, impacting adoption of Linux.

Installation by compiling from source should always be there; it exists on other platforms and offers a degree of freedom that developers and more technically savvy users want. For the rest, however, a common solution needs to be used across all distributions.

Fingers crossed.

Edited 2007-01-16 01:46

Reply Score: 2

RE: Good article
by butters on Tue 16th Jan 2007 03:51 UTC in reply to "Good article"
butters Member since:
2005-07-08

Frankly, I think it's very much time to start settling on a single common installation method / package manager. It's certainly a hurdle that I feel is, to a degree, impacting adoption of Linux.

My stock argument here is that, like everything in FOSS, standardization will follow consensus. While competing package formats are certainly not a desirable feature, neither is standardizing on a substandard packaging system. Last I checked, there was still no consensus on the best package manager.

However, one thing is for sure: Linux users are being served far better by RPM and APT than by Klik, Autopackage, Zero Install, et al. Show me a distro that successfully uses one of these self-contained package formats by default, and I'll certainly reconsider.

Reply Score: 5

RE[2]: Good article
by de_wizze on Tue 16th Jan 2007 04:39 UTC in reply to "RE: Good article"
de_wizze Member since:
2005-10-31

While I agree that the centralized approach is better, I think that encouraging original developers to make applications available in one of those packaging systems can help streamline the packaging and delivery of new versions to users through the repositories, which is one of the points I think he was trying to make. I don't know how familiar the author is with Conary/rPath[1], but I seem to recall it is built around a similar concept.

I don't think distros need to abandon their current packaging systems, but for the sake of backwards compatibility, wider ISV adoption and maybe even reclaiming packagers' effort, it would be a good thing for them to look into adapting their processes along these lines, perhaps by modifying build tools to import those universal formats.

The idea of making cross-distro collaboration easier is not new. Various[2] other[3] efforts[4] attempt to address interoperability amongst distros without giving up individuality and focus. I think that is critical at this point to help continue fostering growth and development, especially in areas such as polish and refinement.

[1] http://wiki.rpath.com/wiki/Conary
[2] https://launchpad.net/distros
[3] http://freedesktop.org/wiki/Home
[4] http://www.pathname.com/fhs/

Edited 2007-01-16 04:53

Reply Score: 4

RE[3]: Good article
by butters on Tue 16th Jan 2007 05:00 UTC in reply to "RE[2]: Good article"
butters Member since:
2005-07-08

Definitely a nod for Conary, thanks for bringing that up. Simply put, Conary is like a hybrid revision control system and packaging system. Very promising, because it does what the other distributed packaging systems don't: it actually attempts to make the upstream developer's and packager's jobs easier instead of focusing solely on the end user.
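For anyone who hasn't tried it, Conary's day-to-day interface still looks like an ordinary package manager, but every trove carries its full version history (a rough sketch; the trove name is illustrative):

$ conary update gimp   # install or update the 'gimp' trove from the configured repository
$ conary query gimp    # show the exact versioned trove that is installed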

Less so for Launchpad... While it's great that Canonical wants to help OSS projects coordinate and work together, this does not seem like an appropriate opportunity for Canonical to go proprietary. This is not a matter of "proprietary == bad." If Canonical came out with a proprietary remote system administration console, for example, that would be great for the Linux community. But a collaboration tool targeted at free software projects? They had to know this was not going to be acceptable for many projects.

If you want to try and lock in your corporate customers, go right ahead. If you want to sell premium add-ons to end users, I have no problem with that. But please don't insult the development community by peddling proprietary development tools. That is sooo not in the spirit of free software.

Reply Score: 3

RE[2]: Good article
by Lambda on Tue 16th Jan 2007 16:43 UTC in reply to "RE: Good article"
Lambda Member since:
2006-07-28

However, one thing is for sure: Linux users are being served far better by RPM and APT than by Klik, Autopackage, Zero Install, et al. Show me a distro that successfully uses one of these self-contained package formats by default, and I'll certainly reconsider.

Nonsense. Linux users aren't being served better simply because something happens to exist in its current form.

Reply Score: 2

Finally...
by archiesteel on Tue 16th Jan 2007 02:16 UTC
archiesteel
Member since:
2005-07-02

...a thread where discussion of software installation will be on-topic. :-)

Kudos to the author for the excellent presentation. These are very interesting projects, very much in the spirit of FOSS. Still, I don't think that such an approach should replace traditional package management altogether. I think the author indicates this in the last paragraph:

Finally, we saw that it is often possible to convert between these different formats, with varying degrees of success. Even if most users don't start using decentralised packaging right now, but continue with their existing centralised distributions, these techniques are useful to help the process of getting packages into the distributions in the first place.

This is a very good approach, IMO. Package managers fill a need right now, and remain very good for system software. This allows for a smooth transition where it makes sense.

That said, I still think that this is not currently a barrier to Linux adoption. None of the people who have asked me about Linux have ever mentioned ease of installation of applications as a factor, or asked if software versions were up-to-date. So while I think this is a laudable effort to unify and simplify software installation, as well as improve the communication between users, developers and distros, I wouldn't place false hopes in such a system necessarily attracting more users.

That said, I hope solutions such as these become more prevalent, especially for "standalone" apps.

Edited 2007-01-16 02:18

Reply Score: 4

RE: Finally...
by Lambda on Tue 16th Jan 2007 16:45 UTC in reply to "Finally..."
Lambda Member since:
2006-07-28

That said, I hope solutions such as these become more prevalent, especially for "standalone" apps.

One of the problems is that the independent developer has to be accepted into some repository universe before users can easily get his software, and that's problematic.

I'll add that "system" software would probably be best served by a centralized mechanism, as it is on Windows and OS X, even though the need on Unix tends to be smaller because its components are more decentralized (or modular) than Windows'.

Edited 2007-01-16 16:47

Reply Score: 4

haven't I heard this before?
by mtzmtulivu on Tue 16th Jan 2007 02:21 UTC
mtzmtulivu
Member since:
2006-11-14

...for this to "work", all distros must use the same compiler (or only versions that are compatible with one another), the same libraries (or only those that are compatible with one another), and the same kernel (or only those that don't break anything between releases); all maintainers must agree on the same patches (or all distros must run the same vanilla kernel); all distros must sit down and agree on the "right" direction before anyone starts moving, or let the big players dictate what they want to the others... and the list goes on...

The dynamics of FOSS development more or less demand this kind of "craziness": having everybody line up, so that only the front person moves and everyone behind him/her waits, would slow everything down.

"Do we need this many people working on essentially the same task?" It looks that way: some tasks will have to be repeated while others work on different paths, advancing everything forward.

Reply Score: 4

Tried this once
by John Nilsson on Tue 16th Jan 2007 02:58 UTC
John Nilsson
Member since:
2005-07-06

I remember trying to kick-start a discussion to solve this some time ago[1]; it didn't end that well[2] ;)

[1]http://groups.google.com/group/linux.gentoo.dev/msg/28441f08ac1cc20...
[2]http://groups.google.com/group/linux.gentoo.dev/msg/02ea221ca44f705...

Reply Score: 2

B.A.D idea
by theGrump on Tue 16th Jan 2007 03:30 UTC
theGrump
Member since:
2005-11-11

Sounds like a lovely way to destroy your system. And are there really that many package management systems in use? From what I can tell, 80% of Linux users are likely using an RPM- or deb-derived package platform. Those who aren't really only have one valid counterargument: the desire to build all code locally a la Gentoo.

These "generic" packaging projects were and are non-starters. Even if they could create a system that makes sense, utilizes your hardware properly (not one-size-fits-all) and does not destroy your install, no one is using them anyway.

Browser: ELinks/0.11.1-1.2-debian (textmode; Linux 2.6.18-3-686 i686; 91x34-3)

Reply Score: 4

RE: B.A.D idea
by raynevandunem on Tue 16th Jan 2007 03:38 UTC in reply to "B.A.D idea"
raynevandunem Member since:
2006-11-24

YES, there are.

http://en.wikipedia.org/wiki/Package_management_system#Examples

Furthermore, as in the case of Ubuntu and Debian, a distribution with the exact same package management system as another distro can still be incompatible with it.

Reply Score: 4

RE[2]: B.A.D idea
by Terracotta on Tue 16th Jan 2007 11:35 UTC in reply to "RE: B.A.D idea"
Terracotta Member since:
2005-08-15

Ubuntu and Debian are the perfect examples of why decentralised packaging is a BAD idea. If two systems as closely related as they are, using the same packaging and installation system, still manage to create incompatible binary packages, how is a decentralised packaging system supposed to solve binary incompatibilities between, say, Red Hat and Debian?

The problem this article about decentralised installation tries to solve is a closed-source problem. If an open-source programmer can't get his open source program into the main tree of distributions, he can provide .deb and .rpm packages. The user can install them quite easily (right click: install package, in Kubuntu and Ubuntu at least), and e.g. apt-get will resolve the dependencies against the centralised system. It's more work for the maintainer in the beginning (and a lot of work for closed-source programmers to provide .debs and .rpms for all distros; Opera seems to like this way of working, though), but afterwards, when distributions start packaging his program, he'll get more peer review from actual programmers who compile and support his packages.
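For the command-line equivalent (the file name here is made up), it really is just:

$ sudo dpkg -i someapp_1.0-1_i386.deb   # install the downloaded standalone package
$ sudo apt-get -f install               # let apt fetch any missing dependencies from the repositories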

This week seems to be about porting problems from the Windows world to the FOSS world: first backward compatibility and now binary compatibility. They are only problems for people who want the latest and greatest.

Reply Score: 3

RE[3]: B.A.D idea
by tom1 on Tue 16th Jan 2007 12:49 UTC in reply to "RE[2]: B.A.D idea"
tom1 Member since:
2005-07-12

Ubuntu and Debian are the perfect examples of why decentralised packaging is a BAD idea. If two systems as closely related as they are, using the same packaging and installation system, still manage to create incompatible binary packages, how is a decentralised packaging system supposed to solve binary incompatibilities between, say, Red Hat and Debian?

This is exactly why you need a decentralised system: your example is a centralised system failing to cope with packages from two different distributions! It's quite possible to create a system that would handle this situation fine.

Reply Score: 1

RE[4]: B.A.D idea
by Terracotta on Tue 16th Jan 2007 15:38 UTC in reply to "RE[3]: B.A.D idea"
Terracotta Member since:
2005-08-15

The reason it fails is not that it's a centralised system; it's that you're trying to install a package created for one distribution on another distribution. If that distribution differs too much from the one the package was created for, it might not work. It's perfectly possible to distribute .debs in a decentralised way (Opera does it like this). It's a bit more work for Opera, but then it's quite easy to install Opera on an Ubuntu/Xandros/MEPIS/Debian/SUSE... system.
I'd rather see more .deb or .rpm packages on websites than .autopackage packages, for those who insist on installing a program by searching the net and downloading it from a website.

Reply Score: 2

RE[5]: B.A.D idea
by Tom5 on Wed 17th Jan 2007 22:12 UTC in reply to "RE[4]: B.A.D idea"
Tom5 Member since:
2005-09-17

The reason it fails is not that it's a centralised system; it's that you're trying to install a package created for one distribution on another distribution.

Why do you think it fails, then, if it's not the installation system's fault?

I can install user-mode-linux on a Debian system and then install the Ubuntu package inside that. So it's not a hardware limitation.

I can install Ubuntu in a chroot in my Debian system, and run it from there, so it's not a kernel issue. I can set DISPLAY to the host system, so it's not an X problem.

So, that leaves just libraries and services (daemons). Libraries can be handled as described in the article.

Services (e.g. mysql) are usually designed to run across different computers, and so have a stable protocol.

In fact, the only thing I can think of is D-BUS (a system service which used to change its API regularly). And now that it's gone 1.0, that should be fine too.

Reply Score: 1

RE[3]: B.A.D idea
by draethus on Tue 16th Jan 2007 14:49 UTC in reply to "RE[2]: B.A.D idea"
draethus Member since:
2006-08-02

This week seems to be about porting problems from the Windows world to the FOSS world: first backward compatibility and now binary compatibility. They are only problems for people who want the latest and greatest.

Maybe that's because on Windows backward compatibility and binary compatibility are solved problems; on Linux they're not.

Write and compile a program on Windows Vista using only functions available in Windows 95, then take it to a Windows 95 box, and it will work.

Now compile on glibc 2.4 using only ANSI C functions, take your program to a box with glibc 2.3, and it will fail to start, even though all the functions you use are available. And that's without even getting into horrors like thread-local locales, where it won't even work on another libc of the same version compiled without that option, or Fedora Core 6's GCC's DT_GNU_HASH, where unless you compile with special flags, AFAIK your program doesn't even run on any other distro.
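You can watch this happen with a binary you built yourself (names and output are illustrative; on some distros the default stack protector alone drags in a GLIBC_2.4 symbol):

$ objdump -T ./myprog | grep GLIBC_   # list the glibc symbol versions the binary demands
$ ./myprog                            # then, on the glibc 2.3 box:
./myprog: /lib/libc.so.6: version `GLIBC_2.4' not found (required by ./myprog)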

It's not that people want the latest and greatest - currently the only reliable way to get unpackaged software installed is to compile it, and ISVs want to write software once and have it work on every distro for all time.

If you know something I don't, please elaborate.

Reply Score: 3

RE[4]: B.A.D idea
by Terracotta on Tue 16th Jan 2007 15:23 UTC in reply to "RE[3]: B.A.D idea"
Terracotta Member since:
2005-08-15

Well, I know for one that backward compatibility is not solved in the Windows world. For example, Navision (an MS product) cannot run on Windows Vista.

Backward compatibility also has its drawbacks in speed, as in being forced to keep supporting every driver ever created, e.g. USB drivers in Windows vs. the faster USB drivers in Linux.

Compatibility at the binary level is a disease with more drawbacks than benefits, and it is something the open source world circumvents quite well at the source level. It should not be introduced into the open source world.

Reply Score: 2

RE[3]: B.A.D idea
by Lambda on Tue 16th Jan 2007 16:51 UTC in reply to "RE[2]: B.A.D idea"
Lambda Member since:
2006-07-28

No, just the opposite. Ubuntu and Debian incompatibilities are perfect examples of a subpar packaging system. Scroll up (depending on your threaded view) to find links on the Conary packaging system. Fine-grained versioning would seem to solve these incompatibility problems.

Reply Score: 2

RE: B.A.D idea
by butters on Tue 16th Jan 2007 04:41 UTC in reply to "B.A.D idea"
butters Member since:
2005-07-08

Like the other poster said, there is a difference between package format and package compatibility. You might not realize how similar .rpm and .deb files are to one another. In fact, most of the spec is identical.

The differences lie in package management and package compatibility. For example, we see that rpm has lots of different management systems: yum, urpmi, and yast come to mind. On the Debian side, the equivalent of rpm is dpkg, but people simply refer to the Debian-derived system as APT because APT is more or less the de facto management layer for dpkg.
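You can see how close the two formats are by asking each for the same metadata (file names invented):

$ rpm -qpi foo-1.0-1.i386.rpm              # header info (name, version, summary) from an RPM
$ dpkg-deb -I foo_1.0-1_i386.deb           # the equivalent control information from a .deb
$ rpm -qpR foo-1.0-1.i386.rpm              # the RPM's dependency list
$ dpkg-deb -f foo_1.0-1_i386.deb Depends   # the .deb's Depends field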

Further, distributions often change the dependencies, add/remove patches, and even change the names of packages they port from other distros to play well on their systems. The name of the Xorg server package, for example, is different on many distributions, and each distro uses different distro-specific patches.

Could this be made simpler? Yes... but the distros hide this complexity from the user. It is mainly more complex for the distributor and its packagers, not for the end users.

Speaking of which, this whole distributed vs. centralized package management debate represents a tradeoff--shifting work between upstream and the distributor. With distributed packages, the burden is on the upstream developer to get their package working on as many distros as possible. With centralized packaging, the burden is on the distributor to get as many upstream packages to work on their distro.

Guess what? Developers hate packaging! They want their job to be done as soon as their source tree builds and runs properly. Distributors, on the other hand, are essentially packaging machines. Packaging is what they do best. Why not leave things as they are? Let the developers code, and let the distributors package.

Reply Score: 5

RE[2]: B.A.D idea
by de_wizze on Tue 16th Jan 2007 05:10 UTC in reply to "RE: B.A.D idea"
de_wizze Member since:
2005-10-31

Because the distributors sometimes take a long time to package. It should be as simple for developers to package that properly running source tree as tarring the folder. Notice that what is described in the article simply generates instructions about what is provided/needed in terms of files and actions.

Reply Score: 3

RE[3]: B.A.D idea
by archiesteel on Tue 16th Jan 2007 05:34 UTC in reply to "RE[2]: B.A.D idea"
archiesteel Member since:
2005-07-02

Because the distributors sometimes take a long time to package. It should be as simple for developers to package that properly running source tree as tarring the folder. Notice that what is described in the article simply generates instructions about what is provided/needed in terms of files and actions.

There's nothing preventing developers from packaging their applications as statically-linked binaries *and* having the same apps packaged by distros at the same time. That way those who want the latest and greatest can download them directly from the developer's web site, and those who prefer to wait for the packages to be available in the repos can do that as well.
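A quick way to check whether such a download really is self-contained (the path is made up):

$ ldd ./downloaded-app
        not a dynamic executable   # fully static: it needs no distro libraries

A normal distro-packaged build would instead print a long list of libfoo.so dependencies.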

You shouldn't have to choose between the two; you should be able to use both as you wish. The only thing I can see being a bit harder is managing system menus, but with freedesktop.org that's not much of an issue anymore.

Reply Score: 3

RE[4]: B.A.D idea
by raynevandunem on Tue 16th Jan 2007 06:33 UTC in reply to "RE[3]: B.A.D idea"
raynevandunem Member since:
2006-11-24

Yeah, but can you install that app from the developer's site with a GUI front-end if you wanted to?

Anybody with command-line prowess can pick either solution if they wanted to, while those who don't, but still want the latest and greatest just as much as the next person, will only have one solution available to them.

And unfortunately, that's the majority of the folks who use desktop distros such as Ubuntu....or at least the target audience that those distros are gunning for.

Reply Score: 1

RE[5]: B.A.D idea
by archiesteel on Tue 16th Jan 2007 07:02 UTC in reply to "RE[4]: B.A.D idea"
archiesteel Member since:
2005-07-02

Yeah, but can you install that app from the developer's site with a GUI front-end if you wanted to?

I'm not sure you read what I posted correctly. I said that developers can provide standalone GUI installers or use one of the other (GUI) methods that use statically-linked libraries. I was not talking about compiling the apps...

but still want the latest and greatest just as much as the next person

See, that's where you get it wrong. Ordinary users don't want the latest and greatest, they want apps that work well. Constantly getting the latest version is something that Windows geeks do (and the same holds true for ex-Windows geeks using Linux, such as me).

When you say that everyone wants to run the latest and greatest, I believe you are projecting your own preferences onto the average user, and that you are mistaken in doing so.

Mind you, as I said I think developers should release statically-linked GUI installers in addition to tarballs, and let the distro makers update their repositories in all due time.

Also, as I've pointed out many times over the past few days, the Ubuntu repos are *very* up-to-date. If one cannot wait a day or two for an app to be released, then perhaps one needs to get a life... :-)

Reply Score: 4

RE[6]: B.A.D idea
by John Nilsson on Tue 16th Jan 2007 12:01 UTC in reply to "RE[5]: B.A.D idea"
John Nilsson Member since:
2005-07-06

Ubuntu's repositories may be great for mainstream apps, but as soon as you approach the corners of the known space, such as new apps or apps with very specific target groups, the quality of service quickly drops.

Reply Score: 2

RE[7]: B.A.D idea
by archiesteel on Tue 16th Jan 2007 16:03 UTC in reply to "RE[6]: B.A.D idea"
archiesteel Member since:
2005-07-02

Yes, which is why I say that developers of such apps should use one of the statically-linked, distro-neutral packaging options at their disposal.

Reply Score: 2

RE[6]: B.A.D idea
by tom1 on Tue 16th Jan 2007 13:02 UTC in reply to "RE[5]: B.A.D idea"
tom1 Member since:
2005-07-12

See, that's where you get it wrong. Ordinary users don't want the latest and greatest,

Not as a rule, no. But when they ask for help "Evolution crashes when I try to read my mail from {odd mail server}!" they get told "Try version X - it's fixed there".

Should the distribution package the new barely-tested version? No. Most users want the stable version. But the user experiencing the crashes wants the new version.

Reply Score: 2

RE[7]: B.A.D idea
by archiesteel on Tue 16th Jan 2007 16:10 UTC in reply to "RE[6]: B.A.D idea"
archiesteel Member since:
2005-07-02

That's a good point. I guess that, with the repository system, it ultimately depends on how fast the packagers are at implementing bugfix versions into the main distro. So far with Ubuntu I've been lucky...

Reply Score: 2

RE[6]: B.A.D idea
by Lambda on Tue 16th Jan 2007 16:55 UTC in reply to "RE[5]: B.A.D idea"
Lambda Member since:
2006-07-28

See, that's where you get it wrong. Ordinary users don't want the latest and greatest, they want apps that work well.

And see, that's where you get it wrong. The latest and greatest "developer release" tends to be better than the previous version. And the latest and greatest developer release tends to lag behind the repository version.

Hell, in Ubuntu you're in this bizarro world of various degrees of instability before a real release, and then nothing else besides security updates.

Reply Score: 3

RE[7]: B.A.D idea
by archiesteel on Tue 16th Jan 2007 19:59 UTC in reply to "RE[6]: B.A.D idea"
archiesteel Member since:
2005-07-02

And see, that's where you get it wrong. The latest and greatest "developer release" tends to be better than the previous version.

Yes, but often (in the Linux world) those "latest and greatest" are beta versions, which may introduce breakage, especially if they depend on lots of other packages.

And the latest and greatest developer release tends to lag behind the repository version.

I think you probably meant the opposite...

Again, the amount of lag varies. As someone who likes to try out new versions, I find that with Ubuntu the delays are acceptable.

Hell, in Ubuntu you're in this bizarro world of various degrees of instability before a real release, and then nothing else besides security updates.

Not so. Just add the "Backports" repository to Synaptic, and you'll get newer versions of apps for your stable distro.
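For those doing it by hand, it's one line in /etc/apt/sources.list (release name assumed; adjust for your version):

deb http://archive.ubuntu.com/ubuntu edgy-backports main restricted universe multiverse

then reload in Synaptic or run 'sudo apt-get update'.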

And there are not "various degrees of instability". There is only the current release and the next (unstable) release. I think you're confusing Ubuntu with Debian here...

Reply Score: 3

RE[2]: B.A.D idea
by draethus on Tue 16th Jan 2007 09:22 UTC in reply to "RE: B.A.D idea"
draethus Member since:
2006-08-02

Guess what? Developers hate packaging!

Yes, mostly because they have to build one package per version of each distro. On Windows, they build one package altogether.

Distributors, on the other hand, are essentially packaging machines. Packaging is what they do best.

Not by a long shot. A frightening amount of packaging is done by people who don't know what they're doing. Wine, for example, comes with wineserver, a per-user server that handles things like inter-process synchronization, which is started on-demand by wine and exits when wine itself exits. A while back, some distro put wineserver in an initscript (it's a server, right?) and tried to run it on startup, as root!

As the Autopackage people put it, it makes no more sense for a distro to do packaging than for them to do artwork and GUI design for the app.

Why not leave things as they are? Let the developers code, and let the distributors package.

Developers need feedback from users, and getting feedback from a version you released 6-12 months ago is worse than useless.

Distributors produce one package for one version of one distro - an absolute vendor lock-in.

Distributors have to put in extra work packaging, testing, dealing with bug reports - and that's for each distro. Tonnes of wasted effort being the middle man.

Making a DEB or RPM is an absolute waste of your time - it works today but not tomorrow, it works on this box but not on that one.

The sad thing is, solutions have existed for years now, and end users love them (the Autopackage website gets 500-1000 hits per day), but distros would rather die than implement them.

Reply Score: 5

RE[2]: B.A.D idea
by John Nilsson on Tue 16th Jan 2007 11:56 UTC in reply to "RE: B.A.D idea"
John Nilsson Member since:
2005-07-06

Another approach is to work on a proto-package format: a standard whose sole purpose is to do as much distro-independent work as possible upstream.

There are a few of those today. We have the autotools ./configure; make; make install routine (sketched below), but it's arguably not very maintainable in the long run, and not that user-friendly.
Debian is turning into a proto-distro.
Gentoo was a "meta"-distribution from the start.

So instead of focusing all effort on designing package systems that bypass distro efforts (Autopackage, klik, Zero Install, what have you), the effort should be spent on proto-package systems that make life easier for the user/developer (prosumer) AND the distributor.
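To be fair, autotools already gets part-way there with staged installs, which is exactly the hook most distro build tools script around (a minimal sketch; names invented):

$ ./configure --prefix=/usr              # work out a distro-independent build configuration
$ make                                   # build from source
$ make DESTDIR="$PWD/stage" install      # install into a staging tree instead of the live system
$ tar -C stage -czf myapp-1.0.tar.gz .   # the staging tree is raw material for any package format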

Reply Score: 3

RE[2]: B.A.D idea
by skroob on Tue 16th Jan 2007 20:11 UTC in reply to "RE: B.A.D idea"
skroob Member since:
2007-01-16

> Guess what? Developers hate packaging! They want their
> job to be done as soon as their source tree builds and
> runs properly. Distributors, on the other hand, are
> essentially packaging machines. Packaging is what they do best.

Often the problem is the packagers. They pack the stuff without giving any feedback to the developers. The first time you know there is a package for distro $foo is when somebody sends you a mail about a bug in the package.

Another thing is the delay. When I release something, I want my users to have the new version ASAP, not when they buy the next CD or when Hurd is ready.

Reply Score: 2

The new wheel is still round.
by FishB8 on Tue 16th Jan 2007 03:53 UTC
FishB8
Member since:
2006-01-16

These installers do the same thing as the centralized systems, but with all the voodoo contained within the file itself instead of coming from a centralized system.

There are two distinct disadvantages to this approach:

1) the files are "distro neutral", which just screams "statically linked";

2) since they are self-contained, there is no centralized means to update them if bugs are found or upstream dependencies change.

Re-inventing the wheel is great, but the new wheel has to be significantly better in order to convince people to use it. A distro-specific solution offers all kinds of advantages that "distro neutral" solutions just can't compete with.

Reply Score: 3

RE: The new wheel is still round.
by Tom5 on Tue 16th Jan 2007 08:25 UTC in reply to "The new wheel is still round."
Tom5 Member since:
2005-09-17

1) the files are "distro neutral", which just screams "statically linked"

Not necessarily. The article has a whole section on handling shared libraries:

http://osnews.com/story.php/16956/Decentralised-Installation-System...

since they are self-contained, there is no centralized means to update them if bugs are found or upstream dependencies change.

That doesn't follow at all. Zero Install, for example, lets you specify any number of 'feeds' for a single program. One might be 'upstream', another might be your distribution's security team.

Reply Score: 2

What Linux really needs at this point...
by mnem0 on Tue 16th Jan 2007 09:39 UTC
mnem0
Member since:
2006-03-23

What Linux really needs at this point is package generation built into the big IDEs. For instance, MonoDevelop and Eclipse each need a plugin that lets the programmer select a number of code projects in the workspace. Then, when you build this "packaging project" in the IDE, it will generate .rpm, source .rpm, .deb and .tar.gz packages automatically.

If it's THAT easy to generate all the different types of packages, then the devs will actually generate all of them, instead of what we have today, where the big projects (Mono etc.) typically provide their stuff in all the major packaging formats, while the smaller devs choose their favorite format (and ignore the rest).

Once all projects publish their stuff in (more or less) all the major packaging formats, distros (or users themselves) can stop by the project website and pick up the package for testing or install/usage.

This "auto packaging" plugin should also generate .msi files for Windows (and whatever Mac uses). At least this should be an *option* for Mono and Eclipse, which typically generate cross-platform programs. Then if someone creates a Linux-only program with Mono (or hates Windows), he will just not check the "Generate .MSI package" checkbox.

The key is: make it ridiculously easy for the devs to generate top-notch packages.

Reply Score: 2

static linking
by prymitive on Tue 16th Jan 2007 09:48 UTC
prymitive
Member since:
2006-11-20

Static linking of libs or apps will only make users complain about KDE apps not using KDE styles because they are linked against a different Qt version, and that is only one example.

Reply Score: 3

Make sure this article is read
by Fergy on Tue 16th Jan 2007 11:07 UTC
Fergy
Member since:
2006-04-10

To all the people commenting on this article: if you want to maximize the number of people who will read it, you should Digg it. If you don't have an account with Digg, get one.

Reply Score: 1

Multiple versions for multiple users
by anda_skoa on Tue 16th Jan 2007 11:22 UTC
anda_skoa
Member since:
2005-07-07

While the article's example of Alice and Bob having different GIMP versions installed is quite cool, I am wondering how this works when a new user account is created afterwards.

Say an account for user Diane is created after Alice installed GIMP 2.2 and Bob installed GIMP 2.4

Will Diane get 2.4?

What if Bob upgrades to 2.6? Will Diane keep 2.4 or be upgraded with Bob?

Assuming she will be upgraded as this would seem to be usually expected: how can she tell to remain on 2.4? Does she have to explicitly install 2.4 herself, even if it is already installed?

Reply Score: 2

tom1 Member since:
2005-07-12

While the article's example of Alice and Bob having different GIMP versions installed is quite cool, I am wondering how this works when a new user account is created afterwards.

Say an account for user Diane is created after Alice installed GIMP 2.2 and Bob installed GIMP 2.4

Will Diane get 2.4?


Assuming the sys admin hasn't set up anything, Diane doesn't get any version:

diane $ gimp
command not found: gimp

If she wants it, she goes to gimp.org and tells the computer "Run that program, Gimp 2.4". At that point, the system notices it is already present and runs the version Bob's added directly without downloading another copy.
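"Tells the computer" is a single command against the program's feed (the URL here is illustrative):

diane $ 0launch http://gimp.org/gimp-2.4.xml

If an implementation with the matching digest is already in the shared cache, nothing is downloaded; it just runs.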

Reply Score: 2

anda_skoa Member since:
2005-07-07

If she wants it, she goes to gimp.org and tells the computer "Run that program, Gimp 2.4". At that point, the system notices it is already present and runs the version Bob's added directly without downloading another copy.

I see; that makes sense, since both Gimp versions were installed by individual users.

Can a person with elevated rights, e.g. the administrator, install system-wide software, i.e. something all users see as installed?

I mean using Zero Install; obviously they can still use the system's package manager.

Reply Score: 2

tom1 Member since:
2005-07-12

Can a person with elevated rights, e.g. the administrator, install system-wide software, i.e. something all users see as installed?

I mean using Zero Install; obviously they can still use the system's package manager.


Sure, the admin just creates a launcher as /usr/local/bin/gimp (man 0alias shows how to create such a short-cut).
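Roughly, for the shortcut mentioned above (the feed URL is illustrative):

root # 0alias gimp http://gimp.org/gimp-2.4.xml   # writes a tiny launcher script, so anyone can type 'gimp'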

Or, the admin can add it to the system-wide defaults for the Applications menu (how to do that is desktop-specific).

Reply Score: 2

anda_skoa Member since:
2005-07-07

Sure, the admin just creates a launcher as /usr/local/bin/gimp (man 0alias shows how to create such a short-cut).

Ah, very good.

Or, the admin can add it to the system-wide defaults for the Applications menu (how to do that is desktop-specific).

Well, no. Any current desktop implements the freedesktop.org menu specification.

One can use xdg-desktop-menu to handle the installation of the .desktop file and make sure it is installed correctly (even for intentionally incompatible distributions like Mandriva).

Assuming one has, as with the application launcher, a cross-desktop specification for things like MIME type registration, icon sets, etc., can 0install handle this automatically (provided tools like the xdg-utils are available), or is this always a manual step?

Reply Score: 2

Great article, but...
by Moochman on Tue 16th Jan 2007 12:10 UTC
Moochman
Member since:
2005-07-06

This article was really well thought-out and delivered--it's not just some publicity piece. It's clear that the developer/author has taken a lot of time to think about installation and use cases, and is making the next generation of Linux installation technologies a reality. I especially like the idea that different versions of libraries should be matched to different versions of programs, albeit without the needlessly inefficient app-folder method. I wish the author the best of luck, and hope that Zero Install catches on!

However, one flaw I see in your implementation is the cryptographically-derived naming of folders. In the beginning of the article, you point out that non-hash-derived identifiers are much more easily user-readable, yet later on you claim that end-user "Alice" will be willing to go to the Gimp homepage, look up the appropriate hash and compare it to the hash-name of the folder that "Bob" installed on the hard drive. Yeah, right! Not only does that sound like the exact opposite of the user-friendly goals you set out with, but it's also incredibly insecure to assume user vigilance as a means of security! All the hash-naming of folders would serve to do is make the end user more confused.

Likewise, the certificate verification dialogue box doesn't seem too user-comprehensible or foolproof--especially considering that the user is told the database is "Unreliable"! A whiteboarding system (either independent or distro-specific) would be much more reliable, but of course then the system is practically as centralized as was supposed to be avoided!

It seems as though achieving ease-of-use, decentralization, and security all at once really is an elusive goal...

Reply Score: 3

RE: Great article, but...
by tom1 on Tue 16th Jan 2007 12:38 UTC in reply to "Great article, but..."
tom1 Member since:
2005-07-12

However, one flaw I see in your implementation is the cryptographically-derived naming of folders. In the beginning of the article, you point out that non-hash-derived identifiers are much more easily user-readable, yet later on you claim that end-user "Alice" will be willing to go to the Gimp homepage, look up the appropriate hash and compare it to the hash-name of the folder that "Bob" installed on the hard drive. Yeah, right!

I should have been clearer: the installation system does this on behalf of Alice. It gets the hash from the XML file describing the Gimp; all Alice has to do is find the link to the XML file.

Likewise, the certificate verification dialogue box doesn't seem too user-comprehensible or foolproof--especially considering that the user is told the database is "Unreliable"!

Right. Ideally, there should be multiple feeds for this information. Currently, there's only mine, which is "unreliable" because I don't have the resources to check out people's keys or offer any compensation if I'm wrong.

This is certainly an area where a commercial company could add value, but without having to start their own distribution (as they'd have to do now).

Reply Score: 2

RE[2]: Great article, but...
by Moochman on Tue 16th Jan 2007 22:17 UTC in reply to "RE: Great article, but..."
Moochman Member since:
2005-07-06

I should have been clearer: the installation system does this on behalf of Alice. It gets the hash from the XML file describing the Gimp; all Alice has to do is find the link to the XML file.

Aha, OK. Although I still feel that having some sort of real-name identifier there would help the administrator find the folder in case of a problem. Why can't the directory name be the program name (as opposed to the URL), e.g. gimp-2.3, so it can be installed off a CD? I don't understand what makes naming the folder after a hash more secure than, say, storing the hash information in a separate protected file within the program's folder.

Right. Ideally, there should be multiple feeds for this information. Currently, there's only mine, which is "unreliable" because I don't have the resources to check out people's keys or offer any compensation if I'm wrong.

Aha. Cleared up, although as it is now it's clearly not a long-term solution. My only gripe with the model is that in an office setting, users might not be able to install any software they wanted, since they would probably be locked into a list of software on a predefined whiteboard server. However, I suppose that the problem of "missing software" would be much rarer than in today's repository model, since hosting such a whiteboard server that certifies URLs rather than individual versions of programs is undoubtedly much easier than having to package and test every new iteration of software by hand.

Reply Score: 1

RE[3]: Great article, but...
by Tom5 on Wed 17th Jan 2007 21:33 UTC in reply to "RE[2]: Great article, but..."
Tom5 Member since:
2005-09-17

Why can't the directory name be the program name (as opposed to the URL), e.g. gimp-2.3, so it can be installed off a CD? I don't understand what makes naming the folder after a hash more secure than, say, storing the hash information in a separate protected file within the program's folder.

OK, so Alice puts a CD in the drive with 'gimp-2.3.tgz' on it. How can the system know that it's genuine? Anyone could make such a CD. The system can't tell, so it can't share it with Bob.

But, if the CD contains an archive called 'sha256=XYZ.tgz', then it can be put in the shared directory. The system can see that it's correct just by comparing the archive's contents to its name.
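A toy version of that check (deliberately simplified: the real manifest digest also covers file names, permissions and symlinks, not just concatenated file contents):

$ cd /shared-directory
$ for d in sha256=*; do
>   sum=$(cd "$d" && find . -type f | sort | xargs cat | sha256sum | cut -d' ' -f1)
>   [ "sha256=$sum" = "$d" ] && echo "$d: OK" || echo "$d: does not match its name"
> done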

Reply Score: 1

RE[4]: Great article, but...
by Moochman on Thu 18th Jan 2007 08:03 UTC in reply to "RE[3]: Great article, but..."
Moochman Member since:
2005-07-06

Hmm... while your solution does sound perhaps a bit more elegant, I still don't see why the system couldn't extract an identifier text file from the archive and then compare it to the archive's contents. Also, just to be clear: comparing the hash to the contents wouldn't do a thing to ensure security all by itself; it would also need to be compared to the hash at the project's website, right? Otherwise anyone could create malware and provide a hash to match it, but make it look like normal software. Furthermore, don't the archive contents have to be re-analyzed every time you want to verify their authenticity? So given that the website needs to be accessed and the hash needs to be recalculated in any case, couldn't we just skip the step with the local copy of the hash?

To rephrase: the hash being stored in the local filesystem does very little to ensure the integrity of the program. Only checking the folder's contents against an online hash of those contents can ensure the program's security, which effectively renders the local copy of the hash useless.

Or am I missing something?

Reply Score: 2

RE[5]: Great article, but...
by Tom5 on Thu 18th Jan 2007 17:52 UTC in reply to "RE[4]: Great article, but..."
Tom5 Member since:
2005-09-17

Otherwise anyone could create malware and provide a hash to match it, but make it look like normal software.

Yes, just because something is in the shared directory doesn't mean it's safe to run it. One reason why unfriendly names are OK here is that you really don't want users browsing around running things that just look interesting!

Furthermore, don't the archive contents have to be re-analyzed every time you want to verify their authenticity?

No, that's why you have the privileged helper. It checks the digest once and then adds it. So, if you see a directory called:

/shared-directory/sha256=XXXXXXX

then you don't have to calculate the XXXXXXX bit yourself. If it didn't match, it wouldn't have been allowed in.

BTW, you don't need to use the web to check the hash. It may be that Alice and Bob both trust the CD (in which case they get to share the copy on it). Denise doesn't trust the CD, so she checks with the web-site instead (and will share the copy only if it matches).

Reply Score: 1

RE[6]: Great article, but...
by Moochman on Thu 18th Jan 2007 22:30 UTC in reply to "RE[5]: Great article, but..."
Moochman Member since:
2005-07-06

>>No, that's why you have the privileged helper. It checks the digest once and then adds it. So, if you see a directory called:

/shared-directory/sha256=XXXXXXX

then you don't have to calculate the XXXXXXX bit yourself. If it didn't match, it wouldn't have been allowed in. <<


Is that "sha256" there before the XXXXXXX the name of the program? If so, then my fears are allayed--the name is user-readable and the user could find the program without relying on a third-party program/database.

However, I don't get the point of talking about how "this is safe if you see the hash" (or even, "this is safe if the system sees the hash"), since you say the "privileged helper" is supposed to prevent unsafe items from being in there in the first place. So malware shouldn't be able to find its way in, period, regardless of whether it has a hash directory name or not.

Which actually seems to contradict the following statement:

>>Yes, just because something is in the shared directory doesn't mean it's safe to run it.

What does that imply--that the privileged helper really isn't that effective after all? If the shared folder isn't secure, what's to prevent malware from just being copied in there, in a folder named after a hash that was generated using the same public algorithm you make available to all software publishers?

**Unless**.... every user had a unique encryption key for their individual system, so hash files generated on the system were unable to be generated ahead of time by malware authors. That would be a very secure mechanism. In that case you'd need two separate hashes--one stored online (or on CD), that is checked at install time to verify the software, and a different one on the local system, encrypted using the local personal encryption key, which shows that the file has been verified once already. That would effectively be spoof-proof, since a simple copy procedure would never be able to imitate a true Zero Install-based installation of a program. And if the user were to somehow lose their local encryption key through an OS upgrade or some such thing, no problem, they can just reinstall the software using a new key. (Actually, this kind of bears a remarkable similarity to evil system-bound DRM schemes, but it serves a much less evil purpose, since in this case it's not locking the user out of installing the software they want, it's only locking out the software they never asked for.)

>>One reason why unfriendly names are OK here is that you really don't want users browsing around running things that just look interesting!

Security by obscurity doesn't seem like much of an answer. I'd much prefer to keep my directory tree user-navigable over turning it into the filesystem equivalent of the Windows registry.

And I still can't think of a reason that any system would require the hash to be stored in the directory name.

Reply Score: 2

RE[7]: Great article, but...
by Tom5 on Sat 20th Jan 2007 11:08 UTC in reply to "RE[6]: Great article, but..."
Tom5 Member since:
2005-09-17

you say the "privileged helper" is supposed to prevent unsafe items from being in there in the first place.

No, I said that things can only have their real name. 'unsafe' is a subjective term; you can't expect the computer to enforce the rule "No unsafe software to be installed in this directory". Different users might even disagree on whether something is unsafe.

If the shared folder isn't secure, what's to prevent malware from just being copied in there, in a folder named after a hash that was generated using the same public algorithm you make available to all software publishers?

OK, take ROX-Filer version 2.5 (Linux x86 binary) for example. It has this hash (and, therefore, directory name):

sha1=d22a35871bad157e32aa169e3f4feaa8d902fdf2

You're quite free to change it in some way and add your malicious version to the shared directory too. BUT, changing it will change the hash, so your evil version might be called:

sha1=94fd763dfe509277763df47c53c38fc52866eaf4

You can't make your version appear under the original's name, because the name depends on its contents.

And I still can't think of a reason that any system would require the hash to be stored in the directory name.

It depends what you want it for. I think you are thinking about this scenario:

"Alice is bored. She wants to run something, so she has a look in the shared directory to see what other users have been running. Noticing a directory called 'gimp-2.4', she decides to run it, first checking at gimp.org that its hash matches the one on the web-site."

That works fine with the hash inside the directory, but it's not the case Zero Install is aimed at. Here's our target scenario:

"Alice needs to edit some photos. She goes to gimp.org and asks to run Gimp 2.4. She (her software) looks for an existing directory with the same hash and finds the copy Bob installed earlier."

Notice that the question we're trying to answer isn't "does this directory have hash XXX?", but "where is the directory with hash XXX?"

Reply Score: 1

BAD idea or not
by ameasures on Tue 16th Jan 2007 13:02 UTC
ameasures
Member since:
2006-01-09

Many years ago my first Linux was FT-Linux (and I regret losing the CD). It installed a base system and whatever packages it was instructed to. HOWEVER, it also ran all the other stuff on the CD directly from the CD, with some caching.

FT was great to live with, right from the install. Having just installed Ubuntu this week, I am forever going back to Synaptic to pull in one tool or another.

Saying it is a BAD idea because it is difficult makes no sense to me. Humans apply ingenuity to difficult problems to invent solutions - that's what we do!

Let's see it in practice and then decide!

Reply Score: 2

RE[2]: yay!
by twenex on Tue 16th Jan 2007 13:16 UTC
twenex
Member since:
2006-04-21

I couldn't agree more. These days I laugh when people say such-and-such software is "intuitive" or "unintuitive", especially if such software is Windows- or Mac-based. To add to your examples, what's "intuitive" about dragging a disc icon to a trashcan to eject it (it should delete all files/quick format/fully format the disc) or dragging an icon to an /Apps folder to install it? (To a Unix user, if the "app" (binary) is dragged to /Apps, then the libraries should be dragged to /Libraries, etc.) And you're right: as a Gentoo user, the "intuitive" way of installing foo is to emerge foo.

Reply Score: 3

RE[3]: yay!
by Morin on Tue 16th Jan 2007 13:40 UTC in reply to "RE[2]: yay!"
Morin Member since:
2005-12-31

> These days I laugh when people say such-and-such software is
> "intuitive" or "unintuitive", especially if such software is Windows- or
> Mac-based. To add to your examples, what's "intuitive" about dragging
> a disc icon to a trashcan to eject it (it should delete all files/quick
> format/fully format the disc)

I agree with that, but let me add that next to each ejectable drive icon in Finder there is an eject button (labeled with the same symbol as the eject button on VCRs). I always use this button because I'd call it intuitive (and thus easy to remember), while the trashcan gesture is non-intuitive (I didn't even remember it until now). What is still non-intuitive about the eject button is that it is located in Finder (or rather, that disks appear at several points in the UI at all).

> or dragging an icon to an /Apps folder to install it?

Why, that sounds very reasonable to me.

> (To a Unix user, if the "app" (binary) is dragged to /Apps, then the
> libraries should be dragged to /Libraries, etc).

I haven't had to install additional libraries on my Mac yet, but that's exactly what I thought I'd have to do. At least that sounds most reasonable to me.

> And you're right: as a Gentoo user, the "intuitive" way
> of installing foo is to emerge foo.

In "my dream OS", the package manager is no magic piece of voodoo, but simply a front-end to organize /Applications and /Libraries (to manage the vast amount of stuff you'd find there), and to access online repositories easily, to download and verify packages and move them to those folders. In other words: a tool, and not a wizard.

Reply Score: 3

RE[3]: yay!
by Moochman on Tue 16th Jan 2007 22:47 UTC in reply to "RE[2]: yay!"
Moochman Member since:
2005-07-06

You have a point--"intuitive" has a lot to do with what you're used to. And (perhaps unfortunately) most people are used to Windows and/or Macs, which means that borrowing interface elements from those OSes will result in a higher percentage of the general population being able to find their way around. Hence, the elements that make up Windows and Mac interfaces are more "intuitive" for the general population.

Since the use case described seems to be targeting desktop-Linux end-users in office environments, I'd imagine that they wouldn't want to make the interface *too* unfamiliar. That said, the Windows paradigm of having to download an executable, then find it, double-click it and click through "Next" a bunch of times is hardly what I would call a good interface in anyone's book.

Reply Score: 3

RE: Make sure this article is read
by twenex on Tue 16th Jan 2007 13:23 UTC
twenex
Member since:
2006-04-21

"If you don't have an account with digg, get it."

Jawohl, mein Kommandant.

(That, btw, is German for "no, thankyou.")

Reply Score: 3

Excellent
by siki_miki on Tue 16th Jan 2007 13:55 UTC
siki_miki
Member since:
2006-01-17

I agree with most of the article's points.
However, let me note some outstanding issues I'd like to see solved (I don't know how much of this is already supported in 0install, though):

-System dependencies:
What if an app requires a specific daemon version (with a specific protocol version), which is single and root-owned on the system, and that version isn't provided by the distro? The answer would be to avoid this kind of dependency, or to advise packagers not to aim too high with version dependencies. It would be useful to have a way to try to fool the program so that it at least installs and tries to run with an older version. An area I even fear to mention is kernel features, init & udev scripts, etc. These kinds of packages are probably better handled by distro-specific packaging.

-Scripts are an issue. Obviously autotools' or deb/rpm scripts can't be used directly, as they can mess up the system. They need to be modified to create the specific temporary environment in which the zero-installed app is supposed to run. The same goes for the library environment: when an app which needs a specific library is run, it should also execute any script required to set up the environment for that version of the library (SDL env. variables being an example - see the sketch at the end of this comment).

-Configuration: the user should be able (via a friendly GUI) to change which library version an app runs with by default (whether it then works is another problem). They should also be able to select the 'variant' of a package they want, as provided by different authors, such as distros, contributors or the original code authors. It should also be possible to set defaults for which variant the automatic installer will favour, including blacklist & whitelist support. Another desirable ability is a specifically tweaked lib/app version for a given distro, if the 'vanilla' code is known to cause compatibility problems.

-Source. Users need the ability to have custom-compiled software installed into their home directory, but registered in the package database, or in an 'overloading' local database. Perhaps autotools & co. should support this for ease of use.

-Local untrusted install. If a user wants a version of some software or library from a source which is new/unknown, they should be able to install it into their home directory. Once root starts trusting this source, it can be moved to a shared repository.

-Binary compatibility. Another big topic which can make cross-distro compatibility harder. The Autopackage site has some interesting info about it.

-Optional dependencies. This needs support from multiple parties: linker support, packaging-system support, and finally properly designed applications which will not malfunction if an "optional" library isn't present, but scale down their functionality.

-Multiarch. Although not mentioned, I hope x86 on x86-64, and packages with special compile options (MMX, SSE/SSE2/SSE3, etc.), are supported concurrently by Zero Install.

Altogether, this system seems the best of the few devised to solve Linux packaging fragmentation (and dependency hell). I certainly hope that distributions will support it as an alternative install system, as it shouldn't be too invasive an addition. It certainly sounds like a good way to distribute newer app versions to older distributions, as well as various third-party stuff which isn't in distro repositories (perhaps even commercial software). Not that deb & rpm are bad, but they don't solve many of the above issues.
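
On the scripts/environment point above, the per-version environment setup could be as small as a wrapper like this - a hypothetical Python sketch, where the cache path and the SDL variables are made up for illustration, not how 0install actually lays things out:

    #!/usr/bin/env python3
    """Hypothetical launcher: set up the environment for one specific
    library version, then replace this process with the real app."""

    import os
    import sys

    # Imaginary per-version library cache: one directory per (lib, version).
    SDL_DIR = os.path.expanduser("~/.cache/libs/sdl-1.2.11")

    def run(app, *args):
        env = dict(os.environ)
        # Put the chosen library version first, so the dynamic linker
        # resolves against it rather than the system-wide copy.
        old = env.get("LD_LIBRARY_PATH")
        env["LD_LIBRARY_PATH"] = SDL_DIR + "/lib" + (":" + old if old else "")
        # A library-specific tweak, e.g. pinning the SDL video driver.
        env["SDL_VIDEODRIVER"] = "x11"
        os.execvpe(app, [app] + list(args), env)

    if __name__ == "__main__":
        run(sys.argv[1], *sys.argv[2:])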

Edited 2007-01-16 14:00

Reply Score: 3

RE: Excellent
by Tom5 on Wed 17th Jan 2007 22:24 UTC in reply to "Excellent"
Tom5 Member since:
2005-09-17

Source. Users need the ability to have custom-compiled software installed into their home directory, but registered in the package database, or in an 'overloading' local database. Perhaps autotools & co. should support this for ease of use.

Exactly. This is what the 'Register' button does in the compile screenshots:

http://rox.sourceforge.net/desktop/node/360

-Multiarch. Although not mentioned, I hope x86 on x86-64, and packages with special compile options (MMX, SSE/SSE2/SSE3, etc.), are supported concurrently by Zero Install.

It currently just uses the machine name and OS from uname. You could easily get it to prefer '-mmx' binaries if available, though (edit arch.py; there's a table listing compatible architectures in order of preference for the current machine type).
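
To illustrate the idea, the kind of table being described might look roughly like this - a simplified, made-up sketch, not the actual contents of arch.py:

    # Simplified, illustrative architecture-preference table; the entries
    # and ranks here are invented, not copied from arch.py.
    import platform

    # For each machine type: compatible binary architectures, best first
    # (lower rank wins). An '-mmx' build would simply get a better rank.
    machine_ranks = {
        "x86_64": {"x86_64": 0, "i686": 1, "i586": 2, "i486": 3, "i386": 4},
        "i686":   {"i686": 0, "i586": 1, "i486": 2, "i386": 3},
    }

    def rank(binary_arch):
        """Preference rank of a binary on this machine (lower is better)."""
        host = platform.machine()
        table = machine_ranks.get(host, {host: 0})
        if binary_arch not in table:
            raise ValueError(binary_arch + " binaries won't run on " + host)
        return table[binary_arch]

    # Pick the best of the builds an interface actually offers:
    available = ["i386", "i686"]
    print(min(available, key=rank))  # 'i686' on an i686 or x86-64 machine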

Reply Score: 1

RE[4]: yay!
by twenex on Tue 16th Jan 2007 14:39 UTC
twenex
Member since:
2006-04-21

I agree with that, but let me add that next to each ejectable drive icon in Finder there is an eject button (labeled with the same symbol as the eject button on VCRs). I always use this button because I'd call it intuitive (and thus easy to remember), while the trashcan gesture is non-intuitive (I didn't even remember it until now). What is still non-intuitive about the eject button is that it is located in Finder (or rather, that disks appear at several points in the UI at all).

Well, the last time I used a Mac, which was admittedly several years ago, the "eject disk" option in the Special menu caused the computer to immediately eject the disk, then immediately ask for it back - and this was in the days when Macs, in the interests of "making computers behave like an appliance", were completely closed - no expansion cards, no nothing.

In "my dream OS", the package manager is no magic piece of voodoo, but simply a front-end to organize /Applications and /Libraries (to manage the vast amount of stuff you'd find there), and to access online repositories easily, to download and verify packages and move them to those folders. In other words: a tool, and not a wizard.

The point I was making is that, if I didn't know what went into making software, /Apps would be just as much "a magic piece of voodoo" to me as you seem to imply "emerge foo" is to you - no more, no less. I agree that it would be nice if you could drag stuff to /Apps and have it place stuff in /Libraries, etc. - or, if already installed, not - but I don't see any inherent advantage of this over emerge or, perhaps, graphical front-ends to emerge. Not better, just different.

Reply Score: 3

Bingo
by Lambda on Tue 16th Jan 2007 16:39 UTC
Lambda
Member since:
2006-07-28

Do we need this many people working on essentially the same task? Are they bringing any real value? Without the Fedora packager, Fedora users wouldn't be able to install Inkscape easily, of course, so in that sense they are being useful. But if the main Inkscape developers were able to provide a package that worked on all distributions, providing all the same upgrade and management features, then we wouldn't need all these packagers.

I haven't read the whole article yet, but I can see the author is getting at what I've said on OSnews numerous times. Distros are a form of lock-in, and more importantly a suboptimal allocation of scarce developer (package maintainer) resources.

Reply Score: 4

RE: Bingo
by Finalzone on Tue 16th Jan 2007 20:00 UTC in reply to "Bingo"
Finalzone Member since:
2005-07-06

I have to disagree with that statement, for two reasons. First, a source package is available on most distributions, providing a spec file (used to build the binary packages) and the tarball from the upstream developers (Inkscape in this example).
Second, a package needs to be reviewed to validate the license, check security (key word), make sure the installation goes to the right path, and more. Since each distribution has different criteria for packaging an application, having another package manager from upstream, like Autopackage, won't solve these issues.

Reply Score: 2

RE[2]: Bingo
by Lambda on Wed 17th Jan 2007 08:16 UTC in reply to "RE: Bingo"
Lambda Member since:
2006-07-28

Second, a package needs to be reviewed to validate the license

No it doesn't. The distro only worries about licenses for software it distributes. They would be out of the loop.

Reply Score: 2

bravo
by 25bravo on Tue 16th Jan 2007 19:24 UTC
25bravo
Member since:
2006-01-04

I'm all for this kind of solution. I hope it expands to be more robust. It would be nice if it could integrate into the update managers of various distros. It would also be nice if it integrated bittorrent.

All we need now is for a distro to use it as default. The big guys, like Ubuntu, probably won't cave in. But a new kid on the block could probably implement it. Imagine how much smaller a distro team would need to be, if they outsourced the packaging to the devs.

But one question hanging in my mind is compatibility conflicts between different kernels. Alice couldn't use two different programs that rely on two different kernels at the same time, could she? Or, take VMware for instance: what if specific kernel modules need to be installed? User-Mode Linux?

I personally think that Linux needs to be more modularized. Drivers need to be pushed into user-space. But, in the meantime, how can something like 0install be a full end-to-end solution? Would there have to be at least _some_ alternative package manager to handle system-critical software?

Reply Score: 2

RE: bravo
by Moochman on Tue 16th Jan 2007 22:24 UTC in reply to "bravo"
Moochman Member since:
2005-07-06

All we need now is for a distro to use it as default. The big guys, like Ubuntu, probably won't cave in. But a new kid on the block could probably implement it. Imagine how much smaller a distro team would need to be, if they outsourced the packaging to the devs.

Ulteo?

I personally think that Linux needs to be more modularized. Drivers need to be pushed into user-space.

Agreed, but I doubt it'll happen anytime soon, given the sentiments of the kernel devs. Actually, I wouldn't care about the drivers being kernel-space if it weren't for the fact that it makes installing them much more complicated than it should be... (For instance, why do you think people so often resort to ndiswrapper, even for wireless chipsets that Linux is "supposed to" support?)

But, in the meantime, how can something like 0install be a full end-to-end solution? Would there have to be at least _some_ alternative package manager to handle system-critical software?

I'd say the answer to that is a near-indisputable yes.

Edited 2007-01-16 22:34

Reply Score: 2

the problem is...
by Obscurus on Wed 17th Jan 2007 00:02 UTC
Obscurus
Member since:
2006-04-20

... that every single distribution of Linux is effectively a unique Operating System - it would be like having several hundred versions of Windows or OSX, each with varying levels of binary compatibility. This makes it very difficult for developers, especially those who have valid reasons for keeping their source code closed, to write software for Linux. This is why most commercial software written for Linux is written for Red Hat, Suse or occasionally Debian - as far as commercial software houses are concerned, there are no other distros, and for most, Red Hat is the only distro (Red Hat=Linux, all other distros != Linux in their view).

Package management tools like APT are a good idea for managing core OS updates, but a really bad idea for the installation of apps.

This is one area where Windows gets it pretty right - the OS comes bundled with very few applications, and the apps it does come with are all written and maintained by the vendor. MS manages the core OS and these packages with its own package manager (Windows Update). Many apps you install on Windows have their own automatic update features, etc.

Where Linux distros get it wrong is by bundling sometimes thousands of apps with the OS, and then trying to manage them, which is simply impossible to do properly in a centralised way. It also has the potential to backfire on developers and users alike - say you install Gimp for the first time on one distro and it is horribly buggy, due to poor packaging. Your first impression of the program is not very good, and most people don't give things another chance, so that user is going to move on to something else, even though the original developer had nothing to do with the bugs that were created by a poorly packaged distro. Thereby one program gets a bad rap because of one distro. The ISV should shoulder the responsibility for the app, and get the kudos as well - not the distributor/packager.

Linux distros need to stop trying to bundle 374 types of kitchen sink in the distro, and instead focus on shipping a small, stable, compact Operating System with a limited set of basic software (browser, media player, text editor, file manager, image viewer and perhaps an email client), and create a stable binary platform for ISVs to be able to simply make one package that will work for all distros.

Until that happens, Linux is unlikely to make its way onto a much bigger percentage of PCs than at present, simply due to the fragmentation and confusion generated by having too many distros and too many apps that do basically the same thing in slightly different ways.

For me, the ideal Linux distro is one that comes with a limited set of core functionality, an uncluttered, simple but powerful and tightly integrated GUI (eg XFCE, which I am very fond of), and the ability to install signed drivers and software from the Vendor's official website. It should not take up more than a couple of hundred megabytes.

I like to have a single program that has a comprehensive feature set that does one thing well, rather than using 15 different programs to perform only part of a task at a time.

Since I rarely have more than a dozen or so apps installed over my base OS (at this point on WinXP as my main OS), I don't need any tools to manage my installed software, and since I use tools with comprehensive functionality, I don't foresee any need to change this by installing hundreds of little bits and pieces.

Linux as a whole needs to simplify, rationalise and become more organised and integrated to really take off in the way many of us would like it to. Linux the OS needs to separate itself from the apps that run on it, and at the same time create a stable platform for ISVs to create software that can be distributed separately in binary (or source) form and Just Work® on any distro.

Reply Score: 2

RE: the problem is...
by twenex on Wed 17th Jan 2007 01:12 UTC
twenex
Member since:
2006-04-21

Oh boy. Where to start?

"[The problem is]... that every single distribution of Linux is effectively a unique Operating System - it would be like having several hundred versions of Windows or OSX, each with varying levels of binary compatibility.

Er, except not. The differences between distros of Lovely Linux that use the same package managers (and most use only one of three or four package managers) are an order of magnitude smaller than the differences between Wonderful Windows versions or Mac OS <9 and Mac OS X (Can't speak for versions of OS X as I've not used many).

This makes it very difficult for developers, especially those who have valid reasons for keeping their source code closed, to write software for Linux.

I've not seen a valid reason for keeping source closed yet, unless maybe it's national-security-related. (And even then, as in the recent spats between the US and UK over their fighter-jet software, some degree of openness may be required). If you can introduce me to one, however...

This is why most commercial software written for Linux is written for Red Hat, Suse or occasionally Debian - as far as commercial software houses are concerned, there are no other distros, and for most, Red Hat is the only distro (Red Hat=Linux, all other distros != Linux in their view).

Maybe so; however, I've seen commercial software run on Gentoo. I suspect you could get it working on Slackware with no more effort than a "Slackwearer" [sic] would be able to handle. I suspect you will find that the reasons so many software houses act as if Red Hat = Linux are:

1. That for many moons (due to restrictions on trademark use) it was indeed packaged by third parties as simply "Linux";

2. That in the business sector it outweighs use of SuSE, its nearest competitor, by a ratio of about 8 to 2.

Package management tools like APT are a good idea for managing core OS updates, but a really bad idea for the installation of apps.

Why?

This is one area where Windows gets it pretty right - the OS comes bundled with very few applications, and the apps it does come with are all written and maintained by the vendor.

On the contrary, it's all wrong; Magical Microsoft OSes are barely usable on a clean install (ok, technically the OS's are barely usable with the computer fully loaded, but you know what I mean).

MS manages the core OS and these packages with its own package manager (Windows Update). Many apps you install on windows have their own automatic update features etc..

Which means you have a million and one different programs to search for when you (re)install the OS, and yet another update program screaming at you whenever one of the apps wants to be upgraded; if Lovely Linux systems did it this way, they would be criticised for having "a million and one different upgrade systems". (But unlike the there's-a-million-and-one-apps-to-do-the-same-thing argument, it would be valid.)

Where Linux distros get it wrong is by bundling sometimes thousands of apps with the OS, and then trying to manage them, which is simply impossible to do properly in a centralised way.

Well, Gentoo and Debian seem to manage this Herculean and impossible task. I'm not saying there aren't mistakes and problems; I'm saying that Windows is not the Wonderful Be-All-and-End-All the Worshippers at the Altar of Good Gates say it is.

It also has the potential to backfire on developers and users alike - say you install Gimp for the first time on one distro and it is horribly buggy, due to poor packaging. Your first impression of the program is not very good, and most people don't give things another chance, so that user is going to move on to something else, even though the original developer had nothing to do with the bugs that were created by a poorly packaged distro. Thereby one program gets a bad rap because of one distro. The ISV should shoulder the responsibility for the app, and get the kudos as well - not the distributor/packager.

Remember DLL hell? As for "one distro's package manager buggering up and casting doubt on the quality of all the available versions of software X," I take it for granted:

1. That people intelligent and inquisitive enough to be investigating Linux learn very early on that these problems can be (a) temporary and/or (b) limited to one distro;

2. That these problems are by no means Limited to Linux and even Worry users of Wonderful Windows.

3. That for reasons 1 and 2 your Windows-worshipping FUD is invalid.

4. That a person sufficiently (and justifiably) outraged by the (lack of) quality in a Windows OS will also be responsible for the non-purchase or use of Windows-only apps. The user informed by a company that wants said user to use said apps can easily retort that he wants said company to port said app.

Linux distros need to stop trying to bundle 374 types of kitchen sink in the distro, and instead focus on shipping a small, stable, compact Operating System with a limited set of basic software (browser, media player, text editor, file manager, image viewer and perhaps an email client),

Newsflash - there are distros that do this. Even fully loaded, however, few Linux distros I've seen contemporaneous with any version of Wonderful Windows take up as much space as vanilla installations of same.

For me, the ideal Linux distro is one that comes with a limited set of core functionality, an uncluttered, simple but powerful and tightly integrated GUI (eg XFCE, which I am very fond of), and the ability to install signed drivers and software from the Vendor's official website. It should not take up more than a couple of hundred megabytes.

You really think that by advocating Slackware/DSL/Puppy Linux, you are speaking up for the "Common Man"?

I like to have a single program that has a comprehensive feature set that does one thing well, rather than using 15 different programs to perform only part of a task at a time.

Not the Linux way. Though of course you can use OO.org if you so choose - and many of us do. Nevertheless, I'm not going to sit here and let you (or anyone else) dictate my choice of app, thankyou.

Since I rarely have more than a dozen or so apps installed over my base OS (at this point on WinXP as my main OS), I don't need any tools to manage my installed software, and since I use tools with comprehensive functionality, I don't foresee any need to change this by installing hundreds of little bits and pieces.

I wasn't aware you were Everyone. I also wasn't aware Wonderful Windows programs didn't spread cute little bits of themselves all over the filesystem (particularly if you attempt to install them on any drive other than C:) and the registry.

Linux as a whole needs to simplify, rationalise and become more organised and integrated to really take off in the way many of us would like it to.

Translation: Linux needs to become Windows. Except that Windows is not exactly simple (let alone rational), and its "package management" is even LESS "organised and integrated".

Seriously, if you want to use Windows, kindly do so - and stop trying to turn Linux into it.

Reply Score: 3

RE[2]: the problem is...
by Obscurus on Wed 17th Jan 2007 02:44 UTC in reply to "RE: the problem is..."
Obscurus Member since:
2006-04-20

"I've not seen a valid reason for keeping source closed yet, unless maybe it's national-security-related. (And even then, as in the recent spats between the US and UK over their fighter-jet software, some degree of openness may be required). If you can introduce me to one, however..."

Uhm.. it's a pretty big one, actually - it is called making a profitable living from the software you write. While some software is well suited to being commercially provided as FOSS on the basis of selling support for the software, in most instances this is not the case. And the model of "we will give you the app + source code for free, and we will sell you support" creates an inherent incentive for the software developer to deliberately create substandard software that requires users to purchase support. Good software should be so elegant and simple to use, so well written and documented that buying support is not necessary.

Not every software developer can afford to spend their spare time writing software, and commercial software companies need a stream of revenue to fund the salaries of programmers. Closing the source code prevents others from:
a) reducing your competitive advantage by using ideas you may have invested a lot of money in developing;
b) stealing the focus from a project by forking and fragmenting it (as has happened to a lot of OSS projects, particularly Linux).

"On the contrary, it's all wrong; Magical Microsoft OSes are barely usable on a clean install (ok, technically the OS's are barely usable with the computer fully loaded, but you know what I mean)."

No, while I have many gripes with Windows, I am very glad that they provide me with a fairly clean slate to start from - provided I have a web browser, I can add everything I need, and I have little to remove that I don't. I have quite a bit of control over it. I can also slipstream an installation disc so that I can set it up how I like from a single installation.

Now, there are plenty of minimalist Linux distros that also come with a similarly blank slate, but the problem is that I can't easily and painlessly download the few programs I want to use and install them without having to go through any number of time consuming or irritating processes. If you know of a distro which comes pre-installed with Klik or autopackage and a basic gui + browser and little else, let me know.

"...your Windows-worshipping FUD is invalid."
Where did you get the idea that I was a Windows worshipper? Certainly, there are things I like about Windows, but there are just as many that I don't (such as the needless pandering to backwards compatibility, bulky installation size, the GUI, etc.). Similarly, I very much enjoy using Linux (Xubuntu is my preferred distro at the moment), but there are a number of things that shit me about it (and I have already discussed most of what I feel can be improved in Linux). Same with OSX. Nothing is ever perfect, and openly discussing the good and bad points of each operating system without fear or favour hardly constitutes FUD.

"Nevertheless, I'm not going to sit here and let you (or anyone else) dictate my choice of app, thankyou."

Where was I dictating what apps you use? Of course, you can use whatever you like, and nothing in what I have said would prevent you from doing this. It sounds like Linux as it is suits you well, and that is fine. I am talking about the one thing that is really holding Linux back from widespread adoption (over-reliance on package managers coupled with poor binary compatibility between distros).


"I wasn't aware you were Everyone. I also wasn't aware Wonderful Windows programs didn't spread cute little bits of themselves all over the filesystem (particularly if you attempt to install them on any drive other than C:) and the registry. "


I never said I was. However, the fact that Windows and OSX remain the two most popular OSes has as much to do with Linux's fragmented and unfocussed chaos as it does with dodgy OEM bundling on the part of Microsoft and Apple.

And good Windows programs don't spread little bits of themselves around the PC; many don't even use the registry (most of the apps I use are self-contained in their own folder) (granted, there are plenty of badly designed Windows apps that do horrible things to your system). Linux apps tend to spread themselves across a whole bunch of directories (something GoboLinux aims to fix), so I don't think you can honestly claim that Linux (in general) has an advantage over Windows or OSX on this point.

"Translation: Linux needs to become Windows. Except that Windows is not exactly simple (let alone rational), and it's "package management" is even LESS "organised and integrated"."

No, it doesn't need to become Windows; rather, it needs to become more focussed and streamlined, and simpler to use by people who have better things to do with their time than fiddling with command lines and .conf files.

Windows' package management (Windows Update) is highly integrated into the OS, deals only with the core Windows components that ship with the OS, and doesn't affect third-party apps. You couldn't make it more organised or integrated.

As I said before, there are things about Linux and Windows that I like, and if you combined them into a single OS and discarded all of the bits I don't like, I would have an OS I could be very happy with.

I prefer to manage my apps myself, and let the OS take care of itself. Hence my desire for an operating system with an XFCE-like DE that keeps the apps separate from the core OS functionality.

Edited 2007-01-17 02:47

Reply Score: 2

RE[3]: the problem is...
by John Nilsson on Wed 17th Jan 2007 22:37 UTC in reply to "RE[2]: the problem is..."
John Nilsson Member since:
2005-07-06

And the model of "we will give you the app + source code for free, and we will sell you support" creates an inherent incentive for the software developer to deliberately create substandard software that requires users to purchase support

You are probably correct. Consider this:
Usability: "The extent to which a product can be used by specified users to achieve specified goals with effectiveness, efficiency and satisfaction in a specified context of use."

So to design a usable product, you have to specify its users, their goals, and the context of use. The sharper this specification is, the better you'll be able to design usable software.

This means that you have to have quite a narrow focus to create really usable software.

Now, if your business model is to sell software, the cheapest way to produce it is to copy already-created software. Thus the way to earn money is to create a cheap "original" that can be sold to many customers at a high price.

The "many customers" part isn't exactly compatible with the "specified user, goal and context" part, though. So you compromise and try to create cheap software for "all users, goals, and contexts" - which in the end means unusable software.

So "selling copies of proprietary software" isn't really the way to "[g]ood software [that is] so elegant and simple to use, so well written and documented that buying support is not necessary" either.

Now, if you base your business model on producing cheap software that is so narrowly focused that no competitor could sell the same copies (because you have already saturated the market for it), the best way to profit is to share a common platform for base functionality with your competitors and be the best producer of narrowly focused software, built on that platform, for a select niche of users, goals and contexts.

Reply Score: 2

RE[3]: the problem is...
by twenex on Wed 17th Jan 2007 04:34 UTC
twenex
Member since:
2006-04-21


Uhm.. it's a pretty big one, actually - it is called making a profitable living from the software you write. While some software is well suited to being commercially provided as FOSS on the basis of selling support for the software, in most instances this is not the case. And the model of "we will give you the app + source code for free, and we will sell you support" creates an inherent incentive for the software developer to deliberately create substandard software that requires users to purchase support.


Ah, an old unsubstantiated FUD statement followed by an unsubstantiated slur. That may pass for an argument where you come from, but I can't say the same.

Good software should be so elegant and simple to use, so well written and documented that buying support is not necessary

Yeah, 'cos BSD and Windows software lives up to that *wink*.

Not every software developer can afford to spend their spare time writing software,

Who mentioned free time, until you did?

and commercial software companies need a stream of revenue to fund the salaries of programmers.

Prove that FOSS software prevents programmers from getting salaries. If you can.

Closing the source code prevents others from:
a) reducing your competitive advantage by using ideas you may have invested a lot of money in developing;
b) stealing the focus from a project by forking and fragmenting it (as has happened to a lot of OSS projects, particularly Linux).


Yes, in exactly the same way that allowing Fujitsu to make PC's of the same architecture as Dell's is preventing Dell from "maintaining their competitive advantage" and is thereby bankrupting them.

Oh wait; it isn't.

Closed software/hardware just means the customer is at the mercy of the vendor. No thanks.

No, while I have many gripes with Windows, I am very glad that they provide me with a fairly clean slate to start from - provided I have a web browser, I can add everything I need, and I have little to remove that I don't. I have quite a bit of control over it. I can also slipstream an installation disc so that I can set it up how I like from a single installation.

Er, you can do that with Linux... As for "setting Windows up how I like," if we pretend for a minute that Windows could be set up to my satisfaction, that level of customizability (read: any) went out with Windows XP, didn't it?

Now, there are plenty of minimalist Linux distros that also come with a similarly blank slate, but the problem is that I can't easily and painlessly download the few programs I want to use and install them without having to go through any number of time consuming or irritating processes.

What, you mean like google software, download software, install software, click Next interminably, accept restrictive licence agreement and/or incomprehensible EULA? I thought we were talking about Linux.

If you know of a distro which comes pre-installed with Klik or autopackage and a basic gui + browser and little else, let me know.

Well, installing whichever distro uses Klik and choosing either something like "minimal installation" or unchecking unwanted software would seem to do the trick.

Where did you get the idea that I was a Windows worshipper?

Because you seem to want to turn Linux into it.

"Nevertheless, I'm not going to sit here and let you (or anyone else) dictate my choice of app, thankyou."

Where was I dictating what apps you use? Of course, you can use whatever you like, and nothing in what I have said would prevent you from doing this.


If you get rid of all the distros but one, and put only one choice of software in the remaining distro, you are enforcing a set of standards which it would be almost impossible to break - just like Microsoft did with all their software.

It sounds like Linux as it is suits you well, and that is fine. I am talking about the one thing that is really holding Linux back from widespread adoption (over-reliance on package managers coupled with poor binary compatibility between distros).

Ah, yes, but you see, the number of Linux distros is probably only outweighed by the number of grains of sand on a beach - and the number of different things different people say are "holding Linux back from widespread adoption".


the fact that Windows and OSX remain the two most popular OSes has as much to do with Linux's fragmented and unfocussed chaos as it does with dodgy OEM bundling on the part of Microsoft and Apple.

Yes, and PC's will never be as successful as Macs, Ataris, and Amigas as long as you can get them from just about any manufacturer you like. It's fragmented and unfocussed chaos.

Oh, wait...

And good Windows programs don't spread little bits of themselves around the PC; many don't even use the registry (most of the apps I use are self-contained in their own folder)


There can't be many "good Windows programs" then; maybe you are thinking of the ones which cost $$$, which I wouldn't know about.

(granted, there are plenty of badly designed Windows apps that do horrible things to your system). Linux apps tend to spread themselves across a whole bunch of directories (something GoboLinux aims to fix), so I don't think you can honestly claim that Linux (in general) has an advantage over Windows or OSX on this point.

Yes, GoboLinux does install things in centralised directories (or pretends to), but I'm not claiming centralised app directories are a good thing (they're not, unless you have space to waste on statically-linked or endlessly reinstalled libraries); what I'm claiming is that Linux apps don't, in general, install stuff willy-nilly in whatever folder they feel like (binaries go in /bin or /usr/bin or /opt/{packagename}/bin, for example, not in /var or /etc - generally).

"Translation: Linux needs to become Windows. Except that Windows is not exactly simple (let alone rational), and it's "package management" is even LESS "organised and integrated"."

No, it doesn't need to become Windows; rather, it needs to become more focussed and streamlined, and simpler to use by people who have better things to do with their time than fiddling with command lines and .conf files.


I fail to see how your suggestions above would make it "more focussed and streamlined" (especially since statically linked apps are the shortest path to bloat) or, if you're not referring to what you've said before, what you mean by "more focussed and streamlined".

If you don't want to fiddle with command lines and .conf files, then may I suggest you use a Linux distro that does not force you to do that? (Mandrake and SuSE being two examples.)

Windows' package management (Windows Update) is highly integrated into the OS, deals only with the core Windows components that ship with the OS, and doesn't affect third-party apps. You couldn't make it more organised or integrated.

Except by having it deal with other third-party apps.

I prefer to manage my apps myself, and let the OS take care of itself. Hence my desire for an operating system with an XFCE-like DE that keeps the apps separate from the core OS functionality.

Sounds like one of the BSD's is in order. However, I do agree that it would be nice if there were a clearer separation between system and apps; Slackware probably comes closest to this within Linux.

Reply Score: 3

RE[4]: the problem is...
by Obscurus on Wed 17th Jan 2007 05:46 UTC in reply to "RE[3]: the problem is..."
Obscurus Member since:
2006-04-20

I am actually not suggesting that there be only one distro (though I think a bit of pruning is in order); rather, I am suggesting that if distros standardise the way they interface with apps, and have standards for core libraries, software manufacturers will not have to worry about distros, as all of them will work with their binaries straight out of the box. It will also mean that if standardised libraries are used, there will be no need for apps to be installed with extra dependencies - all of the standard libs would be part of the OS, and there would be little or no duplication of libraries amongst apps. Anything that uses some boutique library can just incorporate it into the app without cluttering up the system with extra crap.

And given the current size of hard drives, I think monolithic binary blobs are perfectly fine for apps these days - no installation method is simpler than dragging a single executable file onto your desktop or into a folder of your choice, and if for some reason you have an unusually large number of applications, it is pretty easy to create a utility to manage them, without resorting to a full-blown package manager.

The fact is that Windows is still more user-friendly than the vast majority of Linux distros (and given the shit Microsoft is prone to producing, that is saying something). Linux may be technically superior in many ways, but in the one way that matters most (ease of use for people who don't like computers but have to use them anyway), it (or they) simply fails to hit the target. Not that it couldn't - it is just the issue of focus and integration that is the problem. And maybe this is OK - after all, the things that make Linux great for what it is (freedom, flexibility, community) also work against the things that make for a great desktop OS (limited freedom, well-defined standards & APIs, commercial viability and support).

Far from suggesting that Linux becomes more like Windows, I am suggesting that it adopts some of the features of Mac OS and Windows.

I'm not asking for every Linux distro to disappear - they all have their place (up to a point), but rather for at least one distro to break from the mould and do the things that are needed to give Microsoft and Apple some desperately needed competition.

"Yes, and PC's will never be as successful as Macs, Ataris, and Amigas as long as you can get them from just about any manufacturer you like. It's fragmented and unfocussed chaos."

And for that reason you will never have the tight, integrated, focussed experience you will get from an Amiga or a Mac. It is also the reason most people get computers from vendors like Dell etc., where the hardware has been well matched and configured in advance. It is somewhat misleading to say that PCs are successful because of their customisability, since the vast majority of computers sold are Dells and the like, which have very much restricted customisation. PCs became popular because they run Windows, which through an accident of history (certainly not merit) became the de facto standard for OSes, like it or not.

It's good to see we agree on keeping the system separate from the apps, though we obviously can't agree on the best method for doing that.

Edited 2007-01-17 05:47

Reply Score: 1

RE[5]: the problem is...
by twenex on Thu 18th Jan 2007 00:43 UTC in reply to "RE[4]: the problem is..."
twenex Member since:
2006-04-21


And for that reason you will never have the tight, integrated, focussed experience you will get from an Amiga or a Mac. It is also the reason most people get computers from vendors like Dell etc., where the hardware has been well matched and configured in advance. It is somewhat misleading to say that PCs are successful because of their customisability, since the vast majority of computers sold are Dells and the like, which have very much restricted customisation. PCs became popular because they run Windows, which through an accident of history (certainly not merit) became the de facto standard for OSes, like it or not.


Whether or not you can customize PC's was not the point. The point was, rather, that you are not dependent on one vendor for IBM-PC compatible hardware. Becoming dependent on one vendor for computers, operating systems, word processors, or anything else is The Road That Should Not Have Been Travelled.

Reply Score: 2