The package installation problem is one of the primary barriers to desktop Linux adoption. Most if not all solutions so far have addressed the wrong problem (at least for desktop users) — resolving dependencies at package installation time. A much better approach is to ensure that as few dependencies exist as possible. While this might seem a lofty goal, given the open source development emphasis on reusing as much code as possible, this goal is indeed achievable through a process of desktop component standardization.
“The package installation problem is one of the primary barriers to desktop Linux adoption.”
Yeah… 3 years ago. Arch, Gentoo, Source Mage, Debian, Mandrake, even Fedora with yum; NONE of them have dependency issues anymore.
I still don’t see how people can think pacman -S [program] or emerge [program] is hard…
No repo will ever have every single *up to date* package.
Debian sorted this stuff out years ago…
Installing software is easier on many Linux distros than in Windows.
apt-get install open-office has got to be easier than going down the shop to buy M$ Office, bringing it home, unwrapping it, placing the CD in the tray and running setup.exe!!!
>> The first is that distros wishing to carry only one of the
>> major desktop environments might not care for the extra
>> bloat that LDB compliance would necessitate.
…
>> The libraries are small enough that it would only add an
>> overhead of a couple of hundred MB to a distro.
Yes, only a couple hundred MBs. I also love the part where he says that developers would have to maintain RPMs and debs of all their products. Since when are rpm and deb the only two packaging systems? What about distros like Slackware and Gentoo that don’t use RPMs or debs? And what about source? This guy barely mentions source in the entire article, and unless he wants to recommend rewriting autoconf/automake, all his vaunted ideas will be useless for anybody who uses source.
I’m also somewhat leery about the plan to release new versions of the “LDB” at specific intervals. I mean, what if GCC 4.0 comes out in the middle of a release cycle? Nobody will be able to use it and keep their “LDB compliance” until they get a whole new version of the LDB core. And they need to get the whole thing in one go.
No, I’d say that I don’t like this idea very much at all…
Actually, Arch does a pretty good job of it. Sure, they don’t have _every_ package, but I don’t need the 10,000 that FreeBSD ports and portage offer – every package I’ve needed I’ve been able to get from Arch with a simple pacman -S [program]. On top of that, they’re all kept very up to date. The day after you hear of a program being released Arch has packaged it, tested it, and updated the package so that when you run your pacman -Syu later that day it gets upgraded.
Yes, you are limiting the developer (somewhat), but having portable, dependency-free Linux packages would be a dream come true.
I think this problem is often easier to deal with in Gentoo than in many other distros. Renaming an ebuild is usually enough to get you a new version of a package:
cp foo-1.0.2.ebuild foo-1.0.3.ebuild && ebuild foo-1.0.3.ebuild digest && emerge foo
What everybody fails to mention is the situation (which I am in) where one doesn’t have an internet connection in Linux, and therefore needs to download what one needs in Windows, switch to Linux, and then pray that the install doesn’t have a dependency! And I’m sure I’m not the only one that is in a similar situation. Repos are useless to me, I’ve never used yum or apt simply because Linux doesn’t currently support my network adapter.
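For what it’s worth, one partial workaround on a Debian-style system (a sketch, assuming the package lists on the Linux side are reasonably current, and “some-package” is just a placeholder) is to let apt resolve the dependencies and only print the download URLs, fetch those files from Windows, and install them offline:
apt-get --print-uris install some-package > uris.txt
# fetch the listed .deb files from the other OS, copy them across, then:
dpkg -i *.deb   # as root; dpkg handles the ordering when given the whole set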
Well, personally, I don’t think Linux is for people without network cards. If you want it, fine, you can do it, but it won’t be as good. That’s why there are different OSes for different needs. You can’t make everybody happy all of the time.
A Linux-compatible 10/100 NIC can be had for $3.00, shipped. Compatible PCMCIA cards can be had for less than $20.00 shipped. Heck, Linux even supports Winmodems now! People without a net connection in Linux constitute the extreme minority.
Not a word, ok but you get the point.
Dropping deps is like saying we all have a gig of RAM and don’t mind having libs all compiled into our individual programs.
Well, guess what: shared libs are good; quit whining and learn to follow instructions. If you can’t figure out rpm package installation (the old-fashioned way), you can’t even program a VCR; and if you can’t program a VCR then you don’t belong administering multi-user computer systems.
END OF STORY, buy another appliance and stop trying to turn my PC into one!
but then I am only a SUSE n00b, not like the l33t Debian and Gentoo haxx0rs out there compiling teh interweb as we speak.
Application developers should always provide statically compiled binaries like Opera does. Sure, you’d have bigger apps to download, but you don’t have to worry about dependencies at all. In these days of broad bandwidth and large drives, this is the simplest way to get any given app to work on different platforms.
Follow Opera’s lead! Statically compile certain apps… it obviously wouldn’t work for some core components of desktop environments like KDE or GNOME, but it works just fine for everything else.
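In the simplest case it is just a linker flag; a minimal sketch, assuming the static variants of the libraries are installed on the build machine:
gcc -static -o myapp myapp.c   # the binary now carries its own library copies
file myapp                     # should report "statically linked"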
I think this guy has put into words what I’ve been thinking for a long time now, but never really communicating. Big up to Arvind for that, I think he has a smart vision.
There are basically (imho) 3 pieces to the “make linux software installation great” puzzle:
– Solve the scalability/distribution issues that the current diversity of packaging systems presents. This is what autopackage was designed to do. Solving this means packaging is responsive, accurate and universal enough that we get a great experience on all types of system.
– Improve the probability of success by standardising dependency sets: this is basically the unified platform/package base idea. For software which only uses packages from this base, it means that on a compliant system the software is guaranteed to install correctly, assuming a suitable package exists for your system. For software that uses base+extra stuff, it increases the probability of a successful install.
– Integrate package management deep into the desktop UI: extensions to GNOME and KDE through freedesktop.org specs are the way to go here, with a similar abstraction layer to HAL for packages: a PAL, if you like (note: no technical similarity should be implied from this description).
For the best system possible all these pieces must be developed and combined. None of them are trivial or easy to do. You cannot simply leave one out: without a distro-neutral dependency-resolution capable package system the system will not scale, without the desktop base set it will not be reliable, without the desktop integration it will not be easy.
“Well, personally, I don’t think Linux is for people without network cards.”
Thus you are automatically cutting out all of those who don’t have access to >56k internet bandwidth…
Moreover, I’ve heard many times “there is surely a reason it doesn’t cost anything, so it mustn’t be that good”, and not having tangible and durable media doesn’t, IMHO, contribute much to the reputation of the thing…
“dont feed the troll” (Fangorn)
IMHO, thinking an article is a troll means one has a “troll attitude” towards that genre of article or towards the author, and so warns others about intentions the author can’t be accused of having for sure.
Instead, we are talking about open source software: what is the greatest advantage of open source? (You tell me.) Freedom.
And isn’t the one accusing others of trolling limiting their freedom to give opinions and express criticism of a product which is objectively far from perfect?
Assuming he’s dual booting, it means the computer has a way onto the net, either broadband (with a NIC) or dial-up, that maybe doesn’t work in Linux.
I’m in a similar situation: I can connect to the net (ADSL) in Windows no problem, but for some reason it stopped working in Linux after 2 days.
It seems like, if you wanted to deal with the dependency problem, you have two approaches:
1) Standardize a “base” system so that developers know what to expect people to have installed. The problem here is, with open source, how do you enforce standards? If some new distro doesn’t like the “default base” libraries, they can easily choose to use something different.
2) Have developers include the necessary libraries with their packages. Problem: you’ll get many copies of the same libraries.
I think a good solution would be a combination. If there are certain libraries that tons of programs are going to use, then everyone sort of agrees to put them in the “default base”. All other libraries should just go along with the package, so that, among other things, you don’t get conflicts between different versions.
It’ll increase the disk space usage (bloat) to use either of these methods, but using them together will cut down on this. Common libraries, instead of having a different version for each program, will have one copy. On the other hand, you won’t be running around trying to include every rare little library in your base to support everything, because if it’s rare, it should be distributed with your package.
Sure, it’ll still lead to more bloat than if you managed all your dependencies, but when you’re talking about desktop systems, not embedded systems, a little bloat is manageable.
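A rough sketch of the “libraries go along with the package” half, assuming the app ships a lib/ directory next to its binary (all names here are hypothetical), is a small launcher script:
#!/bin/sh
# prefer the bundled copies, fall back to the system's shared libs
APPDIR=$(dirname "$0")
LD_LIBRARY_PATH="$APPDIR/lib:$LD_LIBRARY_PATH"
export LD_LIBRARY_PATH
exec "$APPDIR/bin/myapp" "$@"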
I made a similar comment last week about the shareware article:
There are always going to be things that distributions like Debian and Gentoo miss, no matter how comprehensive their software archives. Regular users like having an easy way to install all the little pieces of software they find on the internet.
Look at freshmeat or gnomefiles, I doubt too many new users would be able to install most of that software.
The other problem, of course, is the continuing C++ ABI breaks, but I think we just have to ride them out and hope they don’t happen again. Or not use C++. But I think the KDE guys would not be happy with that solution.
>> And isn’t the one accusing others of trolling limiting their freedom to give opinions and express criticism of a product which is objectively far from perfect?
I am sorry if you felt that I was trying to limit your freedom, but some articles, like the one titled “Why Linux isn’t ready for the Desktop”, make me angry: they spend their time criticising GNU/Linux and the software that is freely available for modification, but they do nothing to put into practice what they want.
Many people, most in fact, are quite happy to run Linux, which, while not perfect, is really, really superior to MS Windows in terms of security and stability.
Now I should recognize that this article is less aggressive against Free Software than the one I quoted.
Sorry for my first reaction, and sorry for my very bad English too 🙂
The trouble with Binary Compatibility is that, ultimately, it prevents you from moving forward. One of the key benefits of Linux at this point is that there are annual revisions of the major distros, and people are willing and very happy to accept them. Few people want to selectively upgrade beyond what they can install from their media. It is that level of user grace that has allowed development. It can’t continue indefinitely, but neither can the speed of growth.
But breaking BC is a necessary evil in many cases, because the benefits are *that* significant. To be honest, KDE does it once every two or three years at the moment, based mainly around Qt development, and that is a pretty good reason, with a large number of changes.
But you can’t keep BC forever, otherwise you end up with cruft like the Win 3.1->95->NT->XP problems, with some things failing to run or breaking, and too many bad compromises in API development that stifle innovation.
Urpmi might be the solution, except it can remove your GUI (KDE) without a proper warning (“OK to remove k***?”). The other problem, of course, is transporting the software via a CD, but if you’ve got broadband and urpmi, dependencies are not much of a problem. Is it possible to copy a whole repository to one or more CDs?
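Copying a repository to CD should work in principle: burn the repository tree together with its hdlist index, mount the disc, and register it as a local medium. A rough sketch (the path and hdlist location are assumptions and depend on how the tree is laid out):
urpmi.addmedia localcd file:///mnt/cdrom/RPMS with hdlist.cz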
How about, instead of statically linking your stuff, you include in your download all the deps for your program (except huge stuff)? So say my app depends on:
gtk2
libgcrypt
imlib2
Then I will make these RPMs:
my_app.rpm
gtk2.rpm
libgcrypt.rpm
imlib2.rpm
and then tar -cvzf my_app.tar.gz my_app.rpm gtk2.rpm libgcrypt.rpm imlib2.rpm
Then you can also include instructions informing the user of how to install these, and what order to do it in.
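The instructions could even boil down to a single command, since rpm works out a valid install order itself when handed all the packages in one transaction (using the hypothetical package names from above):
rpm -Uvh gtk2.rpm libgcrypt.rpm imlib2.rpm my_app.rpm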
Or we could grow up and learn how to install packages and dump this naive idea of statically linking our stuff (Opera is small, and does all its stuff except for Qt; some applications need a bunch of other libraries).
Besides, you’re ignoring the benefits of shared libraries.
1.) Hard disk usage
2.) Memory usage; this is huge
3.) Optimizations in one library affect a lot of programs; this is good.
If libc gets twice as fast (exaggerating obviously) then most of your apps get a bit quicker as well.
Statically linking is a cop-out, not a solution.
Whilst I see where the article’s author is coming from, the issue is one that only exists if we want it to exist.
If our proverbial Uncle Tilly / Grandma cannot cope with dependency issues, then install Lycoris, Lindows (sorry, Linspire), Xandros etc. for them. It will have everything they need for their computing needs.
If you wish to add extra packages that don’t come pre-installed with your distro of choice (and let’s face it, there is a huge choice of distros out there), you can use apt-get or YaST or whatever to help you on a distro which has many more packages available for use, i.e. Debian.
If that is still not good enough, then yes, we do need to go the ./configure, make, make install dependency route, which certainly means some work and effort (as I am currently finding out), but it also helps anyone who does it to see how their distro operates.
(With due apologies to Uncles and Grandmas)
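For reference, the classic from-source route mentioned above is usually some variation on the following (a generic sketch; configure options differ per package):
./configure --prefix=/usr/local
make
make install   # as root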
I wish that people would not criticize people who are crying out to the community to help fix some of the problems that the Linux experience brings to users. Statements calling this article and others “trolls” are just passé ways of avoiding the issues. There is a clear need for some centralized, focused direction of a distribution for the common people. As many have pointed out as well, there needs to be more education given to people who want to learn Linux. The people who write articles such as this and the one titled “Why Linux isn’t ready for the desktop” are usually people who are just savvy enough to use the distros but don’t have enough programming experience and don’t have the time/will to learn how to do the programming themselves. Let’s take a different attitude of tolerance towards people voicing their concerns about Linux distros and try to work towards a better goal as a community instead of calling each other names.
Has SkyOS won the war for easy desktop usage operating systems?
If SkyOS keeps going ahead at the pace it’s at, then Linux would have been beaten. But of course SkyOS went private, so that’s that.
Linux is just a kernel. There is a total lack of centralized control over Linux, which is great for hackers/experimentalists/C programmers but leaves out the regular user who doesn’t care about operating systems. They just want something that works and is as easy as a microwave.
I have little faith in the Linux distributions solving these problems, for these reasons.
Maybe OpenBeOS. Its new name is Haiku.
o_O
The subject of your post is “has SkyOS won the war?” yet your conclusion is that you have faith in Haiku. OK, Mr. –.ipt.aol.com.
I come from a country where dial-up Internet connections cost a lot and broadband penetration is low. I personally think that this is a major thing that blocks me from obtaining new Linux applications.
When I download things (say, a Windows shareware program), they usually come in one zip file and run out of the box. Meanwhile, for Linux, I download one package just to find out that I need to download another dependency. This can cost me one big hell of a telephone bill.
Up to now, my sense is that Linux packaging/software distribution is only for those with broadband Internet connections, not limited ones.
First, I want to say that I have only limited experience with package management: RPM <Yellow Dog Linux> (used it once or twice), Emerge <Gentoo> (a period of about a month), Ports <FreeBSD> (a month or two), and I used those a while ago, so I might have a couple of things wrong, or what I am saying might be ‘outdated’.
The problem I have with package management is that you are dependent on one source for your packages. Some packages were outdated in the package management directories; in other cases, the packages were not even there. In those cases I was out of luck, because if I tried to download the source and do a “make install” from sources from the home page of some project, it would mess up the package management system. Also, if you were to buy (gasp!) software from a store and try to install it on your computer, you’d have all sorts of trouble.
I don’t know about other people, but I think that the best way to do software installation is Mac OS X’s way: drag and drop. You drag and drop a .app file into the Applications folder (the .app file is actually a directory, but that is hidden from the user) and the program is installed. Anything else that needs to be done for installation is done by the program the first time it starts up. For the shared libraries, each app can come with its own libraries and then try to copy them to a (hidden) libraries folder; if a library is not already there, it will be copied there for the application to use. If it is, then the program will use the existing library. (Can this work? I think it might, but there might be some things that don’t.) The library folder might not be the most elegant solution, but it does not seem too bad. [Uninstalling is another story, but I have never seen a system that does uninstalling right.]
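The “copy my bundled libraries in on first run” idea could be sketched roughly like this (paths and names are hypothetical, and a real implementation would also need version checks so an old bundled copy never shadows a newer one):
LIBDIR="$HOME/.local/lib/shared-app-libs"   # hypothetical shared library folder
mkdir -p "$LIBDIR"
for lib in ./bundled-libs/*.so*; do
  [ -e "$LIBDIR/$(basename "$lib")" ] || cp "$lib" "$LIBDIR/"
done
export LD_LIBRARY_PATH="$LIBDIR:$LD_LIBRARY_PATH"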
I don’t understand why developers don’t include the libraries required for their programs to work. One of the ultra-sexy features of the GPL is the right to redistribute. Why aren’t we taking advantage of it?
I still don’t see how people can think pacman -S [program] or emerge [program] is hard…
The hard parts are:
* figuring out what goes where [program] is.
* figuring out what to do if the [program] you want isn’t in the repository.
* recovering from the almighty mess created from trying to manually compile/install some [program] that wasn’t in the repository.
Hehe just read your post @ autopackage, those damn gcc guys =)