I also love the way this guy has solved the dependency hell problem. They set it up so that each application and its required dependencies are stored in their own folder. I can see where this might add some bloat to the overall system, but you know what? I'm sitting on 250 GB of storage, of which I have used 35.6 GB, and if a little bloat can save me from Linux dependency hell, I'd sacrifice the extra space.
it would be ok if it were only your disk space. it is, however, also the space on the server that provides you all these packages, and on all the mirrors.
bigger packages also require more bandwidth to download, and bandwidth is NOT cheap everywhere. and why the heck should i download the same library ten or twenty times? just imagine all the gnome apps. i want the gnome environment, but i don't want the totem player or the gedit editor. either everything is in one big fat "gnome" package (so much for "little" bloat), or it is nicely divided as in e.g. debian, but then i have to download libgtk+2.0 with every gnome application. makes zero sense.
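the duplicated-download cost is easy to see with a back-of-envelope sketch (the sizes and counts below are made up for illustration, not real package sizes):

```python
# back-of-envelope sketch: how much extra download a bundle-everything
# scheme costs when many apps ship the same library.
# LIBGTK_MB and GNOME_APPS are made-up numbers, not real figures.

LIBGTK_MB = 15        # hypothetical download size of libgtk+2.0
GNOME_APPS = 20       # hypothetical number of gnome apps installed

shared = LIBGTK_MB                 # download the library once
bundled = LIBGTK_MB * GNOME_APPS   # download it with every app

print(f"shared: {shared} MB, bundled: {bundled} MB, "
      f"wasted: {bundled - shared} MB")
```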
also, there is the developer time - instead of building libfreetype2 once, they have to make sure it builds correctly for each application that uses it.
all this for a NON-problem. sorry, but no matter how you try to twist it, i have not experienced dependency hell since i moved to debian. apt-get (aptitude) solves the dependencies for me. you know, the computer is the tool and it should work for me (figure out the dependencies, download only the minimum, use only the minimum space), not the other way around (me buying more disk and paying for more bandwidth just to have the same library installed 20 times).
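what "apt-get solves the dependencies for me" boils down to can be sketched as a toy resolver - walk the dependency graph and collect each package exactly once. the package names and dependencies below are hypothetical, and real apt also handles versions, conflicts and much more:

```python
# toy model of dependency resolution: collect the transitive closure
# of dependencies, so a shared library is fetched a single time no
# matter how many applications depend on it.
# DEPS is a made-up dependency graph, not real debian metadata.

DEPS = {
    "gedit":   ["libgtk", "libxml2"],
    "totem":   ["libgtk", "libxspf"],
    "libgtk":  ["libglib"],
    "libxml2": [],
    "libxspf": [],
    "libglib": [],
}

def resolve(wanted):
    """Return the set of packages to install for the wanted apps."""
    to_install, stack = set(), list(wanted)
    while stack:
        pkg = stack.pop()
        if pkg not in to_install:
            to_install.add(pkg)
            stack.extend(DEPS[pkg])
    return to_install

print(sorted(resolve(["gedit", "totem"])))
# libgtk and libglib appear once, even though both apps pull them in
```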
someone has already brought up the security problem - e.g. zlib. widely popular. many apps linked statically to it, which (from the distribution's point of view) is almost the same as having it bundled with each app and linking dynamically. once a bug was found, you had to re-install all those packages. i'd rather have one non-functional program on my system (due to a new version of a library) than a private copy of a vulnerable library in every app.
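the zlib argument in numbers (the app list is made up; the point is the count of update operations):

```python
# sketch: with one shared copy, a security fix means updating one
# file; with a bundled or statically linked copy per app, every
# affected app has to be rebuilt or re-installed.
# apps_using_zlib is a hypothetical list, not a real inventory.

apps_using_zlib = ["browser", "mailer", "archiver", "imageviewer"]

updates_shared = 1                      # patch the single libz.so
updates_bundled = len(apps_using_zlib)  # re-ship every affected app

print(updates_shared, "update vs", updates_bundled, "updates")
```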
having the same library installed multiple times could also break a benefit of dynamic linking - only one copy of a library in memory. either all the library directories from all installed self-contained applications are in LD_LIBRARY_PATH; in that case, the first copy of a library found will get loaded for every application that uses it, hence no point in having a separate copy for each application. or, when a self-contained application is started, only its own library directory is in LD_LIBRARY_PATH; in that case (and i'm not 100% sure on this), the linker would consider /app/foo/lib/libxml2.so a different library than /app/bar/lib/libxml2.so and load both of them - an instant waste of RAM. i'd rather use my RAM for file cache than for 20 copies of the same library.
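the RAM point can be modelled roughly: the kernel shares a library's pages between processes only when they map the same on-disk file, so two private copies of libxml2 cost memory twice. this is a toy model with a made-up size, not a measurement:

```python
# toy model of page-cache sharing: distinct on-disk files mean
# distinct resident copies, even if the file contents are identical.
# LIB_MB is a hypothetical resident size, not a real number.

LIB_MB = 2  # hypothetical resident size of libxml2, in MB

def resident_mb(mapped_paths):
    """RAM used: one copy per *distinct file* that gets mapped."""
    return len(set(mapped_paths)) * LIB_MB

# two apps mapping the single system-wide copy share its pages
one_copy = resident_mb(["/usr/lib/libxml2.so",
                        "/usr/lib/libxml2.so"])

# two apps each mapping their own private copy pay twice
per_app = resident_mb(["/app/foo/lib/libxml2.so",
                       "/app/bar/lib/libxml2.so"])

print(one_copy, "MB shared vs", per_app, "MB per-app")
```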
talking about upgrades/reinstallation: think about how fast you can update a system where a package manager keeps track of the installed software, versus a system where you have to download and double-click an installer for each application.
i use both debian (@home) and w2k (@work). there is a set of applications i use on both systems - firefox, thunderbird, gimp, gaim, gvim, openoffice. to update them on debian, i do "aptitude update ; aptitude upgrade". 2 commands for 6 applications. actually, it is 2 commands for all the applications installed. on w2k, i need to download 6 files, unzip one of them (gvim) and run 6 executables. with 10 applications, that would be downloading 10 files (from different places!) and running 10 executables. with 100... i hope you get the idea. how someone can consider the latter the easier way escapes me... (instead of typing aptitude blah blah, one could start synaptic and click a few buttons to do the same). yes, people are used to it. people were also used to driving without seat belts and licking their pencils. not everything people are used to doing is the correct way to do things.
if you really need installation from within a browser (which i also don't like - you know, the right tool for each job...), it could be solved with mime types and their respective handlers.
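for the shape of such a handler: a freedesktop .desktop entry can register an application for a package mime type, so the browser hands the downloaded file to the package tool. "application/x-pbi" and "pbi-installer" below are hypothetical names, just to show the idea:

```ini
[Desktop Entry]
Type=Application
Name=PBI Installer
Exec=pbi-installer %f
MimeType=application/x-pbi;
```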
<cite>all this for a NON-problem. sorry, but no matter how you try to twist it, i have not experienced dependency-hell since i moved to debian.</cite>
What a load of bull! Anyone who's used Debian for a couple of years knows how problematic it has been for Debian stable to get stuff rolling.
Debian represents the utmost failure of a free software project. It's become unmanageable for all those people to create packages in a timely manner.
Look how long sid took! YEARS! Ubuntu has taken its place.
C'mon! You think we're all newbies here?!
<cite>You think we're all newbies here?!</cite>
No. I think you are someone without a sound argument, hiding behind exclamation marks and rude words, and not signing your posts.
if you had read my entry instead of just finding a bit to flame about, you would have noticed i was discussing the technicalities of installing packages, not their quality or up-to-date-ness.
call me when you get ubuntu running on something other than an intel machine. still, i consider ubuntu a good thing.
<cite>It would be ok, if it was only your disk space. it is, however, also the space on the server, which provides you all these packages and all the mirrors.</cite>
Where have you been? PC-BSD is for the desktop, not for the server!
jziegler, I think you miss the point completely.
If you're comfortable with an advanced package manager, then the PC-BSD simplified package management is not for you: so, even on PC-BSD, you would use the excellent FreeBSD ports, since they're fully available there too (it has a complete FreeBSD operating system under the hood).
Your mistake is thinking that everybody has the same priorities as you.
If somebody has a big HD (and today, most HDs are big) and they happen to want to get the work done *the fast way* - without even knowing or caring about what a dependency *is* - then the PC-BSD package manager is the best solution for them. It's as simple as that.
Many users of a desktop OS are exactly like that. And it doesn't mean that they're stupid or something - they simply *don't care*. Maybe they're not interested in a PC's inner workings, or maybe they've got more urgent things to do.
ulib, have you even read what i've written? the HD usage is only one part of the problem. downloading the same thing 20 times also cannot be faster than downloading it only once.
i was not discussing user preferences or ease of use. i was discussing the technical implications of such a solution. it's a bit sad that no-one has seriously answered me on those points.
but yeah, i'm strange. i try to understand things around me.
One thing I read, which by my understanding means it only keeps an extra copy of a library if the package calls for a version that isn't already on your system.
That kind of shoots down the whole 200-versions-of-the-same-library argument you've got there. It only grabs another library version and places it in that app's folder, so you don't have to recompile all your programs for an updated or outdated lib.
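If that reading is right, the logic would be something like this sketch (hypothetical names and versions, not the real PBI internals):

```python
# sketch of the claimed behaviour: only bundle a private copy of a
# library when the exact version the package wants isn't already
# installed system-wide. All names/versions below are made up.

system_libs = {"libpng": "1.2.8", "zlib": "1.2.3"}

def libs_to_bundle(required):
    """Return the (name, version) pairs an app folder must carry."""
    return {(name, ver) for name, ver in required.items()
            if system_libs.get(name) != ver}

app_needs = {"libpng": "1.2.8",   # already present -> reuse it
             "zlib":   "1.1.4"}   # different version -> bundle it

print(libs_to_bundle(app_needs))
# only the mismatched zlib gets a private copy
```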
OK. That sounds reasonable.