Mandriva engineer Claudio Matsuoka asks in his journal whether the recently touted problems with the Debian Sarge upgrade are a result of package dependencies in the distribution getting too complex, even for APT. Has the time come for the next generation of package managers, led by Smart?
Many people use Debian because they can cut it down so much that even their 120MB disk can be useful. That’s why Debian breaks things down so much.
There is *no problem* upgrading Woody to Sarge if you use aptitude instead of apt, as is clearly stated in the README.
Stop spreading FUD OSNews.
They are just trouble, and the amount of memory saved is not worth it compared to the configuration complexities they create. Work on better modularizing base libraries, and on linker deadcode removal (I think Apple’s gcc/ld has this).
Jacob
Jesus, this article is not about bashing Debian; it takes the problems apt apparently has as a starting point to discuss the future of package management.
So please, let’s stay on topic.
People are overusing the word “FUD” to the point where it will soon have no real meaning. This is one of those instances: there is no Fear, Uncertainty, or Doubt in this post or in the linked blog entry. It’s an experienced engineer talking about his concerns with current package management, and how, in his opinion, it’s reaching a breaking point.
This isn’t a “Linux sucks” post, it’s a “let’s make it better” post, so really, how can this be FUD?
To share my thoughts: I think that dependency hell will be one of the major problems in Linux that will have to be solved in the next few years, though I’m afraid there is no easy solution because the problem is inherent to the way Linux – and its applications – are developed. One developer writes a library A. This library is used by an application. Another developer wants to use a library providing the functionality of library A, but is not happy with the way A is implemented, and implements library B – which does the same as A but in another (maybe insignificantly different) fashion. Skip a few years and blow this scenario up a bit, and you’ve got a lot of applications which use lots of libraries which, in the end, all provide about the same functionality. That’s a big waste of memory, disk space and, above all, developer time. Now how do we get around this? And now it gets not so much off-topic as a little broader.
In my opinion, there are many ways to solve a problem, but most often there’s only one “best” way to do it. Of course this “best” way depends on the circumstances, but – to get back to reality – I don’t think it’s necessary to have a dozen different sound systems, window managers, editors, file managers and such around. Now before you get pissed, let me say a few things: of course I like the diversity of Linux and its apps, mainly because diversity is necessary to have some kind of progress and evolution. But what evolution also means is that at some point, some apps and libraries will have to go because there’s another app or library that’s nearer to the “best” way of doing the job it’s designed to do. This means that old, obsolete libraries have to be replaced and other pieces of code which depend on them have to be adapted. Proprietary OSes and apps probably have the same problem, but on a lower scale, because in the free software world it’s pretty easy to fork.
Now the Smart package manager certainly is a step in this direction, but I think Linux developers (and MS, Apple and every other OS manufacturer, for that matter) will need to constantly check library dependencies, replace old libraries and thin out dependency trees to keep their OS fast, stable, secure and slim.
Aptitude is already pretty smart. If you upgraded aptitude first and then used it to upgrade Woody to Sarge (as advised in the official upgrading guide http://www.debian.org/releases/stable/i386/release-notes/ch-upgradi… ) you shouldn’t have experienced any problems. As Matsuoka suggests, the kind of problems that Debian and Mandriva have recently discovered in their base systems can best be solved by constantly reviewing packaging policies and eliminating any circular dependencies that are found.
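Circular dependencies of the sort being discussed are straightforward to detect mechanically. Here is a rough Python sketch of the idea – a depth-first search over a dependency graph that reports the first cycle it finds. All the package names here are invented for illustration.

```python
# Minimal sketch: detect circular dependencies in a package graph.
# The package names and dependencies below are made up for illustration.

def find_cycle(deps):
    """Return one dependency cycle as a list of package names, or None."""
    WHITE, GREY, BLACK = 0, 1, 2          # unvisited / in progress / done
    color = {pkg: WHITE for pkg in deps}
    stack = []

    def visit(pkg):
        color[pkg] = GREY
        stack.append(pkg)
        for dep in deps.get(pkg, ()):
            if color.get(dep, WHITE) == GREY:      # back edge: cycle found
                return stack[stack.index(dep):] + [dep]
            if color.get(dep, WHITE) == WHITE and dep in deps:
                cycle = visit(dep)
                if cycle:
                    return cycle
        stack.pop()
        color[pkg] = BLACK
        return None

    for pkg in deps:
        if color[pkg] == WHITE:
            cycle = visit(pkg)
            if cycle:
                return cycle
    return None

# A made-up base system with one cycle: libfoo -> libbar -> libfoo
deps = {
    "app": ["libfoo"],
    "libfoo": ["libbar"],
    "libbar": ["libfoo"],
    "libbaz": [],
}
print(find_cycle(deps))   # the libfoo/libbar cycle
```

A packaging-policy check like this could run over a whole repository and flag cycles before they ever reach a release.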
Matsuoka also writes: “I believe that Debian’s policies and inner workings are still the best among Linux distributions, but that must be continually examined and improved.”
takes the problems apt apparently has as a starting point to discuss the future of package management.
And you should take the lead. Of course, no package management system is perfect, and apt isn’t perfect either. But there are some that are Good Enough, and apt is most certainly one of those. Besides, a distro with more than fifteen thousand packages in its repositories will eventually have problems, some of which others probably don’t. Upgrade problems? Well, please, how many major release upgrades have either of these complaining guys performed on other distros? If some, or many, then please, tell me they didn’t have some issues there. Oh, they did. Then why do we end up singing over the undug grave of Debian, again? Go get some clue.
All operating systems use shared libraries, it’s not possible to have acceptable performance without them.
I think the main problem here is that the Linux community never developed any concept of a platform. That would have required cross-distro co-operation and community cohesion that just didn’t exist back then (and maybe still doesn’t). So these very complex dependency management solutions evolved in order to duck the question of “What can my app depend on?”. The answer, unlike on MacOS or Windows, became “anything”.
So now we have this quagmire where hugely complex programs and algorithms are needed to keep the system sane – it’s apparently beyond the ability of programs like apt these days given the mess that the Sarge release cycle became.
It’s becoming increasingly clear that we need a large, solid desktop platform or base set to reduce the load on depsolvers and help things like autopackage work more reliably. Dumping more and more packages into programs like apt or smart won’t work.
BTW, I recently tried smart. It’s nice! Unfortunately it’s very slow at loading the cache, and it sits at 99% seemingly forever. But it does the job. The UI is not good, like all such programs… do users really care to see the dependency tree before installation? Probably not. But there’s not much that can be done about that short of only using smart behind the scenes.
That’s not what is classically defined as ‘dependency hell’, and it’s not a problem I particularly recognise, having been involved in the development/testing of Mandrake/iva for several years now. Apart from the obvious GTK/QT case, can you perhaps come up with another example of the situation you describe?
FFS, the guy is even praising Debian:
“I believe that Debian’s policies and inner workings are still the best among Linux distributions”
How about reading the article for a change next time, before you start your bitching?
Again, this article is not in any way about bashing Debian; it’s about improving package management.
Would it be easier if KDE and Gnome came up with some type of package management? Is it better that each distro has its own package management? Or should the community as a whole come up with one final solution, so that no matter what system you run, the package manager will be the same? I do think, though, that something like what MS or even OSX has should be done; if OSX can do it, Linux can do it. Also, I really like autopackage, http://www.autopackage.org: developers create the autopackages, and when a user downloads them, with just three clicks it’s installed, no matter what system – rpm, dpkg, tar, etc. autopackage will go on the net and find the dependencies for that particular package, pretty nice. Only problem is you have to have an internet connection.
Would it be easier if KDE and Gnome came up with some type of package management?
I think KDE or Gnome should not be involved in this matter, because KDE and Gnome are used not only on Linux.
Several distro maintainers should meet and discuss the whole thing together, to address the differences between distros’ ways of maintaining packages and dependencies.
There will always be several ways of maintaining packages, due to the nature of OSS. But I hope there will be a common way, to help make Linux more interesting to software companies.
I have been a Debian user for years. Before Sarge came out, I tweaked my Woody installation a lot to keep it updated.
When Sarge was released last week, I backed up everything and made a new install. I think it was a good idea, since my Woody install was so tweaked.
But I was surprised… I got some weird problems with apt/aptitude/synaptic. First, when I did a “test install” just to see if everything would install fine, I installed x-server before installing x-window-system. The result was OK, since x-window-system depends on x-server. And it didn’t install xdm; it was not a required dependency. Good, I hate it and never use it. I always launch X from the terminal.
After some more tests, I was ready to do my real installation. But when it came time to install X, I told myself: let’s install x-window-system, and all the dependencies like x-server will be installed automagically.
But then I got a problem: xdm was now a required dependency. Why? It’s not even required for X to run. I used X for years without having xdm installed. Even Debian didn’t need it in my test installation.
Well, I tried to install an alternative to see if I could remove it. I installed kdm and gdm but no luck, I wasn’t able to uninstall xdm.
OK, I just told myself that I was able to live with that… just a quick change in my run levels and voila.
Now I was ready to install Gnome. But when I picked Gnome in synaptic, it added the whole Mozilla suite as a dependency. Weird; I wasn’t even able to uninstall it without uninstalling Gnome. This was really weird, since the default Gnome browser was also installed and I couldn’t remove it either. Two browsers for Gnome!?
Well you see the problem. It’s a bit annoying and I hope that the Debian team will be able to get around the problem shortly.
Please share your experience with it, maybe someone in here found a solution.
You’ll want x-window-system-core (instead of x-window-system) and gnome-core (instead of gnome).
Thanks, I’m trying it.
“Now I was ready to install Gnome. But when I picked Gnome in synaptic, it added the whole Mozilla suite as a dependency. Weird; I wasn’t even able to uninstall it without uninstalling Gnome. This was really weird, since the default Gnome browser was also installed and I couldn’t remove it either. Two browsers for Gnome!?”
I don’t think this is really a Debian or GNOME problem. Epiphany would ideally depend on a browser engine, and Mozilla has a short-term plan (around 6 months or so) to provide that.
http://wiki.mozilla.org/XUL:Xul_Runner
Currently it depends directly on the Mozilla browser itself, but it should be possible to uninstall both of them without uninstalling GNOME altogether. Check the dependencies list.
Note: This is potentially offtopic
you say yours is better and I say mine is better….
Maybe smart is better, but then again I don’t have any complaints about apt, so why would I use smart?
“what went wrong in sarge’s release”
It took too long, that’s about it…
Breakage – yeah, there probably will be some, but I would think anyone installing an OS released YEARS after their current OS will probably be ready, especially considering how much Linux has evolved and its volatile nature now compared to when Woody was released…
BTW, how is smart on a 133MHz system – still smart?
Wouldn’t it have been easier and more productive to tweak APT instead of starting from scratch?
So – maybe Debian should have waited (at least) one more year, and test all the potential upgrading problems, huh…? Nah… That would truly be a never ending road for sure.
The longer the time periods between releases get, the more changes the software goes through in the meantime, and the more difficult upgrading the whole system gets as a natural consequence (changing dependencies etc.). This is especially true of Debian, with its enormous number of packages, as it tries to make no major changes to the packages of the stable release except security updates.
Debian 3.0 Woody and 3.1 Sarge are just quite different sets of software.
A good solution to avoid such difficulties in Debian system upgrades is to – guess what – have a faster release cycle. When software isn’t too different from the software of the previous release, system upgrades are easier.
Since there is no cost in endlessly discussing these questions, there is very little chance that an agreed solution ever emerges.
Part of the problem is the base principle of Linux distros: a big monster application that does everything. I use it, but I think it’s unsound.
Application developers don’t have a target to develop against. So they choose their libraries and the versions that work best for them, and distro developers try to make it all work in their own framework. The surprising thing is that, in my experience, it works most of the time.
But I can’t help thinking it’d be simpler to have an agreement on what the system components are, and an 18-month agreement on the “current” version of those components.
Then application developers would know what to build against, and could provide missing libraries in their own (also standardised) private location. And they could package things themselves, since they would never have to worry about specifying the versions of system libs their software requires. Then if an application depends on another one, it’s a lot simpler as well.
It’ll become like that as soon as Novell or someone else becomes so dominant that they in effect define what Linux really is. Red Hat, I think, already qualifies on servers. Let’s wait a couple of years and see who wins the prize on the desktop.
…Anyway, the Debian system upgrade tools themselves (aptitude dist-upgrade etc.) are still perhaps the best-working ones in the whole OS industry, AFAIK. That’s why Debian people say that “you only need to install Debian once”.
“A good solution to avoid such difficulties in Debian system upgrades is to – guess what – have a faster release cycle. When software isn’t too different from the software of the previous release, system upgrades are easier.”
And THAT is exactly the plan. THANK YOU, I couldn’t have said it better… excellent post!!!
———————————————————-
Hey, I know, let’s blame every problem on the fact that Debian Sarge took so long… shall we! We need a new package manager because Sarge took so long (dang, already done); hey, we would have world peace if that dang Sarge didn’t take so long…
Go honk your SMART horn someplace else please….
Hey, I didn’t try to troll, or to just moan about the long time that Sarge took… Besides, I’m usually quite happy with the way the Debian project handles things (so much so that I’ll probably switch back to Debian after my brief period of using Ubuntu as my main desktop for a few days now. I think that Debian’s package management works better and there are not so many strange dependency problems as I’ve experienced with Ubuntu).
But, back to your sarcastic comment, my post, and the subject. Just consider this example: which would be easier, to upgrade from Windows 95 to Windows 98, or from Windows 3.1 to Windows ME (they all share the same base)? It is just plain common sense that upgrades get more difficult the more different the software gets, and developing a fool-proof upgrading system gets more difficult too. Even the smartest of package managers might not be able to handle all the various dependencies of various different systems properly, if a system is to be upgraded to something completely different from the previous one. That is just a major point in system upgrades.
dukeinlondon, you’re pointing out the problem with linux in general in that there is no linux os (there are redhat and novell and debian, etc… os).
Linux is pretty much unique in that it has “distros”.
I think the problem is that Linux grew up in such a haphazard fashion that, while everybody was busy doing their own thing, nobody came together and figured out what should be common to all Linux distros and in which ways distros can go off on their own.
It wasn’t sarcasm! Sorry if it came across that way…
The top part was a real THANK YOU, I couldn’t have said it better; my post is right above yours and you said it so much better than me!
The bottom of my post was sarcasm directed at anyone who thinks that somehow apt is broken, that that is what caused Sarge to take so long, and that now we should use SMART. Sounds like they used the long release as an excuse to say apt is broken and then to tout their product…
Now, about you using UBUNTU: you need to go to the closest Debian repository and say three hail-Debians to get right! And don’t stray again…
🙂
Smart might be the future, but I have tried it in Debian and I feel that apt and aptitude still work (a lot) better. Besides they are being developed all the time.
Apt/aptitude can easily handle mixed systems’ dependencies; Smart can’t. Smart is unaware of apt-listbugs. Smart doesn’t have the equivalent of “apt-get upgrade”, only “apt-get dist-upgrade”.
However I tried it in Mandriva and it seemed to work very well.
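For what it’s worth, the “upgrade” vs “dist-upgrade” distinction mentioned above boils down to whether the resolver may pull in packages you don’t already have. A toy Python sketch of that rule follows; the package names, versions and dependencies are made up, and real apt of course also handles removals, conflicts, pinning and more.

```python
# Sketch of the apt-get "upgrade" vs "dist-upgrade" distinction
# (simplified; all package names and versions are invented).

def plan(installed, candidates, mode):
    """installed: {name: version}; candidates: {name: (version, new_deps)}.
    Returns (upgraded, newly_installed, held_back)."""
    upgraded, new, held = {}, set(), set()
    for name, (version, new_deps) in candidates.items():
        missing = [d for d in new_deps if d not in installed]
        if missing and mode == "upgrade":
            held.add(name)           # "upgrade" never installs new packages
        else:
            upgraded[name] = version
            new.update(missing)      # "dist-upgrade" pulls in new deps
    return upgraded, new, held

installed = {"gnome-panel": "2.8", "nautilus": "2.8"}
candidates = {
    "gnome-panel": ("2.10", []),            # straightforward upgrade
    "nautilus": ("2.10", ["libbeagle"]),    # needs a brand-new package
}
print(plan(installed, candidates, "upgrade"))       # nautilus is held back
print(plan(installed, candidates, "dist-upgrade"))  # libbeagle gets added
```

This is why “upgrade” is the conservative choice on a server: it can never grow your installed set behind your back.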
to metic:
You said: “A good solution to avoid such difficulties in Debian system upgrades is to – guess what – have a faster release cycle. When software isn’t too different from the software of the previous release, system upgrades are easier.”
I totally agree, and think this is something that no one realizes. I mean, I would love to see someone upgrade from an old Mandrake that is a few years old to the new Mandriva and see how well it does! So THANK YOU, I couldn’t have said it better… excellent post!!! And since Debian is working/has worked on this very problem, here’s hoping it is all worked out now, along with volatile packages being updated as well…
———————————————————-
To everyone on the “blame apt” wagon:
Hey, I know, let’s blame every problem on the fact that Debian Sarge took so long… shall we! We need a new package manager because Sarge took so long (dang, already done); hey, we would have world peace if that dang Sarge didn’t take so long…
Go honk your SMART horn someplace else please….
I stopped using Mandrake after 10.0. Just try to upgrade Mandrake 10 Community to any other Mandrake release, 10.1 or 10.2 – I dare you.
Or try to install the current Nvidia drivers in 10.1
The first command when trying to update Mandrake is (urpmi urpmi), to make sure you get one that works.
Debian is a lot less fuss and bother than anything Mandrake or Mandriva has released after 9.2.
Yes my information is dated because I just gave up on Mandrake.
Mandrake loves to reinvent the wheel… I used to try to keep up with all the GUI changes each release, and trying to figure out which tools actually worked and which didn’t and so forth became a nightmare… I spent more time trying to figure out how to correct the problems that the GUI tools made, or the config files they overwrote when I made changes that I needed…
Then I LEARNED config files and fixing stuff the “hard” way, dropped those GUI tools, and now I am a happy Linux user!
Besides Debian’s apt tools (I especially like command-line aptitude), another good package manager is Pacman, used by Arch Linux and Frugalware. Though it may not be as advanced as the equivalent tools in Debian, the simplicity of package management in Arch and Frugalware is also part of its strength. It is, for example, relatively easy to make one’s own working packages from source.
Many of the advanced source package management tools, like Gentoo’s Portage or the BSD package tools, are quite good too.
I read the Smart article, and although the idea seems quite promising and interesting, Smart feels quite complex too, and something very complicated usually tends to bring its own set of problems.
“The first command trying to update Mandrake is (urpmi urpmi) to make sure you get one that works.”
Except, since 10.0, urpmi has automatically updated itself, rpm and related components before updating anything else.
“Debian is a lot less fuss and bother than anything Mandrake or Mandriva has released after 9.2
Yes my information is dated because I just gave up on Mandrake.”
Forget your information being dated, and learn to read. Claudio is new to Mandriva; he worked for Conectiva, and is therefore providing something of an outside perspective. Also, despite your ‘name’ – “urpmi or apt” – this has nothing to do with urpmi. Claudio is talking about the smart package manager he developed while he was at Conectiva.
I’m sorry, but I fail to see any good future for Smart. To me it’s clear the distro-packaging approach has failures (as does anything), and we need to complement that solution with something else.
What isn’t clear to me is this: if you have something broken, will making it doubly broken solve the problem? I don’t think so, but that’s what Smart does; it just repeats the same problems across different packaging systems (problems which are inherent to the package-store design).
A second problem, and this one is more serious, is that mixing distro packages is definitely a recipe for messing up your entire distro. Distributions make packages to work on their own distro, not on another; they’re not distro-neutral by nature.
If we want a distro-neutral solution, I believe Autopackage is the way to look. No, it won’t replace the native package managers, but it will complement them.
So how do “the recently touted problems” have anything to do with apt? Or with smart? So you are telling me that SMART would have perfectly handled people upgrading a system that is over, what… 4 years old? Can you show me users that are complaining about breakage?
Just seems to me someone wanted to blow a Smart horn and decided on picking a random issue and going “see, see, look at those problems, they could have been solved by my new, beta, not-quite-ready-but-I-will-claim-it-is-awesome new packager”.
Seems to me the “recently touted problems” are about updating a very old system, NOT about cyclic dependencies… which, BTW, I haven’t run into…
man you are on a ROLL!!!! givin you your props, nicely stated…..
“I read the Smart article, and although the idea seems quite promising and interesting, Smart feels quite complex too, and something very complicated usually tends to bring its own set of problems too”
EXACTLY! That’s the problem with most of those great GUI tools: overly complicated, and the problems come with it…
Smart is not designed to mix packages from different distros, and its author never claimed it was:
http://zorked.net/smart/FAQ.html
“Smart is not meant as an universal wrapper around different package formats. It does support RPM, DEB and Slackware packages on a single system, but won’t permit relationships among different package managers. While cross-packaging system dependencies could be enabled easily, the packaging policies simply do not exist today.”
It is not because it is designed to support different package formats that it is suitable for installing packages from different distributions.
From Claudio’s blog:
Recently we found a similar problem in Mandriva basesystem dependencies that has been ignored for quite a long time. Dependency consistency, along with rationalization of insane dependency chains, policy reviews, elimination of cyclic dependencies and overall optimization must be considered with great care if we want a smooth upgrade to Mandriva Linux 2006. Otherwise chaos will reign and we’ll be engulfed by the evergrowing entropy that grazes on rapidly evolving operating environments.
It about says it all! I think urpmi in the Mandrake 10.1 that I currently use is fundamentally broken (I found far fewer problems with 9.1). I have frequently encountered dependency hell. Sometimes I have been able to solve the problem by searching out a long series of RPMs on the web and empirically finding the right order to install them.
I recently attempted to install Mono on my system via urpmi; the cyclic dependencies became so impossible that I gave up on installing it. I thought of compiling it, as there are of course some interesting new Mono apps. But I held off, as I am currently teaching myself Java and have no immediate intention of learning C# and the Mono platform. Anyway, GCJ installed via urpmi without a problem.
Let us hope that Claudio, now being part of Mandriva, will help overcome this sort of problem in future releases.
PC-BSD’s .PBI is the best package manager I have ever seen, end of story. Just like Windows, you are presented with a nice “Welcome to the <app name> installation…” message; press Next a few times and that’s it. Now honestly, that IS something. You also have a dialog similar to the Add/Remove Programs dialog in Windows. Extremely easy to use; there is also a “MyPrograms” folder where all the programs are installed in their own folders.
@joe: Thank you… ;-D
Now while I’m on a roll, here’s yet another point of view to this Smart thing:
Why is software updated? Often because of security fixes. Now, let’s say that I want to install some new net-related software on my server with Smart, and it – in order to fulfill some other dependencies – then wants to *downgrade* something that has had important security fixes… Would Smart be smart enough to understand what should not be downgraded? Maybe it should also include some smart mechanism to understand that at least important security upgrades should not be downgraded to less secure versions. But that again would, of course, make Smart ever more complicated. Feasible? I don’t know; maybe, maybe not.
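The “don’t downgrade past a security fix” rule described here could at least be sketched as a version floor the resolver consults before accepting a downgrade. A toy Python version; the package versions are invented, and real package managers compare versions in far more elaborate ways (epochs, release tags, letters in version strings, etc.):

```python
# Sketch: a "security floor" check a resolver could apply before
# accepting a downgrade. All version numbers here are invented.

def parse(v):
    """Very naive dotted-number version parser, e.g. '1.0.4' -> (1, 0, 4)."""
    return tuple(int(x) for x in v.split("."))

def downgrade_safe(installed, proposed, security_floor):
    """A downgrade is acceptable only if it stays at or above the
    last version that carried a security fix."""
    if parse(proposed) >= parse(installed):
        return True                      # not a downgrade at all
    return parse(proposed) >= parse(security_floor)

# A dependency chain asks to downgrade a lib below its last security fix:
print(downgrade_safe("1.0.4", "0.9.8", security_floor="1.0.2"))  # refused
print(downgrade_safe("1.0.4", "1.0.3", security_floor="1.0.2"))  # allowed
```

The hard part in practice is knowing which versions carried security fixes, which is metadata the repositories would have to supply.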
An interesting project anyway. It might also suit Mandriva quite well… (sorry guys, but many Mandriva users often seem to be just the kind of people who like to mix various package sources: install this RPM package from here, that non-official RPM package from over there, compile yet another program from source, and so on…)
Heck, my Debian already does debs, rpms and so forth… if I am crazy enough to mix them, which sometimes I AM!
I doubt you can make a PERFECT package manager, but APT is about as close as I have seen… and it is the one I have invested time in learning, so I wish SMART the best of luck on Mandriva, but don’t waste my time with it. And I don’t appreciate the fact that Debian’s long Sarge release was used as the focal point for extolling the virtues of Smart.
PC-BSD is interesting, but it would certainly be bloaty; still, it may be the way to go… But imagine if you had xyz1.2, xyz1.3, xyz1.4 and xyz1.5, and a security flaw was found in xyz: now you have four exploits sitting there and four updates to try to get through… BUT still, it may be the logical choice for easy independent software installs! Not sure though…
Oh, and I no longer use Mandrake because of dependency issues, not being able to install a large number of packages from a release without conflicts, and those GUI tools that always vex me so, as I said… all the best wishes for SMART on Mandriva…
Oh, and I don’t think working at Conectiva gives you an outside ‘perspectiva’; considering how easily they merged, it would seem that they were very similar distros that had a lot in common, and a lot of the issues revolve around dependency issues in RPM-based distros…
all package managers != apt-get
What about Portage or Arch’s package manager? They are still going.
Unless people stop creating new formats every 10 minutes.
One needs to figure out a whole bunch of protocols and metadata systems – basically, work out what information a piece of software needs to convey, and protocols to convey it securely such that it can be installed, updated, migrated, what have you.
This also requires configuration using a standard; Elektra is a candidate.
Generally, XML offers a lot of these avenues, but silly people just start going on about slowness, then bring up s-expressions and other fringe stuff. The fact of the matter is that for capturing data in a very neutral way, with validation of both data and format for a variety of purposes, XML is the way to go. Not to mention that, thanks to its ease of transformability, it’s patchable! Gasp – imagine etc-update in Gentoo and all the other hacks for config file updates going away. *gasp*
Shared libraries are genius; the lack of protocols for a programme to clearly identify which ones it would like to use (version number, build types and features of the build) is the issue. Zero Install and Gobo hold pieces of this puzzle: the first uses URIs to resolve names, and the second installs everything in its own directory, making identification and removal much easier. Not to mention you just have to traverse the file system to work out dependencies, so long as the necessary info is stored.
Statically compiled binaries…
Dependency hell exists because we’re dynamically compiling everything against libraries that don’t maintain forwards and backwards binary compatibility with themselves. Applications must either link directly to their required version of the library, with that library included in the application package, or compile the libraries directly into the application.
Statically compiled binaries are a good idea, but the execution time is slower. In my own personal opinion, it is worth it. I like the idea.
People proposing static compilation have an obviously limited set of mental test cases in whatever thought experiment they carried out.
You NEED a scalable solution. In fact, it’s outright possible to have a solution that scales from low-end desktops and PC-like appliances all the way to massive networks and their populace.
If you don’t, you have many copies of libraries that can have bugs or other security issues, and you push the problem of updating downstream. This is exactly what you don’t want. In fact, a package management system should be able to address this.
Imagine you’re on a computer. You’ve recently told your machine to update; it downloads all that it needs and updates various pieces of software, right? NO, it’s not that simple. Some software will rely on a specific version of a library, and you could well break it. Then again, you might not. A package manager could give you the option of forcing an upgrade along a variety of axes: per application, per library, per update. Mind you, the level of control someone chooses to have will depend on the interface provided. The fact that one can have that much control is what’s desired.
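That per-application view of an upgrade is easy to sketch: before installing a new library version, invert the question and ask which apps pin a different exact version of it. A toy Python example – every app name and version pin here is hypothetical:

```python
# Sketch: before upgrading a library, list which installed applications
# pin an exact version of it and would therefore need attention.
# All names and version requirements here are hypothetical.

def affected_by(lib, new_version, apps):
    """apps: {app: {lib: exact_required_version}}. Returns the apps that
    pin some version of `lib` other than the one being installed."""
    return sorted(
        app for app, pins in apps.items()
        if lib in pins and pins[lib] != new_version
    )

apps = {
    "mailer": {"libgtk": "2.4"},
    "editor": {"libgtk": "2.6"},
    "player": {"libxml": "2.6"},
}
print(affected_by("libgtk", "2.6", apps))   # only the mailer is at risk
```

A package manager with this information could then offer exactly the per-application choice described above: hold the library back, keep both versions, or force the upgrade and warn.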
The thing about open source is avoiding lock-in, but we’re locked into distros. Where the hell are the open protocols and formats that are truly open? Instead, we have half-baked ideas which are fracturing things.
In a perfect Linux setup, IMHO, the file system should be like http://www.pogolinux.com/ with human-readable directories and programs having their own separate folder instead of being spread over the system. For user/graphical applications, app containers should be implemented.
sorry — meant http://www.gobolinux.org/
bsd is not 4 me.
while the gobolinux idea is nice for desktop use im not 100% sure that its usable for a server system as it may undermine some of the security ideas in unix. still, i may be misstaken if one say trow selinux into the mix…
the one thing that gobolinux do right tho is to allow for multiple versions of the same library to live in the same enviroment.
this again tho is most likely best for desktop and not server as you may be compriomising security by letting some apps exist that base themselfs on the old lib.
still, its one step up from the windows way of having 101 copys of diffrent libs/dlls all over the place.
basicly, dll hell and dependency hell is two variations on the same problem. and not even apple have solved it, they have just hidden it away by putting all files of a app into a special kind of archive file (or whatever you want to call it) so that you have all the “dll’s” in one place. you still have to find every last app that uses a dll version and upgrade them if you can rather then just upgrade one dll and be done with it.
The trick is to make apps smart enough to find the highest version of a lib they can use without failing.
Then we need an OS that understands when a version of a lib is no longer needed, so it can discard it (or at least shelve it until the user installs an app that can’t live with any of the existing versions).
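The “highest version that still works” idea is roughly what ELF sonames already encode: an app linked against major version N of a library will accept any N.x. A minimal sketch of that selection rule, with made-up version strings and a hypothetical helper name for illustration:

```python
def pick_library(installed, required_major):
    """Return the highest installed version whose major number
    matches what the app was linked against, or None."""
    compatible = [v for v in installed
                  if int(v.split(".")[0]) == required_major]
    # Sort numerically, not lexically, so "1.10" beats "1.9".
    compatible.sort(key=lambda v: tuple(int(p) for p in v.split(".")))
    return compatible[-1] if compatible else None

installed = ["1.2.0", "1.9.3", "1.10.1", "2.0.0"]
print(pick_library(installed, 1))   # -> 1.10.1 (highest compatible 1.x)
print(pick_library(installed, 3))   # -> None (nothing suitable)
```

This is only the easy half of the problem, of course: it assumes library authors bump the major number whenever they break compatibility, which is exactly the discipline that often fails in practice.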
If I have 15 apps open and they all use different versions of xyz, that is some major bloat…
… IF the OS and/or the app is smart enough to realize when you need a different version, then it MIGHT work, but that is a lot of IF and MIGHT, and it still has drawbacks…
To me the whole thing would have to be a totally NEW OS. To do this PROPERLY on Linux would require one heck of an effort: a VERY smart package manager and a whole new type of package loaded with info about which versions could/should be used, and even then it is a matter of arranging all that so the OS/app would know about the different versions and so forth…
—-my main question—
But how is an app to know to use a version of a lib that is released later than the app itself? Your old app is using an old version of a lib when it SHOULD be using the newer one, but it doesn’t know to look for the newer version.
————————
From what I recall, an app doesn’t live on its own: to access a lib, the lib has to be listed in a list that the kernel (well, the dynamic linker) uses to find a partner for the app binary when it makes a call into an external library.
So what is needed isn’t so much a new OS or kernel as a new way to build library and app binaries.
In the .NET framework, Microsoft attempts this by giving libraries versioned interfaces (or something like that): the app binary records the highest interface version it can use, and then the runtime goes looking for a library that provides that version.
Package managers don’t have to become smarter as such; they only have to learn to allow multiple versions of the same library to coexist, and also to allow a reverse-dependency check so the user can see when a library is no longer needed and can be safely removed. (If you are mixing source installs with packages, that check is only partially helpful, unless you take the time to generate packages from the source and install those instead.)
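The reverse-dependency check described here boils down to a simple graph walk: given which installed packages depend on which libraries, a library with no remaining dependents is safe to offer for removal. A sketch, with invented package and library names for illustration:

```python
def orphaned_libs(depends_on, installed_libs):
    """Return the installed libs that no package still depends on."""
    needed = set()
    for deps in depends_on.values():
        needed.update(deps)
    return sorted(installed_libs - needed)

depends_on = {
    "editor":  {"libfoo.so.1"},
    "browser": {"libfoo.so.1", "libbar.so.2"},
}
installed_libs = {"libfoo.so.1", "libbar.so.2", "libold.so.0"}
print(orphaned_libs(depends_on, installed_libs))  # -> ['libold.so.0']
```

This is essentially what “autoremove”-style features in modern package managers compute; the caveat in the comment above stands, since anything installed from source is invisible to the dependency graph.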
Basically, this is an area that has been more or less stagnant since the idea of dynamically loaded libraries appeared. Why? The current way works more or less nicely in a server environment, where the installation and removal of everything is highly controlled. But in a desktop environment it breaks down fast.
The biggest problem for the development of any new system today is all the legacy installations. The computer world has built up a nasty inertia, and many a company has been crushed by the G-forces generated when trying to change the direction of development.
But wouldn’t the “list that the kernel uses” always have to be updated? And what happens if there is a security flaw in the newest library: how do you tell it NOT to use the newest one? I guess you could “tell” the OS that the newest one is a security risk and it would automatically handle any app that tries to use it. But you would still seem to have the problem that an old app will only “ask” for the old lib version, because it doesn’t know a newer one exists.
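One answer to the “flawed newest version” question is exactly what is suggested here: the distributor ships a blacklist of known-bad versions, and the resolver simply skips them, so apps transparently fall back to the best remaining version. A hypothetical sketch (the `resolve` helper and version strings are made up):

```python
def resolve(installed, required_major, blacklist=frozenset()):
    """Pick the highest non-blacklisted version with a matching major."""
    ok = [v for v in installed
          if int(v.split(".")[0]) == required_major and v not in blacklist]
    ok.sort(key=lambda v: tuple(int(p) for p in v.split(".")))
    return ok[-1] if ok else None

installed = ["1.2.0", "1.4.1"]
print(resolve(installed, 1))                       # -> 1.4.1
print(resolve(installed, 1, blacklist={"1.4.1"}))  # -> 1.2.0 (fallback)
```

Note this sidesteps the “old app doesn’t know a newer version exists” problem too: the app never names an exact version, it only states a compatibility requirement, and the resolver does the choosing.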
hmmm…
“to access a lib, the lib has to be listed in a list that the kernel uses to find a partner for the app binary when it makes a call into an external library”
So how do you propose this gets handled?
What list? How is the list updated? Who provides the list?
What kernel handles this? How do you create a method to handle it? Is every app rewritten to send messages to the kernel describing what it is and what it needs every time it starts? How resource-intensive would that be?
I just see it as being different, not necessarily better; just different problems… Even on a desktop system I think it would still be troublesome, though possibly less troublesome than it is now…
These *-core packages are nice once you know how they work. But saying “I am going to uninstall GNOME” is a bit misleading and confusing for the user: “Huh? I don’t want to uninstall GNOME! No!”
Argh, I’ve replied to the last comment on the first page, thinking it was the last comment on this article. /me stupid
DOH! at boto
Yes, there will be a performance hit, but it may well be worth it if it can kill the DLL/dependency-hell problem, and it will only happen upon loading/first access of a library (and once a lib is loaded, I suspect it isn’t unloaded until every app that uses it has stopped).
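The load-once behavior suspected above is indeed how shared libraries generally work: the in-memory copy is reference-counted, so the cost is paid on first load and the lib stays resident until its last user exits. A toy sketch of that bookkeeping (the `LibCache` class is invented for illustration, not a real loader API):

```python
class LibCache:
    """Toy reference-counted cache of loaded shared libraries."""

    def __init__(self):
        self.refs = {}  # lib name -> number of apps currently using it

    def load(self, lib):
        # First load would map the lib into memory; later loads just count.
        self.refs[lib] = self.refs.get(lib, 0) + 1

    def unload(self, lib):
        self.refs[lib] -= 1
        if self.refs[lib] == 0:
            del self.refs[lib]  # last user gone: drop the mapping

cache = LibCache()
cache.load("libxyz.so.1")
cache.load("libxyz.so.1")            # second app, same lib
cache.unload("libxyz.so.1")
print("libxyz.so.1" in cache.refs)   # -> True (one user remains)
cache.unload("libxyz.so.1")
print("libxyz.so.1" in cache.refs)   # -> False (unmapped)
```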
If you don’t like it, then turn it off and deal with it yourself; that’s what distros like Gentoo and Slackware are for
🙂
Debian for me; I am old and set in my ways now, I reckon. Libranet as well… OK, even Xandros is cool.
Though using Vector Linux for a bit has me interested.
Nice thread, though… hate to see it go on the back burner.