The Smart Package Manager project has the ambitious objective of creating smart and portable algorithms for adequately solving the problem of managing software upgrading and installation. The tool works in all major distributions, and will bring notable advantages over the native tools currently in use (APT, APT-RPM, YUM, URPMI, etc.).
Looks interesting, I think I’ll give it a try on my Slackware box.
Yeah it looks interesting. Development of this was funded by Conectiva, the same guys who gave us apt-rpm and synaptic… The only thing is, where do we get it from? I couldn’t find any download information.
http://zorked.net/smart/
This looks like a hack to try and make bad setups with multiple incompatible repositories work. It doesn’t look like it would bring any benefits at all on a properly set up system – one where all the repositories are verified to work with each other (for instance, a Debian machine using official Debian sources, or a Mandrake machine using official Mandrake sources). So maybe this thing can postpone the inevitable demise of a system that uses multiple incompatible repositories, and packages that weren’t created for it, by a few months. No thanks, I’ll pass and keep using my official sources.
First we make many bad package managers, then we have to write a tool to manage all these bad package managers. Then we make some more package managers and enhance these tools to manage them and interoperate with the previous bad package managers, and then… the code base gets huge without adding much value…
Who needs all this? Can’t there be one standard way of doing this? Can’t people give up their egos and improve the standard?
Hey, if I write software for Linux, I now have to package it in N different formats and test it on M different distributions. Holy crap! It’s a sheer waste of development time in this mad, mad Linux world.
I have been using FreeBSD and Slackware for some time and have tried some other distros too.
I also developed a script, a front end, that creates Slackware and FreeBSD packages automatically using the native system package manager.
After an email to Pat and some digging, I stopped improving it (even though I still use it). Why? Because the problem of packaging software has less to do with its creation/dependency tracking and more to do with conformity with the whole system and the choosing/setting of the options and scripts they all need.
That’s why we have so many complaints of “it’s not working on my DistroLinux x.y.z!”. When the maintainers of a distro choose compile options, package options, patches to be applied, directory locations and init scripts, they pretty much define its behaviour.
So either all distros agree to store all this information in a standard way that can be easily recovered, or it will be better to stick with the packages that come with whatever distro we picked and live with our choice.
Of course, I can be completely wrong.
Exactly. This package manager looks like it will make it a little easier to try and shoehorn Fedora packages into Mandrake, or vice versa, or use fifteen thousand unofficial yum sources together, but that’s just never ever going to be a viable way to keep a stable working Linux system. A distribution is a *distribution*, after all – a collection of software that is engineered and tested to work together. Something like this is the solution to the wrong problem.
wolf – or you could write your software, release the source code, and let the distributors package it themselves. They’ll do a better job than you will, and you’ll have more time for golf…
A HUGE waste of time. Instead of trying to fix the mess that is Linux package managers, how about cooperating on one distro-neutral packaging system?
Seriously, Autopackage is the only project that stands a chance of making a major impact.
Hey, just noticed that Mike H wrote some notes on making software installation in Linux easier.
http://lists.sunsite.dk/cgi-bin/ezmlm-cgi?21:mss:844:blhclagmoleeje…
Wow, finally someone with a clue.
Instead of trying this, give http://www.pkgsrc.org a shot. It works on most Unix-like systems and has a wide range of supported packages (about 5000). Thanks, NetBSD, for making my life easier.
Looks like something Progeny would use.
SMART isn’t just about unifying packages from various distros. A large part of the work is about improving the fundamental algorithms driving the package manager. The case studies posted comparing SMART to APT and YUM are very illustrative. I agree that a better package manager can’t do anything about packages from different distros that are fundamentally incompatible, but I think improving the basic algorithms could be very good for stuff like “unofficial” repositories of Debian or Fedora packages.
The case studies precisely fit my point, actually – they’re all doing things that would never crop up if you’re using a proper repository system in the first place. They’re all situations that would only occur if you have fifteen zillion repositories providing, say, different versions of the same package, or different packages for the same libraries.
Using unofficial repositories is just asking for trouble. It’s one reason I’d never use Fedora – there are too few packages in the main core, which forces you to go outside for vital stuff. With my setup I know that every single frickin’ thing in my system directories is from a Mandrake Cooker package built against *the same packages I have on my system*, tested to work with them, and signed by MDKsoft. Thanks to that, I haven’t needed to do a fresh install on my laptop system since mdk v9.0. No matter how smart the dependency resolution, if you’re going to rely on multiple ‘unofficial’ repositories you’re never going to achieve this.
Totally agree with you. Packaging non-vital stuff is one thing, and can be easily accomplished; trying to supply something that is very basic and customized to work properly, taking into account the distro’s specificities, is quite another.
If they want to supply a good tool for the first kind of software, OK, it can be good, although I will stick with the official packages whenever they are available anyway.
I disagree. The latter two issues could pop up in a single repository, and, though rare, I’ve seen similar things happen in Debian. Also, I think one of Debian’s greatest strengths is its ability to handle multiple repositories. If all software distributors had an APT repository, software management would become truly simple. For example, you’d never have to check a website for a new version of your favorite program again. I think it’s foolish to try to ignore that powerful aspect of APT.
“The case studies precisely fit my point, actually – they’re all doing things that would never crop up if you’re using a proper repository system in the first place”
The problem is that a central repository goes against the distributed nature of software in Linux. Check the Autopackage link Mike posted above.
Perhaps we are looking at this from slightly different perspectives. I’m not talking about regular stuff, but system packages, and why you should rely on certified ones. Even KDE, and with a little more effort GNOME, can be built without too much trouble (the first one I have done).
But one day just try to upgrade glibc, gcc, heimdal, openssl, openssh, pam, samba, postfix, netfilter, isc-bind, isc-dhcp, openldap and other system packages from non-official repositories. Chances are that you are looking for trouble. There are too many settings gluing them all together; in my opinion it is foolish to try to re-certify the whole dependency set yourself.
Again, my point is, many of the problems they are trying to fix are already corrected in the native package managers they listed. If something doesn’t exist in the official repositories I could use their tool, but only in that case.
If you like broken binaries, irreversible package breakage, etc., then this is the tool for you. It’s clear this *cannot* work. Take one package from package system X and repo Y. How does Smart know that it doesn’t clobber package P from system X’ and repo Y’? IT DOESN’T and CAN’T. If you are going to tell me this can already happen in yum or apt when you have two repos with differently built versions of the same package – well, of course, but the chance of damage is much, much higher once you begin mixing entire packaging systems.
Better to find ONE apt or yum repo that works for you and use it EXCLUSIVELY – preferably one that supports kernel upgrades.
ANYTHING else and you are literally rolling the dice with each update.
…make me glad that I’m not using apt or urpmi or any of them.
Portage suits me; it’s not perfect, but I’ve never been confronted with a situation like those.
Some parts of this new system do look very slick, but it still looks bloody complex – that “New Channel” dialog is enough to put me off!
No, the world wouldn’t suddenly become a magical utopia if everyone used the same package system. I’m sorry, but that’s just a myth. The package system and the manager don’t matter to this kind of thing; what *matters* is the level of similarity between the system on which a package is built and the system on which it is installed. If Software Author Bob writes his package and builds an apt package of it on a buildhost running Mandrake 9.2 with an updated GNOME from Random Third Party Source X, an updated version of Evolution from Random Third Party Source Y and some other stuff from Random Third Party source Z, is that package going to work well on Software User Joe’s system? Software User Joe has a system based on Debian testing with GNOME updated from Random Debian Source F and a new, I dunno, gaim from Random Debian Source G, maybe he changed some other stuff.
The package is an apt package. Debian will do its magic and try and install it. Is it going to install? Probably not. Is it going to work? Hell no. Will this thing solve the problem? Well, maybe it’ll help it install…
Portage doesn’t ‘work’ because it’s a magical package manager; it works because of the sources. (Plus, building things from source generally makes them *more* tolerant of differences in the buildhosts – if you install a binary RPM for one version of Mandrake on another, it may well not work. Get the .src.rpm and rebuild it and you have more of a chance.) As I keep saying, the package manager is *not* the problem here. The problem is how you handle your sources. If you were to pick a Linux distro and use its package manager only with the correct official sources for that distro, you’d have no problems.
@anonymous (61.95): what crap. As I said, the *definition* of a distribution is that it’s a centrally-built collection of software designed to form a cohesive whole.
“As I said, the *definition* of a distribution is that it’s a centrally-built collection of software designed to form a cohesive whole.”
People don’t care. They want the distro to be a platform, not an appliance. I don’t want to rely on my particular distro to package the world; if I wait for that, the result would be something like Debian: big distro, too slow moving.
That’s why Windows is so successful: get a binary anywhere and it will install properly. Thank God they got rid of DLL hell.
Portage is the best package system around
Portage has its problems. Don’t be a zealot
What we should have is a way for a package system to define variable folders for files, so that if distro X uses folder xx for /usr while distro Y uses yy, a config file can set this, and when a package is installed its paths aren’t defined from / outward but from the variable folder outward, if you get what I’m saying.
As for the binary: merge source and binary installs into one package, so that if everything else fails the binary can be compiled from source without the user having to find a separate source package. Yes, this means bloat, but you can’t have both flexibility and a lean system at the same time…
Basically a merge between rpm/dpkg and Gentoo’s system.
Hell, with the right variables set distro-side, the package may even come with different binaries for different distro releases so that it can at least fit those. Again, bloat, but that’s flexibility for you…
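The variable-folder idea above can be sketched in a few lines. This is purely an illustration, not part of any real package format: the variable names (BINDIR, DOCDIR), the example distro mappings, and the manifest entries are all invented. The point is just that a manifest references named locations, and each distro supplies its own mapping in a config file.

```python
# Hypothetical sketch of per-distro path variables; all names are invented.
from string import Template

# Each distro would ship its own mapping (e.g. in a config file).
DISTRO_X = {"BINDIR": "/usr/bin", "DOCDIR": "/usr/share/doc"}
DISTRO_Y = {"BINDIR": "/usr/local/bin", "DOCDIR": "/usr/local/doc"}

# The package manifest never hard-codes a path from / outward.
MANIFEST = ["${BINDIR}/foo", "${DOCDIR}/foo/README"]

def resolve(manifest, locations):
    """Expand the location variables into concrete install paths."""
    return [Template(entry).substitute(locations) for entry in manifest]

print(resolve(MANIFEST, DISTRO_X))  # → ['/usr/bin/foo', '/usr/share/doc/foo/README']
print(resolve(MANIFEST, DISTRO_Y))  # → ['/usr/local/bin/foo', '/usr/local/doc/foo/README']
```

The same package body could then be installed on either layout; only the mapping changes.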
Gentoo already provides the features you’re asking for. For most large packages like GNOME, KDE, and OpenOffice, you can “emerge” the binaries. You don’t have to compile everything.
Unfortunately, binary packages don’t exist for everything, so you do have to compile some packages (the most noticeable one being the Linux kernel).
It has also done a lot of work on parallel “profile” environments, sandboxed installs, and architecture support for alpha, amd64, arm, hppa, ia64, m68k, mips, ppc, ppc64, ppc-macos, ppc-od, s390, sh, sparc, x86, x86-obsd, x86-fbsd, x86-od.
If binary packages were included for most packages, ebuilds were digitally signed, and a few other niceties were added, Gentoo could gain a much larger mindshare. VidaLinux is trying to do some of this work, but hasn’t succeeded in building a big enough user community. And unless there’s an ebuild that can allow you to emerge LSB 2.0 support (or even LSB 1.1 support), Gentoo is a non-starter for many companies. (Yes, I hear the replies from the Gentoo community that the LSB goes against the Gentoo philosophy and that it’s unnecessary for source-based systems, but there are many “unnecessary” things in the world that exist for the sole purpose of helping interoperability.)
Personally, I think that we should lock the DEB people, the RPM people, and the Portage people together in a room and not let them leave until they come up with a common package format and package distribution policy. Each packaging format has its advantages and disadvantages, but I don’t think there’s any technical reason why a common binary/source system couldn’t be created.
This wouldn’t solve the foreign repository issue, but it would at least make it more manageable since we could set up the policy to use binaries from the “official repository” of a distribution and source files from “foreign distribution repositories” (so that they can adapt to the configuration of our distribution).
IMO, the “Smart package manager” is trying to solve the wrong problem. It will eventually be needed, but unless the “foreign distribution repository” issue is solved, all Smart PM will do is to eventually get you into trouble.
heh, DLL hell is alive and well on many reasonably well-used Windows systems. Windows doesn’t have the magic package management bullet, btw. It has static binaries.
This software, although built as another layer on top of the existing package managers, is better than the one you mentioned since it does not need new packages to be built and distributed. That’s a huge saving in packaging time, IMHO.
Don’t even act like it happens a lot. I have used every Windows OS since 3.1 and never had the problem (I test all sorts of software). Why? Because MS sets a standard for each OS. Software makers know/assume the OS will have it, and most of the time it does. XP even detects if a DLL has been altered or removed and reinstalls it. So it’s very rare. I can’t remember the last time I heard someone say they couldn’t install because of a missing DLL – maybe because of a bug. Some software even comes with extra DLLs that it can fall back on if needed. And before you think I’m an MS supporter, you’re wrong: I just switched our 100-and-some PCs at work to Ubuntu (Debian), and got a raise for it. My boss loves saving money. I use Ubuntu and stand by it.
I wouldn’t say it happens a lot, and I would say that it’s absolutely possible to run a Windows system well and have it *never* happen (just like you can run a Linux system well and never end up in RPM hell). But it certainly can and does still happen.
Yes, it does happen, just not as frequently as on an RPM distro. Like I said, I’ve never had it happen, but of course I haven’t tried all software and might not have come across that one program that replaces an outdated DLL and screws up the whole system. One of these days dependency hell will be solved, and that’s one more step towards Linux being mainstream.
You use proper software installation techniques on Windows (i.e., don’t install every piece of crap you come across)? You never experience DLL hell. I use proper software installation techniques on an RPM-based distro (use official vendor repositories, don’t install every random piece-of-crap RPM you come across) and I never experience dependency hell. The future is now.
If you’re using Fedora Core 3, just go to:
http://dag.wieers.com/packages/smart/
try it out and agree that it’s better than apt and yum.
Sure it has more features, but it outsmarts apt or yum even without these extra features. Simple, clean and elegant and this is just the first public release.