“The Smart Package Manager hopes to beat the native package management applications for distributions like Red Hat, SUSE, and Debian at their own game. Still in beta, it has support for most major GNU/Linux package and repository formats, with a modular codebase that hints at further compatibility. Smart introduces many innovative and useful ideas, but its killer feature, with which it purports to excel beyond its counterparts, is the algorithms it uses to select packages and versions that best resolve dependencies and ensure cooperation between the hundreds of applications and libraries on a user’s system.”
Smart Package Manager: a Better Mousetrap
2006-07-14 2:39 pm agentj
IMHO it’s all about the GUI. Unix tools tend to have crappy GUIs compared to Windows. The main thing that sucks about Unix configuration UIs is that they are only a graphical representation of the .conf file of the specified tool (you still need to know how the config file works). It’s worse if the application doesn’t have a native GUI and is just called from the windowed application – it wastes resources, because instead of using a library directly, you have to spawn a child process and mess with pipes, parsing data, and other unnecessary stuff.
I prefer the way apps are installed on Windows. Run an .exe or .msi installer and it will install everything. I hate that you have to care about every god damn version of a shared library. I think it would be better if there were a standard API for the GUI. WinAPI isn’t the best, but it just works and it’s stable most of the time – no matter which Windows version you use, 98 or XP. If new versions of libraries could provide a compatibility layer for old versions, there would be virtually no problem. E.g. I don’t have to worry whether the latest WinAMP requires version a.b.c.d.e.f of library XXX to work, and whether it won’t work if a.b.c.d.e.f-1 is installed. Also, many tools use cryptic names for their functions, e.g. “New channel” instead of “New remote package source” or something more readable.
Back to Smart: I think the GUI has too many options for the average user. I would prefer to have just Uninstall/Repair buttons. Installation from the package manager itself sucks; IMO installation should be done just by double-clicking the installer file. Additionally, I would prefer applications installed to something like Program Files – cleaner. Most users just click Start->Programs->program name; they don’t even look at the Program Files directory.
2006-07-14 3:22 pm Sphinx
I’m sorry, but it is *not* a graphical operating system – that is its greatest strength. No GUI required.
2006-07-14 5:13 pm Bending Unit
With this attitude, Linux will stay in the 1960s.
2006-07-15 3:30 pm Sphinx
It would have to be there at least once before it could stay there.
2006-07-16 4:27 pm buff
I use Fedora and once a week I type ‘yum update’ to check for new packages. It is so easy. No GUI required. There is a GUI for yum called yumex but I don’t use it.
The CLI is an advantage. You have to be careful about trying to make Linux exactly like Windows, with a GUI for everything. If users need to learn the Bash shell a little bit, then that is okay with me. Linux is clearly not still in the 1960s. I have transparency effects enabled from X.org’s composite extension, and Vista hasn’t even come out yet! Linux is actually ahead in many areas of technology. Another Vista feature: permissions, user roles, and access controls have been part of Linux from the very first; Vista is only now catching up. The belief that Windows is ahead is a myth. Windows has better marketing, a larger market share, and better driver support, mostly as a result of being a monopoly. There is also much better malware and spyware for Windows. 😉
2006-07-14 5:43 pm Latem
IMHO it’s all about GUI.
Ok, GUIs are important for novice users, and are nice. But sometimes a command-line interface is just as easy, helpful, intuitive, and indeed necessary.
Unix tools tend to have crappy GUIs compared to Windows. The main thing that sucks about Unix configuration UIs is that they are only a graphical representation of the .conf file of the specified tool (you still need to know how the config file works).
Most GUI tools are front-ends to command-line tools and files. In *nix everything is a file, so you can’t really get away from this. Actually, in Windows too, most config GUIs are front-ends to files. It’s all about abstraction. Please name me a config tool where you still need to know how the file works. I use Kubuntu, and I am able to set up everything from the System Settings menu, and trust me, I have no idea how most of those files (e.g. the x.org conf file, audio files, etc.) work. Seems the GUI is working fine for me.
It’s worse if the application doesn’t have a native GUI and is just called from the windowed application – it wastes resources, because instead of using a library directly, you have to spawn a child process and mess with pipes, parsing data, and other unnecessary stuff.
Having the GUI application use a library directly and do everything itself is sometimes a good idea. It could result in a better user experience and overall seem like a better “integrated” tool. But still, having two processes isn’t that big of a deal. IPC is pretty effective and fast; there is a lot of IPC going on in your computer all the time. In fact, it could be argued that separating the GUI and the command line is better design: it separates functionality and purpose. Plus, if the GUI has a bug and/or messes up somewhere, it could be potentially safer and more secure to have two separate processes.
I prefer the way apps are installed on Windows. Run .exe or .msi installer and it will install everything. I hate that you have to care about every god damn version of the shared library.
Then if you are using *nix, you are using the wrong OS, my friend. The Windows system is actually terrible. Think of it this way: you install application MyApp, made by MySoftware Company, which uses library libFoo, made by OtherCorp, statically linked or somehow integrated into their software. All of a sudden, there is a serious flaw or security issue in libFoo. OtherCorp fixes the issue immediately. You are still at the mercy of MySoftware Company to get the fixed libFoo into their application MyApp and release it for you. Until that happens, you have a broken system. On Linux you get the fix ASAP.
I think it would be better if there was standard API for GUI.
What do you mean, really? Qt/KDE is pretty much the standard for the KDE environment, GTK for GNOME. It is pretty easy to stay within the confines of one of these two.
WinAPI isn’t the best but it just works and it’s stable most of the time – no matter which Windows version do you use – 98 or XP.
I am sorry, but WinAPI and MFC are pretty much one of the biggest abominations in computer history. I have to work with this crap every day; trust me, Qt is much better. The reason it works on older Windows is that it’s patchwork upon patchwork from the 16-bit days.
I don’t have to worry if the latest WinAMP requires version a.b.c.d.e.f of library XXX to work and if there’s a.b.c.d.e.f-1 installed, it won’t work.
I already addressed this issue.
IMO installation should be done just by double-clicking the installer file.
Yes, but first you have to purchase it at the store on a CD, or find it on the net and hope you are not downloading a virus.
Or you can open a program, click on which application you want, and it’s downloaded and installed for you. This seems easier to me.
If you really want single install package PC-BSD does it, similar to windows.
It seems that the Smart package folks pride themselves on creating a modular application, but still created an application which offers both a GUI and a command-line interface.
This seems rather silly. Why didn’t they learn from apt, for example? The apt package manager is a great tool and, because it is a pure command-line application, it leaves the responsibility of offering a GUI to whoever wants to do it. Apt enabled projects like Synaptic, Adept, and that excellent addition to Ubuntu, the Adept auto-updater, which are all great tools.
So why be silly and write a less modular application? It seems like a poor investment of work.
2006-07-14 1:04 pm nicholas
Excuse me for appearing stupid, but can you please point out to me the difference between SmartGUI and Synaptic (implementation-wise, that is)?
2006-07-14 3:12 pm CapEnt
Synaptic is a front-end for libapt written in C++; it is not just a front-end for the apt program. In fact, it does not require the command-line apt program at all – Synaptic integrates at quite a low level with libapt itself. So Synaptic uses the standard apt “way” of solving dependencies.
Smart is a new package management system with no relation to apt, although it is compatible with apt repositories. It is also a quite universal system, supporting several types of repositories alongside apt’s; in fact, one of the project’s goals is to unify package management systems. Smartgui is a Python-GTK program (which makes sense, since the core of Smart is in Python), and is fully integrated with that core. Smartgui uses a new algorithm to solve dependencies that can avoid needlessly installing some packages, downgrade packages without breaking dependencies (sometimes automatically downgrading other packages as well)… among other nifty things.
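To make the idea of “solving dependencies” concrete, here is a toy sketch. This is not Smart’s actual algorithm, just a minimal, self-contained illustration of the kind of backtracking search any such solver performs: pick a version for each package, follow its dependencies, and back out of choices that lead to a conflict. All package names and versions below are invented.

```python
# Toy backtracking dependency resolver -- an illustration only,
# NOT Smart's real solver. REPO maps (name, version) -> dependencies,
# where each dependency names a package and the versions that satisfy it.
REPO = {
    ("app", 2): {"libfoo": [2]},     # app 2 needs exactly libfoo 2
    ("app", 1): {"libfoo": [1, 2]},  # app 1 works with either libfoo
    ("libfoo", 2): {"libbar": [2]},
    ("libfoo", 1): {},
    ("libbar", 2): {},
}

def versions(name):
    """All available versions of a package, newest first."""
    return sorted((v for (n, v) in REPO if n == name), reverse=True)

def resolve(wanted, chosen=None):
    """Pick one version per package so every constraint is satisfied.

    `wanted` is a list of (name, allowed_versions) constraints.
    Returns {name: version} on success, or None if no consistent set exists.
    """
    if chosen is None:
        chosen = {}
    if not wanted:
        return chosen
    name, allowed = wanted[0]
    if name in chosen:
        # Already decided: the earlier choice must satisfy this constraint too.
        return resolve(wanted[1:], chosen) if chosen[name] in allowed else None
    for v in versions(name):            # prefer the newest version
        if v not in allowed:
            continue
        deps = list(REPO[(name, v)].items())
        result = resolve(wanted[1:] + deps, {**chosen, name: v})
        if result is not None:
            return result               # first consistent solution wins
    return None                         # backtrack: no version of `name` works

print(resolve([("app", [1, 2])]))  # → {'app': 2, 'libfoo': 2, 'libbar': 2}
```

A real solver like Smart’s additionally scores candidate solutions (to prefer upgrades over downgrades, fewer changes over more, and so on), but the backtrack-on-conflict skeleton is the same.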
2006-07-14 1:09 pm MonkeyPie
It IS modular. It is not necessary to install the SMARTGUI at all and other interfaces can be created to be used with the SMART command-line tools. The SMARTGUI just runs as a front-end to the command-line tools.
Why not offer a GUI for their command-line tool? What’s silly about this? They created an awesome tool, originally designed as a CLI, and then created a GUI for those who don’t know or understand the CLI. It’s not like they locked anybody out of creating their own front-end.
2006-07-14 3:08 pm kwanbis
we are in 2006 … Command Line Tool? Why?
2006-07-14 3:20 pm Sphinx
For the same stupid reason you turn off your car engine before rebuilding it?
2006-07-14 3:23 pm el3ktro
Have you ever heard of remote administration? Or scripting? Even MS realized this and promotes Win2k server with its command-line & scripting abilities.
2006-07-15 3:38 pm Sphinx
While this article has absolutely nothing to do with MS or Win2k or Windows of any version, remote administration is a UNIX invention, thank you. It only took MS twenty years to finally steal the notion, and they haven’t even come close to getting it right, even in their most recent attempts. If you think they have, you’re just not trying it over a slow enough connection – try dialing in. I managed a UNIX network of 166 servers scattered over 43 states, with only four on 16k frame relay and the rest on Hayes 9600 baud modems, with nothing more than UUCP and the basic network utilities, so it’s a pretty safe bet I’ve written more scripts than you will ever read.
2006-07-16 8:30 am Cloudy
remote administration is a UNIX invention, thank you.
Um, no, but thanks for playing. IBM had remotely administrable systems in the early 70s, as did CDC. I’m not sure what the first remotely administrable system was, but it wouldn’t surprise me if it were SAGE.
2006-07-15 1:15 am troc
CLI, because it is extremely flexible yet simple. These seemingly opposing characteristics come together at the command-line interface.
Anyone know if/when Smart will support SLED 10?
2006-07-14 2:37 pm czubin
Just install the smart package from openSUSE on SLED; it also gives you extra openSUSE repositories for crucial software missing from SLED (like Eclipse, Apache, etc.).
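For anyone trying this, a session sketch of Smart’s channel mechanism might help. The channel alias and URL below are placeholders, not a real openSUSE repository address; substitute the repository you actually want:

```shell
# Register an extra repository as a Smart "channel" (alias and baseurl
# are placeholders -- substitute a real openSUSE repository for SLED).
smart channel --add extras type=rpm-md baseurl=http://example.com/repo/10.1/
smart update            # refresh metadata for all configured channels
smart install eclipse   # resolve and install from any configured channel
```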
Honestly, I tried it out on SuSE 10.1 and would just rather use APT… Sure there are dependency problems once in a great while, but software installs would be too easy without that 30 minutes of looking for that one lib, lol
Pardon my ignorance, but I’m fairly sure a modular package manager like this will lead to lots and lots of broken systems. Installing packages meant for different distros will simply be too tempting, and most users don’t realise just how much Linux binaries depend on their libraries being built in the same environment. In addition, a package involving any kind of distro-specific scripts is not going to work on other systems.
2006-07-15 4:01 am Celerate
There is the possibility of implementing a filter whereby the package manager is configured to accept only packages built for a specific distribution. If need be, I’m sure it could also be made to work only with a specific version. In fact, this is already possible as a trick with the RPM and DEB package managers.
Deb and RPM packages allow the packager to specify required prerequisites, as everyone who’s used them ought to know. What can be done is building a package that carries the name and version number of the distribution, and then all other packages could have that as a dependency. I think it’s actually a pretty good idea, and to make the empty package more meaningful it could contain a script, with a common name across all distributions, that prints out the full name and version of the distribution.
Of course that won’t stop a distribution that implements the idea from installing a package from another distribution that doesn’t; but if the majority of packagers adopt the idea, then there will be fewer packages that install on an incompatible distribution without being forced.
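This trick can be sketched concretely. The fragment below is a hypothetical Debian-style control file excerpt (all package names and version numbers are invented): an empty `mydistro-release` package carries the distribution’s identity, and ordinary packages declare a versioned dependency on it, so a package built for another distribution fails its dependency check instead of silently installing.

```
Package: mydistro-release
Version: 1.0
Architecture: all
Description: Empty package identifying this system as MyDistro 1.0

Package: some-app
Version: 2.3
Depends: mydistro-release (= 1.0), libfoo1 (>= 1.2)
Description: An ordinary package, installable only on MyDistro 1.0
```

This is essentially how many distributions already ship a `*-release` package; the new part of the suggestion is making every other package depend on it by a fixed version.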
I just tried this smartpm on debian etch and it doesn’t work.
I can smart update, but when I smart upgrade it eats 100% of my CPU while doing the transaction check. I let it go for 15 minutes, just in case, and then killed it.
I tried a couple of times. I guess apt is still the tried, tested and true package manager for debian.
is pretty dumb.
One of the reasons why we see so little real progress from the linux/oss crew is that no one takes the time to fix existing systems, since they’d much rather reinvent the wheel than work with a legacy code base.
Why fix it when you can write your own version that’s broken in a different way?
> Installing or upgrading packages is not always a simple
> task, and it is not uncommon for Debian’s APT or
> Fedora’s yum to err in their attempts, causing broken
> packages or unmet dependencies. While Smart is not
> infallible, it is designed to overcome such issues and
> deliver to users the best possible systems, even if the
> most obvious route cannot be taken.
Why invent something like Smart, then? Why hide the problem instead of fixing it? If a package manager has to *guess* dependencies, as described in the article, then something is seriously broken.
Package management is not simple? Then make it simple.
Ironically, it looks like the Smart Package Manager got a big boost from the mess that happened with SUSE 10.1 and its package manager. So many SUSE users simply abandoned SUSE’s YaST and switched to Smart that it has probably seen a substantial increase in its user base because of this. On my SUSE PC Smart works great; it’s fast and seems to return far fewer error messages than YaST. SUSE seems to have fixed most of the problems with YaST, but Smart is still the preferred package manager for many. If Smart manages to become the default choice across the other main distros as well, this unity would probably be good for Linux as a whole.
What smart needs is a good, clean GUI. The current one (or at least the one that shipped with suse 10.1), was simply unusable for me, and I’ve used most package managers.
I think that’s currently the only “major” problem with Smart. But if they make a good GUI, with updater applets for KDE and GNOME, they have a very good chance of being THE package manager for most distros, and that could be a very good thing (TM).