Linked by Adam S on Wed 13th Jul 2005 12:56 UTC
In the News: The GNU Source Installer is "a graphical tool which provides configuration, compilation, installation, tracking and removal of source packages." The authors have written about their project on Newsforge.
You have to respect the author
by smitty_one_each on Wed 13th Jul 2005 13:13 UTC

From the documentation:
2.2 Why should you avoid this program?

The following is a critique of this tool, showing what you lose, or do not gain, compared with relying on the good old command line:

you can lose time.
sourceinstall basically installs twice. However, the install step is generally not very time-consuming, especially compared with the compile step. The program makes a test installation first (if DESTDIR is supported) to run a final check and gather useful information, and this adds to the total installation time. (A sketch of such a staged install follows this list.)
it is not scriptable.
sourceinstall can be called with filename parameters, but dialogs can pop up during installation if decisions have to be made. This is a tool designed for interactive use, so if total automation is your goal, sourceinstall is not for you.
some packages might not work.
sourceinstall has to make some generalizations and will not be able to install difficult packages. An experienced user or developer can quickly go through a broken or sketchy Makefile and fix things for his system, but sourceinstall cannot. On the other hand, if enough people use this tool, it could further drive developers towards the autotools and towards creating better packages in general.
no package-level dependency tracking, no repository, nothing at all.
If your package fails during configuration, for example, you still have to look at that damned error message in the pseudo-console and act accordingly (generally this involves hunting down the missing file/package).
people who install everything from source might not give a damn.
After widespread adoption, however, even less experienced users could approach source packages (and hopefully become more experienced with time).
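
For reference, the "test installation" mentioned above is just a staged install into a scratch directory, which any autotools package that honours DESTDIR supports. A minimal sketch of the idea (the /tmp path is only an example):

./configure --prefix=/usr
make
make install DESTDIR=/tmp/stage   # trial run: files land under /tmp/stage/usr/... instead of /usr/...
find /tmp/stage -type f           # the complete file list, recorded before the real install
make install                      # the second, real installation into /usr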


---------------------------
My big question is, if I love Gentoo, how does this improve on emerge?

Reply Score: 1

RE: You have to respect the author
by JrezIN on Wed 13th Jul 2005 13:25 UTC in reply to "You have to respect the author"

Looks like if you use Gentoo, it's not so useful for you.
But this tool may be useful to lots of other people, mostly the ones learning how to install from source instead of from distro packages, and learning how the build tools work... It's more useful for them than an automated process (which they could learn to use in the future with the standard build tools). It's a step-by-step tool that will help people learn and not feel so lost in the beginning. I'm sure that lots of slackers, mostly beginners, will like it.

Reply Score: 1

It's good, but...
by Budd on Wed 13th Jul 2005 13:20 UTC

As a Slackware/DLG user I use http://gnome-pkgtool.sourceforge.net/ and I don't have any problems. That's quite obvious, since I mostly use .tgz packages (pkgtool). Of course, I can use pkgtool itself, but nowadays I rarely go to the CLI. Maybe when I want to impress my neighbors with my skills. Which, frankly, has never worked.

Reply Score: 1

Anonymous
Member since:
---

Why can't a standard set of base libraries STILL be achieved? That would mean standard, cross-distro GNU/Linux binaries. Although it's a bit late (21st century here).

Installing from source is just not an option for the desktop user, so a frontend to the autotools is something totally useless IMHO. And when I want to compile something I go with checkinstall, btw.

Reply Score: 1

orestes Member since:
2005-07-06

The problem is that, for all intents and purposes, Linux doesn't exist as an OS. Each distro has its own self-contained control structure and its own distinct goals.

Reply Score: 1

orestes Member since:
2005-07-06

It occurs to me that perhaps I need to elaborate more fully.

Each distro is operating as a more or less independent OS, drawing from a common pool of open source software and modifying it to suit its particular goals. These modifications often contradict each other. Trying to merge them into a single, universally compatible tree would be a herculean task.

Reply Score: 1

dukeinlondon Member since:
2005-07-06

"Why can't a standard set of base libraries STILL be achieved? That would mean standard, cross-distro GNU/Linux binaries. Although it's a bit late (21st century here)."

Very strange indeed. I am kinda losing patience myself...

Reply Score: 2

binarycrusader Member since:
2005-07-06

"Why can't a standard set of base libraries STILL be achieved? That would mean standard, cross-distro GNU/Linux binaries. Although it's a bit late (21st century here)."

There is a standard set of base libraries; it's just that not all distributions choose to use them.

Additionally, the standard set that does exist doesn't include anything GUI, because the libraries for GUI stuff aren't stable enough to be part of a base set.

Reply Score: 1

Anonymous Member since:
---

Yes, indeed. Cross-distro binaries are more badly needed than graphical source package installers. Closed-source deployment on Linux is a major hassle. A look at the Opera for Linux download page demonstrates the insanity.

Source packages usually already work; "tar", "configure" and "make" are about as cross-distro as it gets. While every new program is a worthwhile addition, I don't really see how a graphical *source* installer would move the Linux-for-the-masses project forward.

Reply Score: 0

ma_d Member since:
2005-06-29

The reason Opera's download page for Linux is "confusing" is that Linux users are pickier: if you sell a Windows user software with all your libs statically linked, they don't care or even notice. If you do the same to many Linux users, they complain and ask for a build without the libraries, because they already have those installed.

It's always always always going to be easy to statically link your libraries into your program for distribution on Unix, OS X, Windows, and any other operating system with good linking utilities.
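
For the curious, this is roughly what the difference looks like with gcc. A minimal sketch, assuming a hypothetical viewer.c that uses libpng, and that static archives of the libraries are installed:

gcc -o viewer viewer.c -lpng                   # dynamic: the binary loads the system's shared libpng at run time
gcc -static -o viewer viewer.c -lpng -lz -lm   # static: libpng and its dependencies are copied into the binary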

Free software never has put up with these bad distribution methods, and it never should. We buy our RAM for data, not for 18 copies of the PNG decoding routines.

Reply Score: 1

Budd Member since:
2005-07-08

"Why can't a standard set of base libraries STILL be achieved? That would mean standard, cross-distro GNU/Linux binaries. Although it's a bit late (21st century here)."

And how do you want to do that? I find it much simpler to just do: ./configure && make && make install

"Installing from source is just not an option for the desktop user, so a frontend to the autotools is something totally useless IMHO. And when I want to compile something I go with checkinstall, btw."

This is very true, but the question is: how much MUST the desktop user know? Does he/she have to install new software? Because if so, then we are talking about users accustomed to the OS in question. For example my sister: she uses Windows 100% of the time, yet she would have no problem USING a Linux distro. When it comes to installing in Windows I just tell her to double-click on that (or unzip that) and accept the defaults ;)
I am sure she would have quite some problems installing FROM SOURCE on a Linux distro. That is why software like the one mentioned here makes sense. It's not about cross-distro binaries. You either know how to do it or you don't. It may be harsh, but users must learn as well.

Reply Score: 1

Anonymous Member since:
---

Why, you ask? Very easy. I want the latest, my neighbour wants the tried and true. Which one are you going to put into your "base set"?

OTOH, cross-distro binaries do exist already. You don't download Oracle for Suse and Oracle for RH. You just download Oracle for Linux/x86.

For fairly self-contained binary packages that depend only on the C and C++ libraries, it is possible right now.

For open-source packages which depend on a lot of libraries, it is better if your distro maintainers compile them and make sure they play nicely with other packages.

I, for instance, am satisfied with how I install software on my Linux box.

Reply Score: 0

As long as they keep it simple
by Kick The Donkey on Wed 13th Jul 2005 13:51 UTC

Don't try to engineer the kitchen sink here. Keep it simple, with few options. If someone is an 'experienced' Linux user (and needs more options/control), they'll probably drop to the CLI anyway.

With that said, I completely agree with the sentiment here: "On the other hand, if enough people use this tool, it could further drive developers towards the autotools and towards creating better packages in general."

Reply Score: 1

no thanks, please no
by l3v1 on Wed 13th Jul 2005 14:37 UTC

I don't need a place-hogging, slow, bulky, bloated graphical installer, thankyouverymuch. All I need is the speed and reliability of a dpkg-like package management system, perhaps with an easy-to-use graphical frontend for those who can't live without a graphical everything. While newly minted newbies might be made happier by such installers, I really couldn't care much less about them, and I stand firmly on the side of not making them universal/ubiquitous.

Reply Score: 1

Sourcenix - my theoretical system
by aaronb on Wed 13th Jul 2005 14:44 UTC

Hi there,

Some very quick thoughts on my made-up system:

A new system would set up:
/etc/PKG/source_applications -- for apps installed from source
/etc/PKG/binary_applications -- for apps not installed from source
/etc/PKG/scanned_applications -- for apps and libs that do not conform

Note that a database is not kept in any of these folders; instead, each app places a conf file there that states which files have been installed, what the app requires to run, and the name of the app.

I could launch my Source Install GUI and it would ask me if I would like to remove or add an app. All I need to do is select the app I have downloaded in source form, like KDE.src.bz2, and it will un-bz2 it and run the configure script. If there are dependencies that are needed, it will offer the following...

After scanning the system, this app requires the following packages to compile and run correctly. Would you like to...
1. Abort this operation
2. Find the packages and install them along with the app you have selected (Recommended)
3. Ignore all possible errors and try to install it anyway (Not recommended)

And if it could not find the deps, it would say...

Unfortunately, not all the deps could be found. The following options are available.
1. Abort this operation
2. Abort this operation and send a report of the deps that were not found,
so we can investigate and keep this from happening again (Recommended)
3. Ignore all possible errors and try to install it anyway (Not recommended)


Once KDE is installed, it would place a conf file, named in this case KDE_3.4.1_installed, in /etc/PKG/source_applications.
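
Such a conf file might look something like this; the format and field names are entirely made up, just to illustrate the idea:

cat > /etc/PKG/source_applications/KDE_3.4.1_installed << 'EOF'
name=KDE
version=3.4.1
requires=qt libpng libjpeg
file=/usr/bin/konqueror
file=/usr/lib/libkdecore.so.4
EOF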

If I then wanted to remove KDE, I would go to the Source Install GUI and pick remove. It would scan the three folders to see whether that would cause a problem, and give you three options if it does...

Removing KDE will cause the following apps to stop working. Would you like to...
1. Abort this operation
2. Remove KDE and all other apps dependent on it (Recommended)
3. Remove it and ignore the possible meltdown (Not recommended)


For developers who hate to work together, the Source Install GUI would scan the system and place entries in /etc/PKG/scanned_applications. This would not be 100% accurate, but it's better than nothing.

We could go to the OSNews forums, design our system, write a tool to do this, create a website, and spread the word.

Binary installers would place an entry in /etc/PKG/binary_applications to make themselves known, or be scanned by the tool and placed in /etc/PKG/scanned_applications.

Just my 0.01

Reply Score: 2

What's so hard about...
by Anonymous on Wed 13th Jul 2005 15:24 UTC

./configure --prefix=/usr
make
make install

Reply Score: 0

RE: Anonymous
by Anonymous on Wed 13th Jul 2005 16:03 UTC

"whats so hard about...
By Anonymous (IP: 24.117.27.---) on 2005-07-13 03:24:29 PM UTC
./configure --prefix=/usr
make
make install
"

Er, try doing that with something big that has lots of dependencies. Which in turn have other dependencies. And take hours and hours. And don't result in something easily transferable to another system. And sometimes can't be uninstalled easily.

If ANYONE EVER thinks spending hours downloading and building many different source tarballs just to INSTALL A PIECE OF SOFTWARE is "easy", no wonder Linux has a totally minuscule share of the desktop market.

Reply Score: 0

RE[2]: Anonymous
by ma_d on Wed 13th Jul 2005 17:03 UTC in reply to "RE: Anonymous"

They don't. And that's why most distributions have a package distribution repository and a frontend to do dependency tracking. However, some people rather enjoy playing with the latest development software, or playing with a new project that distributions aren't going to distribute yet (for obvious reasons).

So they'd possibly like a nifty tool like this (thanks to the author by the way, it looks pretty cool).

The idea here is: say I write a program to display images; I'll call it "doggy-image 1.0." Of course it's not stable, but I like big version numbers. Now I distribute just the source, because hey, it's not stable anyway and I need to be developing, not packaging, right now. My program uses imlib2 (the latest image library from Enlightenment).
http://archlinux.org/packages.php?id=4343
Now, as you can see, imlib2 has a lot of dependencies. But since several distributions, maybe even my own, have it, I can just install the library and then build the package.

Very few people actually track down and build everything from source. Even Slackware users have linuxpackages.net, swaret, slapt-get and who knows what other utilities to choose from.

Reply Score: 1

Anonymous
Member since:
---

What nonsense. Paco already exists and is better. Check out the paco homepage here:

http://paco.sourceforge.net/

Why is paco better? Well, for one thing, the interface of gpaco (the GUI tool) is better. For another, paco uses the LD_PRELOAD method, which means the "double-install" problem doesn't happen: paco's library replaces filesystem syscalls with equivalent versions that let paco track which files go where during installation.
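
For anyone unfamiliar with the trick, here is the general shape of an LD_PRELOAD-based install tracker. This is a minimal sketch of the mechanism, not paco's actual code or interface (all file names are made up), and a real tracker interposes many more calls (openat, creat, rename, and so on):

# Build a tiny shared library that interposes open() and logs created/written files.
cat > trace_open.c << 'EOF'
#define _GNU_SOURCE
#include <stdio.h>
#include <stdarg.h>
#include <fcntl.h>
#include <dlfcn.h>

typedef int (*open_fn)(const char *, int, ...);

/* Log paths opened for writing or creation, then hand off to the real open(). */
int open(const char *path, int flags, ...)
{
    static open_fn real_open = NULL;
    mode_t mode = 0;

    if (!real_open)
        real_open = (open_fn) dlsym(RTLD_NEXT, "open");

    if (flags & O_CREAT) {            /* a mode argument is only present with O_CREAT */
        va_list ap;
        va_start(ap, flags);
        mode = (mode_t) va_arg(ap, int);
        va_end(ap);
    }
    if (flags & (O_CREAT | O_WRONLY | O_RDWR))
        fprintf(stderr, "TRACK %s\n", path);

    return real_open(path, flags, mode);
}
EOF
gcc -shared -fPIC -o trace_open.so trace_open.c -ldl

# Preload it during the install, then pull the recorded file list out of the log.
LD_PRELOAD=$PWD/trace_open.so make install 2> install.log
grep '^TRACK ' install.log | sort -u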

I just installed paco a few days ago, and wish I had earlier. I'm probably going to "reinstall" some of my source packages just to make sure paco keeps track of them. It's by far the best way to install source packages on your machine and be able to keep track of them, their versions, and have a means for uninstalling them.

Reply Score: 0

Autopackage aims to solve these problems.
by rm6990 on Wed 13th Jul 2005 17:54 UTC

I gave autopackage a shot, and it aims to solve these problems. I downloaded a .package file, double-clicked it, autopackage pretty much installed itself (I had to hit enter a few times), and then the terminal closed. Now that autopackage was installed, I double-clicked on the package again, and 30 seconds later it was installed.

Great for programs like Gaim that can be difficult to compile properly. Worked like a charm. It is also incredibly easy to uninstall stuff.

Also, if you are worried about autopackage programs conflicting with packages from your distro, you can make it install everything into your home directory.

Reply Score: 3

ma_d Member since:
2005-06-29

Yes, but autopackage asks developers to leave their configure script and Makefile crutch behind and make their application fully relocatable. That isn't always as simple as learning to build autopackages.

I'm not saying autopackage isn't cool, but it's not trying to solve the same problem. It's meant for mature applications that have packagers, more than for, well, everything else. Just about everything these days starts with the autotools.

Reply Score: 1

Checkinstall
by Anonymous on Wed 13th Jul 2005 22:53 UTC

I prefer the checkinstall approach:

http://asic-linux.com.mx/~izto/checkinstall/

You only have to do:

./configure
make
checkinstall (instead of make install)

Reply Score: 0