Linked by Thom Holwerda on Thu 5th Nov 2009 23:05 UTC
Linux As we all know, Mac OS X has support for what is called 'fat binaries'. These are binaries that can carry code for multiple architectures - in the case of the Mac, PowerPC and x86. Ryan Gordon was working on an implementation of fat binaries for Linux - but due to the conduct of the Linux maintainers, Gordon has halted the effort.
Thread beginning with comment 393243
RE[6]: Always On the Cards
by segedunum on Fri 6th Nov 2009 14:19 UTC in reply to "RE[5]: Always On the Cards"
segedunum Member since:
2005-07-06

"Complete nonsense. You still need to compile all that stuff you put in the binary, how is that gonna help you?"

That's because you have no idea what the problem actually is, as most people or even developers fannying about on forums like this don't.

The problem is not compilation and I don't know why various idiots around here keep repeating that. It never has been. The cost in time, resources and money has always been in the actual deployment. Packaging for a specific environment, testing it and supporting it for its lifetime is a damn big commitment. If you're not sure what is going to happen once it's deployed then you're not going to do it.

"The only practical solution is static linking, which is what most should really do."

You lose all the obvious benefits of any kind of package, architecture or installation management system, which ISVs effectively have to start writing themselves, at least in part. We're no further forward than what Loki had to cobble together years ago, and for vendors whose business does not depend on Linux it is something they will never do. Why would they, when other more popular platforms provide what they want?

In addition, it's never entirely clear what it is that you need to statically link and include in your package. You might detect installed system packages manually, dynamically load them, and then fall back to whatever you have bundled statically with your package, but the potential for divergence there from a support point of view should be very obvious.
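To make that fallback dance concrete, here is a minimal C sketch of "prefer the system copy, fall back to the bundled copy". The library name (libfoo), the bundled path and the frobnicate symbol are all made up for illustration; a real application would check far more than this.

/* Minimal sketch of "prefer the system library, fall back to the bundled
 * copy". The library name, the bundled path and the frobnicate symbol are
 * hypothetical; link with -ldl on older glibc. */
#include <dlfcn.h>
#include <stdio.h>

typedef int (*frobnicate_fn)(int);

static void *load_libfoo(void)
{
    /* Try whatever the distribution ships first... */
    void *handle = dlopen("libfoo.so.1", RTLD_NOW);
    if (handle)
        return handle;

    /* ...then fall back to the copy bundled with the application. */
    handle = dlopen("/opt/exampleapp/lib/libfoo.so.1", RTLD_NOW);
    if (!handle)
        fprintf(stderr, "libfoo unavailable: %s\n", dlerror());
    return handle;
}

int main(void)
{
    void *handle = load_libfoo();
    if (!handle)
        return 1;

    frobnicate_fn frobnicate = (frobnicate_fn)dlsym(handle, "frobnicate");
    if (frobnicate)
        printf("frobnicate(42) = %d\n", frobnicate(42));

    dlclose(handle);
    return 0;
}

Whichever copy you end up loading is exactly the divergence that makes support painful: the same package behaves differently depending on what the distribution happens to ship.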

"For example, you can download and install Opera qt4 statically compiled and it will work regardless of distribution. The same with skype, and some other closed software. So it's not impossible to do it, and yes, you need testing."

Hmmmm. I thought you were complaining about the disk space that FatELF would consume at some point.........

Anyway, just because some can do it, that doesn't make it any less crap. It is hardly the road to the automated installation approach that is required.

Reply Parent Score: 2

RE[7]: Always On the Cards
by sbenitezb on Fri 6th Nov 2009 15:13 in reply to "RE[6]: Always On the Cards"
sbenitezb Member since:
2005-07-22

"The problem is not compilation and I don't know why various idiots around here keep repeating that. It never has been."


Oh no, sure it isn't. You can compile for any architecture and every set of libraries for every single distro out there within your own Ubuntu distro with just one click... oh wait...

"The cost in time, resources and money has always been in the actual deployment. Packaging for a specific environment, testing it and supporting it for its lifetime is a damn big commitment. If you're not sure what is going to happen once it's deployed then you're not going to do it."


Of course. Not only with Linux, but also with Windows and OS X and their different versions. On Linux it is even more difficult because you don't know what libraries are available, in which versions, etc.

"The only practical solution is static linking, which is what most should really do.

You lose all the obvious benefits of any kind of package, architecture or installation management system, which ISVs effectively have to start writing themselves, at least in part. We're no further forward than what Loki had to cobble together years ago, and for vendors whose business does not depend on Linux it is something they will never do. Why would they, when other more popular platforms provide what they want?
"

There is another option: providing your own .so files in the same package, as a catch-all solution for the less common distros.
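A minimal sketch of how that catch-all can look, assuming a layout like /opt/exampleapp/ with the real binary as exampleapp.bin and the bundled libraries under lib/ (all names hypothetical): a tiny C launcher that points the dynamic loader at the bundled directory and then re-executes the real program, since LD_LIBRARY_PATH has to be set before the loader resolves anything.

/* Minimal launcher sketch for the "ship your own .so files" approach.
 * Layout and names (/opt/exampleapp, lib/, exampleapp.bin) are hypothetical.
 * The launcher prepends the bundled library directory to LD_LIBRARY_PATH
 * and re-executes the real binary, because the dynamic loader reads that
 * variable before main() ever runs. */
#include <libgen.h>
#include <limits.h>
#include <stdio.h>
#include <stdlib.h>
#include <unistd.h>

int main(int argc, char **argv)
{
    (void)argc;

    char self[PATH_MAX];
    ssize_t n = readlink("/proc/self/exe", self, sizeof(self) - 1);
    if (n < 0) {
        perror("readlink");
        return 1;
    }
    self[n] = '\0';

    char *dir = dirname(self);                    /* e.g. /opt/exampleapp */

    char libdir[PATH_MAX], real[PATH_MAX], path[2 * PATH_MAX];
    snprintf(libdir, sizeof(libdir), "%s/lib", dir);           /* bundled .so */
    snprintf(real, sizeof(real), "%s/exampleapp.bin", dir);    /* real binary */

    const char *old = getenv("LD_LIBRARY_PATH");
    snprintf(path, sizeof(path), "%s%s%s",
             libdir, old ? ":" : "", old ? old : "");
    setenv("LD_LIBRARY_PATH", path, 1);

    argv[0] = real;
    execv(real, argv);                            /* returns only on failure */
    perror("execv");
    return 1;
}

The same effect can be had without a launcher by linking the real binary with an rpath of $ORIGIN/lib, which also keeps the bundled copies from leaking into child processes.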

"In addition, it's never entirely clear what it is that you need to statically link and include in your package. You might detect installed system packages manually, dynamically load them, and then fall back to whatever you have bundled statically with your package, but the potential for divergence there from a support point of view should be very obvious."


How about statically compiling those rare libraries the app may be using?

"For example, you can download and install Opera qt4 statically compiled and it will work regardless of distribution. The same with skype, and some other closed software. So it's not impossible to do it, and yes, you need testing.

Hmmmm. I thought you were complaining about the disk space that FatELF would consume at some point.........
"

That's a totally different topic. FatELF for every installed binary is not the same as installing one or two closed source applications that are statically compiled and may only add a couple more megabytes to your install.

"Anyway, just because some can do it, that doesn't make it any less crap. It is hardly the road to the automated installation approach that is required."


There is no automatic installation for non-homogeneous systems. This is not an Apple-developed OS. The heterogeneity of Linux systems makes things difficult. There's no need to make them even harder by implementing cruft that doesn't solve the problem at hand, the problem at hand being that every distro behaves differently.

Reply Parent Score: 2

RE[7]: Always On the Cards
by vivainio on Fri 6th Nov 2009 18:00 in reply to "RE[6]: Always On the Cards"
vivainio Member since:
2008-12-26

"The cost in time, resources and money has always been in the actual deployment. Packaging for a specific environment, testing it and supporting it for its lifetime is a damn big commitment. If you're not sure what is going to happen once it's deployed then you're not going to do it."


I don't see how fat binaries would solve any of this (testing, support, ...).

Reply Parent Score: 5

RE[8]: Always On the Cards
by segedunum on Mon 9th Nov 2009 18:41 in reply to "RE[7]: Always On the Cards"
segedunum Member since:
2005-07-06

"I don't see how fat binaries would solve any of this (testing, support, ...)."

Because they support one installation platform that has wide distribution support. It's an absolute no-brainer. They're not writing their own installation scripts, nor are they unsure about what their dependencies are when they're troubleshooting an issue.

Reply Parent Score: 2