Linked by Thom Holwerda on Tue 26th Feb 2008 20:59 UTC, submitted by Oliver
FreeBSD "FreeBSD is back to its incredible performance and now can take advantage of multi-core/CPUs systems very well... So well that some benchmarks on both Intel and AMD systems showed release 7.0 being faster than Linux 2.6 when running PostreSQL or MySQL. Federico Biancuzzi interviewed two dozen developers to discuss all the cool details of FreeBSD 7.0: networking and SMP performance, SCTP support, the new IPSEC stack, virtualization, monitoring frameworks, ports, storage limits and a new journaling facility, what changed in the accounting file format, jemalloc(), ULE, and more."
Thread beginning with comment 302627
Binary system/package capabilities...
by dindin on Wed 27th Feb 2008 15:52 UTC
dindin
Member since:
2006-03-29

Has the binary package management system changed that much?

1) Can the system be updated/upgraded with binary-only packages, like "apt-get dist-upgrade"?

2) Can applications be upgraded with binary-only packages?

The last time I used FreeBSD, the binary package system and the availability of binary packages was the deal-breaker for me. I would install a binary package, but I could not upgrade via binary packages because updated packages were simply unavailable.

FreeBSD needs to ensure that binary packages are treated as first-class citizens alongside source ports, or else this is going to be an ongoing issue.

Reply Score: 2

tankist Member since:
2007-01-19

You can point the PACKAGESITE variable to the stable (instead of release) package repository and upgrade all packages by running

portupgrade -aPP

The ports tree will still contain newer stuff, though.
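
For reference, a minimal sketch of that setup (the FTP path, architecture, and csh syntax below are assumptions; check your mirror and adjust for your shell and release):

# Assumed package repository layout for 7-STABLE on i386; the exact path may differ per mirror.
setenv PACKAGESITE ftp://ftp.freebsd.org/pub/FreeBSD/ports/i386/packages-7-stable/Latest/
# -a upgrades all installed packages; -PP forces binary packages only (never build from source).
portupgrade -aPP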

Reply Parent Score: 1

Oliver Member since:
2006-07-15

>FreeBSD needs to ensure that binary packages are treated as first-class citizens alongside source ports, or else this is going to be an ongoing issue.

Why - because lower quality is better? C'mon, it's of course nice to have something like apt-get/aptitude in Debian, but in the end you have to live with sometimes broken packages or packages compiled without essential features ("mature" maintainer). Therefore people like the source code as the base, and compiling from source code isn't anything alien to a UNIX or free UNIX derivative.

Reply Parent Score: 2

Doc Pain Member since:
2006-10-08

Therefore people like the source code as the base, and compiling from source code isn't anything alien to a UNIX or free UNIX derivative.


One other reason for this is that certain programs need to be compiled locally if options are to be set at compile time. For example, in order to put all the features and codecs you want into mplayer, you usually edit Makefile.local and add your stuff. There is no binary package for every imaginable combination of make options (e.g. WITH_SDL, WITH_VORBIS, WITHOUT_RUNTIME_CPUDETECTION).

The same reasoning applies if you need to install software on lower-end hardware (e.g. CFLAGS+= -O3 -pipe -mfpmath=sse -ffast-math). Often, compile-time options give you the "speed boost" you need to make the application usable.

Another important reason is getting hardware to work the way you want when the developers consider your hardware "nonstandard" or the behaviour you expect "not usual". For example, supporting a three-button mouse in X so that middle button + vertical movement acts as wheel emulation, but a plain middle-button click stays a middle-button click without entering wheel emulation mode, requires patching mouse.c from X.org and recompiling.
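
As a rough sketch of how that looks in practice (the option knobs and port path below are illustrative examples, not the exact set accepted by any particular mplayer port revision):

# Illustrative only: build multimedia/mplayer from ports with custom knobs and CFLAGS.
cd /usr/ports/multimedia/mplayer
make WITH_SDL=yes WITH_VORBIS=yes WITHOUT_RUNTIME_CPUDETECTION=yes \
    CFLAGS="-O3 -pipe -mfpmath=sse -ffast-math" install clean

The point being that a prebuilt package necessarily picks one fixed set of these knobs for everybody.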

Reply Parent Score: 2

dagw Member since:
2005-07-06

Why - because lower quality is better?


*BSD and Gentoo people keep trying to claim that compiling from source on the machine that will run the binary is somehow 'better', yet I've never seen any solid argument as to why this should be so.

Therefore people like the source code as the base, and compiling from source code isn't anything alien to a UNIX or free UNIX derivative.


Compiling from source takes a lot of CPU cycles and memory. Doing so on a heavily loaded server is generally a bad idea. Being able to update your server without having it crunch away for hours is probably something you want.
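
A common middle ground at the time (a sketch, assuming a separate build host of the same architecture and release; the port and package names below are placeholders) was to compile on a build box and install only the resulting binary package on the server:

# On the build machine: compile the port and produce a binary package (placeholder port name).
cd /usr/ports/misc/someport && make package
# Copy the resulting someport-1.0.tbz to the server and install the prebuilt binary there:
pkg_add someport-1.0.tbz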

Reply Parent Score: 2

dindin Member since:
2006-03-29

> "Why - because lower quality is better?"

How can you say this is lower quality? Applications on Linux/Windows/OS X are not source based, yet somehow they seem to have made their users productive without having to wait 8 hours for the damn OpenOffice port to compile. Even if I compile from source, it's not like I am going to get 50% better performance; even if I set all the options before compiling, I might get a 1-3% improvement. I'll take the binary package any day for that.

> "C'mon it's of course nice to have something like apt-get/aptitude in Debian, but in the end you have to live with sometimes broken packages or packages compiled without essential features ("mature" maintainer). "

I am not sure which Linux distribution you have used, but I have never experienced any package issues with apt-get. For that matter, I have encountered umpteen broken ports. Ever kick off a compile, come back the next morning to find a compile error, and have to start all over again?

Don't even get me started on mixing ports and packages. I have had plenty of experience with the "package X requires version a.b.c of package Y, but version a.b.d is installed" kind of problem. Eventually you have to create symbolic links to get it to work.

> "Therefore people like the sourcecode as base and compiling from sourcecode isn't anything alien to a UNIX or free UNIX derivative."

Yes, I can see it clearly now. Ubuntu started about 10+ years after FreeBSD and garnered many times its number of users simply by catering to all those people who wanted a non-optimized, lower-quality system that would be slow as hell, short on features, and very unstable.

Reply Parent Score: 1