Linked by Thom Holwerda on Tue 20th Jun 2006 09:59 UTC, submitted by anonymous
Novell and Ximian
According to a Novell confidential memo dated June 14, Novell is delaying its next release of both the server and desktop versions of SUSE Linux Enterprise 10 "to address final issues with our new package management, registration, and update system and also fix the remaining blocker defects."
Good
by davidiwharper on Tue 20th Jun 2006 10:10 UTC
davidiwharper
Member since:
2006-01-01

They can race Microsoft to see who can delay the longest :-) (although MS has a pretty convincing lead)

Seriously, SUSE 10.1 and Ubuntu 6.06 were both rushed out the door, with less than stellar performances as a result. Better to get it right the first time than to have to release SP1 a month down the road.

Reply Score: 5

RE: Good
by Thom_Holwerda on Tue 20th Jun 2006 10:20 UTC in reply to "Good"
Thom_Holwerda Member since:
2005-06-29

Better to get it right the first time than to have to release SP1 a month down the road.

This is going to be an interesting thread. I'm really looking forward to all the excuses for why it's okay for a Linux company, but not for MS ;)

Reply Score: 1

RE[2]: Good
by Kroc on Tue 20th Jun 2006 10:37 UTC in reply to "RE: Good"
Kroc Member since:
2005-11-10

They don't try to spin webs around it. Novell have given us a reason, and one that isn't shrouded in mystery. In fact, you can go to their Bugzilla site and see why for yourself, firsthand.

And Ubuntu did the same thing only a while back. They shipped precisely on the new date, as promised.

And they didn't pull major features from a 'feature complete' beta product.

Reply Score: 5

RE[2]: Good
by Buffalo Soldier on Tue 20th Jun 2006 10:43 UTC in reply to "RE: Good"
Buffalo Soldier Member since:
2005-07-06

This is going to be an interesting thread. I'm really looking forward to all the excuses for why it's okay for a Linux company, but not for MS ;)

It's not okay for any company (FLOSS or even proprietary) to delay the release of a product MANY, MANY times.

It's understandable if they do it once or twice.

MS has delayed so many times, plus they are pulling out features. (Sorry, I don't have the exact duration. Does anyone remember how many years Vista has been delayed?)

Reply Score: 5

RE[3]: Good
by mym6 on Tue 20th Jun 2006 16:59 UTC in reply to "RE[2]: Good"
mym6 Member since:
2005-08-26

Depends on how you look at it. I came across a statement by someone at MS saying that after Windows 2000 was released they should/would follow a faster release cycle, much like OS X does now. If you go by that schedule, they're like... 5 years late.

Reply Score: 2

RE[3]: Good
by enloop on Tue 20th Jun 2006 23:49 UTC in reply to "RE[2]: Good"
enloop Member since:
2005-11-13

>>"It's not okay for any company (FLOSS or even proprietary) to delay a release..."

Why not? They don't have to tell us anything about their development process and their release plans. They owe us nothing. They'd be justified in keeping their mouths shut until the product was ready to ship.

Reply Score: 2

RE[2]: Good
by DevL on Tue 20th Jun 2006 11:02 UTC in reply to "RE: Good"
DevL Member since:
2005-07-06

It's all about the magnitude of the delay. Vista should have been out, what, three years ago? And in its present form it has had tons of features cut.

Not that I'm complaining, though. Anything that's bad for Microsoft is good for the competition, and hence for the computing world.

Reply Score: 5

RE[3]: Good
by Tom K on Tue 20th Jun 2006 17:07 UTC in reply to "RE[2]: Good"
Tom K Member since:
2005-07-06

Three years ago?

Nice attempt at a troll, but it sucks.

Reply Score: 1

RE[4]: Good
by dylansmrjones on Tue 20th Jun 2006 17:32 UTC in reply to "RE[3]: Good"
dylansmrjones Member since:
2005-10-02

Actually, the only troll here is you.

Vista has kept slipping. XP and Win2K3 were only released to fill the gap. And Vista _was_ aimed at 2003, but it slipped, and it slipped and it slipped and it slipped and it slipped and ... you get the picture, right?

Reply Score: 0

RE[6]: Good
by dylansmrjones on Tue 20th Jun 2006 17:57 UTC in reply to "RE[5]: Good"
dylansmrjones Member since:
2005-10-02

Linux market share appears to be above 4% (at least outside the USA).

Linux on the desktop is gaining on Windows year over year, so you lose, to use your "logic".

"Cracksmoking moron"? I might be a moron, but I don't smoke crack. Anyway, calling people names is quite offensive, wouldn't you say?

It may be that XP isn't so bad, but could it be that most people don't know better and just use whatever is preinstalled on the PC? And could it be that resellers have to pay a higher price for Windows if they don't preinstall it on the PCs they sell?

When Windows 98 was new it took quite a market share, and it wasn't because it was the best desktop OS at the time.

And what's all that lose-thing about? I didn't know this was a competition ;)

Reply Score: 3

RE[6]: Good
by segedunum on Tue 20th Jun 2006 18:00 UTC in reply to "RE[5]: Good"
segedunum Member since:
2005-07-06

No, you're a cracksmoking moron if you think that XP was only released to fill the gap.

There was going to be a major new version of Windows after Windows 2000 that was going to end up being Vista, or even Blackcomb (the version after Longhorn). We got XP and 2003 instead, and Microsoft was confident we would get Longhorn in 2003 or 2004. You and they lost.

Reply Score: 2

RE[6]: Good
by JMcCarthy on Tue 20th Jun 2006 18:24 UTC in reply to "RE[5]: Good"
JMcCarthy Member since:
2005-08-12

Lay off the kool-aid.

Longhorn was originally considered a stepping stone between XP & Blackcomb (Vienna). And it was originally given a release date of 2003.

Quit reinventing history :-)

TomK has been molested by Penguins too btw. Contact him if you need help.

Reply Score: 1

RE[8]: Good
by JMcCarthy on Tue 20th Jun 2006 21:36 UTC in reply to "RE[6]: Good"
JMcCarthy Member since:
2005-08-12

Oh please. You and the other dorks have been circle-jerking each other for so long, you wish you could have a little kool-aid.

But it's so funny. Some kid in his basement can put together a distro with a decent package management system, but Novell fails once again.

Novell doesn't even produce anything. all they do is repackage. There's no hope for desktop linux with losers like that leading the effort.


I didn't even comment on Novell, I just corrected your revisionist view on Vista's original release date / purpose.

Seek help.

Edited 2006-06-20 21:37

Reply Score: 1

RE[3]: Good
by slate on Tue 20th Jun 2006 17:55 UTC in reply to "RE[2]: Good"
slate Member since:
2006-04-04

Linux can't compete, so it needs all the delays from Microsoft it can get.

Reply Score: 1

RE[2]: Good
by Anonymous Penguin on Tue 20th Jun 2006 13:48 UTC in reply to "RE: Good"
Anonymous Penguin Member since:
2005-07-06

The main difference is, IMO, that MS has had 5 years to develop Vista, whilst SUSE releases twice a year.
Why is that? Because Vista is an attempt to square the circle, with its backwards compatibility (which is far from working all the time, BTW).

Reply Score: 1

RE[2]: Good
by abraxas on Tue 20th Jun 2006 23:35 UTC in reply to "RE: Good"
abraxas Member since:
2005-07-07

There is nothing wrong with delaying a product. There is something wrong with promising a product with X features, then delaying said product and stripping out features, then delaying again and stripping out more features, then delaying again and stripping out even more features. This is especially bad for a company as large as Microsoft, and with delays measured in years. I can deal with 6 weeks or even 6 months if that's what it takes to get a product up to standards, including all originally announced features. I'm having a hard time convincing myself that Vista is worth the wait after two of the biggest features (in my opinion) are no longer going to be included, those features being WinFS and Monad. At least it's good for competition, as many Linux distributions are now far outpacing Windows with features like XGL and Beagle.

Reply Score: 1

RE[2]: Good
by ralph on Wed 21st Jun 2006 08:17 UTC in reply to "RE: Good"
ralph Member since:
2005-07-10

*Sigh*

Thom, really, this is getting annoying, and it leads to me (and I'm quite sure this also applies to others) visiting this site, especially the discussion area, less and less.

I'm not even going to start to argue your point about MS and Novell. For anyone with half a brain it's plain to see that there's a huge difference between delaying a product for some weeks and the incredible trouble MS has had getting a new product out.

Be that as it may (and who cares, really), why is it that you, as one of the editors of this site, try to start a stupid and unnecessary flamewar? Aren't the usual stupid flamewars taking place here more than enough already?

Edited 2006-06-21 08:18

Reply Score: 0

RE[3]: Good
by Thom_Holwerda on Wed 21st Jun 2006 13:55 UTC in reply to "RE[2]: Good"
Thom_Holwerda Member since:
2005-06-29

Learn to use the internet. There's a SMILEY in the post. You know, a smiley indicates, depending on which one, humour, sadness, or whatever. In this case it was a smiling smiley; as such, I was just JOKING.

Edited 2006-06-21 13:55

Reply Score: 1

RE[2]: Good
by smashIt on Tue 20th Jun 2006 11:52 UTC in reply to "Good"
smashIt Member since:
2005-07-06

They can race Microsoft to see who can delay the longest :-) (*although MS has a pretty convincing lead)


and Duke Nukem Whenever beats them all ;)

Reply Score: 5

RE: Good
by kwanbis on Tue 20th Jun 2006 13:00 UTC in reply to "Good"
kwanbis Member since:
2005-07-06

What problems have you seen in 6.06?

Reply Score: 1

RE[2]: Good
by miscz on Tue 20th Jun 2006 17:02 UTC in reply to "RE: Good"
miscz Member since:
2005-07-17

I don't want to list my own problems with the new Ubuntu (there are lots of them); suffice it to say there were about 100 MB of updates around Friday and I'm still experiencing some bugs.

Reply Score: 1

RE[2]: Good
by Celerate on Thu 22nd Jun 2006 03:01 UTC in reply to "RE: Good"
Celerate Member since:
2005-06-29

In Kubuntu the control centre applets sometimes become unresponsive; I then try them in the traditional KDE Control Centre and they don't work there either. If I kill the related processes and then restart either of the control centres, the applet will work again until the next time it happens. This is a problem I've only ever experienced in Kubuntu, and it happened in both 6.06 and 5.10. I have not discovered why it happens, but you did ask what problems there are.

Reply Score: 1

any updated packages from SUSE 10.1?
by REMF on Tue 20th Jun 2006 10:42 UTC
REMF
Member since:
2006-02-05

like:
KDE 3.5.3
Gnome 2.14.1
Compiz/XGL improvements
Cups 1.2
Inclusion of TaskJuggler as a default install App
Koffice 1.5.1
etc

or is the package list frozen from 10.1 except for critical bug-fixes?

Reply Score: 1

DirtyHarry Member since:
2006-01-31

Have you ever heard of additional repositories?

Official updates from SUSE are bugfix/security-fix *only*. But with additional repositories added, you can upgrade to, for example, KDE 3.5.3 and KOffice 1.5.1.

I advise using Smart for this, because of the broken package management in SUSE 10.1.

Download Smart from GURU, then do a

smart update
smart upgrade

And there you go! The latest greatest!!!
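(If you first need to add an extra repository, Smart handles that through channels; a rough sketch - the alias and URL below are placeholders, not a real repo address:

smart channel --add kde353 type=rpm-md baseurl=http://example.org/suse/10.1/kde/
smart update

After that, packages from the new channel show up in smart upgrade and smart install as usual.)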

Regards Harry

Reply Score: 5

elsewhere Member since:
2005-07-13

or is the package list frozen from 10.1 except for critical bug-fixes?

It's frozen; SLED/SLES and SUSE 10.1 are using a common base.

This is strictly about getting that monofied bastard child of the former SUSE and Ximian package management systems working, since it's the same system SUSE Linux 10.1 users have been wrestling with. Patches were released for 10.1, but all they did was elevate package management/updating from totally unusable to merely horrible to use.

In fairness to the SUSE devs, they never tried to deny there was a problem with the new system, and they have worked like gangbusters to fix it; but the underlying understanding was that this was inflicted by Novell after the version freeze specifically to burn in and test the new system for their enterprise offering.

Now that it's blocked their enterprise release, there should be a "drop everything and get it working now" attitude at Novell that should finally fix things for opensuse users as well.

I really like SUSE and continue to use it, but I am extremely disappointed by the way the pm system was mishandled; there are politics involved beyond any drive to streamline and improve the efficiency of the subsystem. The new system looks good on paper and certainly adds benefits, but when it became apparent early on just how broken the implementation was, it should have been shelved for the next version release.

I'm not entirely confident Novell will get this fixed in time for the gold master. This could be a big boost for the Smart Package Manager, as that could be the only way paying enterprise customers are able to manage updates on their systems... ;)

Reply Score: 5

Flatline Member since:
2006-03-06

"I really like Suse and continue to use it, but am extremely disappointed by the way the pm system was mishandled, there are politics involved beyond any drive to streamline and improve efficieny of the subsystem. The new system looks good on paper and certainly adds benefits, but when it became apparent early on just how broken the implementation was it should have been shelved for the next version release."

That is exactly the way I feel about it, and is the reason that I wanted Novell to delay OpenSUSE. Releasing a (heretofore excellent) distribution with a broken package manager is patently unacceptable in my opinion, especially considering that they ALREADY HAD a working package manager in prior releases.

Reply Score: 2

narcissus Member since:
2005-07-06

Yeah I have to totally agree. I installed 10.1 on a desktop and a laptop. I love it EXCEPT for the new package management. Oh my, what a mess. I have two other computers to upgrade, but not until that disease has been cured.

Reply Score: 1

REMF
Member since:
2006-02-05

Not an acceptable solution.

There is no reason for SUSE not to have used the time between 10.1 and 10.0E to update some core packages with minor updates.

They may have chosen not to, which is their choice, fair enough, but adding stuff from repos is not what I asked about and certainly not what I want.

cheers ;)

R3MF

Reply Score: 1

dylansmrjones Member since:
2005-10-02

Creating releases is not about being bleeding edge, but about being stable.

Better slightly older releases that work than bleeding edge ones that don't.

Reply Score: 5

john Member since:
2005-11-10

There is no reason for SUSE not to have used the time between 10.1 and 10.0E to update some core packages with minor updates.

Actually, there is a very real reason: a single codebase.

It's been Novell's intention at least since 10.0 to base the next enterprise release (SLED10/SLES10, or 10.0E as you call it) on the 10.1 codebase. That's why 10.1 got the semi-broken package management so late in the development cycle: they wanted it there for the upcoming enterprise releases.

So, no version changes.

Reply Score: 1

My guess
by Shaman on Tue 20th Jun 2006 11:29 UTC
Shaman
Member since:
2005-11-15

There have been so many significant packages released in the past month: GCC 4.1.1, which is ready for general use and can dramatically improve C++ software in particular, a new GLibC, KDE, KOffice, et al.

I suspect - but may be wrong - that they want to recompile the whole code base with the new compiler against the latest GLibC. If they're smart, they will.

Reply Score: 1

RE: My guess
by Amaranth on Tue 20th Jun 2006 11:45 UTC in reply to "My guess"
Amaranth Member since:
2005-06-29

That would be the stupidest thing they could do. You don't switch to a new gcc and glibc a couple months from release and try to compile the entire distro. They're probably delaying to try to make Xgl suck less.

Reply Score: 5

RE[2]: My guess
by Shaman on Tue 20th Jun 2006 12:11 UTC in reply to "RE: My guess"
Shaman Member since:
2005-11-15

>That would be the stupidest thing they could do. You
>don't switch to a new gcc and glibc a couple months from
>release and try to compile the entire distro.

Oh really? Why have Debian, Gentoo and others done it, then? Debian has been using 4.0.x in Testing and Unstable for a long time now, and 4.1.x is a much better compiler by most accounts - certainly for Java and C++. Debian Unstable has gone to 4.1.x now, and some Debian Testing packages have been showing up compiled with the 4.1.2 prerelease.

Moving from 4.0.x to 4.1.x isn't a big leap, and it brings in the performance for most apps that 4.0.x lacked. It brings stability, better compiled code and lower memory usage for C++ apps.

One might expect that they want to get kernel 2.6.17 into the distribution now, too. Since most modern dists install the root file system as ext2/3, a performance improvement of up to 50% might be of interest?

This isn't cutting-edge stuff. It's modern. Only the latest kernel and GLibC in my list could be considered cutting-edge, IMHO. The rest are just the current evolved and matured revisions.

Edited 2006-06-20 12:14

Reply Score: 0

RE[3]: My guess
by Organic_Info on Tue 20th Jun 2006 12:22 UTC in reply to "RE[2]: My guess"
Organic_Info Member since:
2006-02-28

Are any of those distros you just listed being touted as enterprise-class software and sold with a 5-year support promise? Errrr, no.

SLES and RHEL need to be stable; funnily enough, most enterprises couldn't give a stuff about the latest packages, because above all they value stability and certification of third-party software.

And that last point is also why there will be no last-minute changes of the magnitude you suggest; third-party software support can make or break an enterprise platform.

OpenSUSE 10.1 suffered because of important last-minute changes. Novell will not want to piss off customers with a shoddy and unreliable release of SLES 10.

You should think about the business implications of what you suggest.

Oh, and while you talk about Debian UNSTABLE: Debian STABLE (3.1) was released in June 2005, so if the previous release schedule is anything to go by, GCC, glibc and the other packages won't go mainstream until 2008, when their stability is assured.

Edited 2006-06-20 12:36

Reply Score: 3

RE[4]: My guess
by segedunum on Tue 20th Jun 2006 17:43 UTC in reply to "RE[3]: My guess"
segedunum Member since:
2005-07-06

Are any of those distros you just listed being touted as enterprise-class software and sold with a 5-year support promise? Errrr, no.

The GCC in there now is hardly enterprise class, and 4.0 was a bit of a step backwards. In reality, upgrading the compiler and toolchain would be fairly low risk for quite a few new benefits. However, if the release were perhaps a few months further away, it would have been more realistic.

OpenSUSE 10.1 suffered because of important last-minute changes

Different thing. The last-minute change to 10.1 was the package management system, and it was absolutely pointless.

Novell will not want to piss off customers with a shoddy and unreliable release of SLES 10.

Oh, don't worry. They've been doing a great job of that for some time across all of their products.

Edited 2006-06-20 17:48

Reply Score: 1

RE[3]: My guess
by SlackerJack on Tue 20th Jun 2006 12:52 UTC in reply to "RE[2]: My guess"
SlackerJack Member since:
2005-11-12

Where did you get this 50% number from? SUSE uses ReiserFS by default, by the way.

Reply Score: 1

RE[3]: My guess
by ratatask on Tue 20th Jun 2006 19:05 UTC in reply to "RE[2]: My guess"
ratatask Member since:
2006-01-28

Gentoo is... Gentoo, so it has the obvious excuse for doing such things.
Debian, as you mention, does no such thing on Debian Stable. Testing and Unstable are... just that.

Reply Score: 1

RE[2]: My guess
by lagitus on Tue 20th Jun 2006 13:20 UTC in reply to "RE: My guess"
lagitus Member since:
2005-07-18

Including Xgl in a serious distro as anything other than an optional experimental toy during 2006 is bound to end in disaster. Xgl/Compiz crawls on quite a few systems, fails to start on many, and crashes as often as Win95 on most. The last time I tried it, it also multitasked horribly when the CPU was busy.

Novell, please, for once let us have a truly, properly tested and stable Linux distro. I had high hopes for Ubuntu 6.06, but alas...

Reply Score: 3

RE[3]: My guess
by SlackerJack on Tue 20th Jun 2006 13:51 UTC in reply to "RE[2]: My guess"
SlackerJack Member since:
2005-11-12

Xgl is more than just a toy; it sounds to me like you have not tried the latest version, because a lot of the issues are gone now. CPU load is minimal, and I'd rather have the GPU take the load than the CPU.

Reply Score: 2

RE[4]: My guess
by slate on Tue 20th Jun 2006 17:42 UTC in reply to "RE[3]: My guess"
slate Member since:
2006-04-04

We're not talking about kiddies playing around with eye candy. This is supposed to be for the enterprise, and Xgl is a long way from being stable enough for that. Plus, Xgl is really just a hack.

Reply Score: 1

RE[4]: My guess
by segedunum on Tue 20th Jun 2006 17:55 UTC in reply to "RE[3]: My guess"
segedunum Member since:
2005-07-06

Xgl is more than just a toy; it sounds to me like you have not tried the latest version, because a lot of the issues are gone now.

On graphics cards like nVidia's it tends to work OK, with some strange problems here and there that need to be tracked down and sorted out. On other graphics cards the quality can vary wildly. It will take some time to make it stable on a widespread basis, if ever.

Reply Score: 1

RE[3]: My guess
by Sphinx on Tue 20th Jun 2006 16:20 UTC in reply to "RE[2]: My guess"
Sphinx Member since:
2005-07-09

Must be Compiz. Use it with XFCE's internal damage control instead of Compiz and it's rock steady.

Reply Score: 1

RE[3]: My guess
by JMcCarthy on Tue 20th Jun 2006 18:28 UTC in reply to "RE[2]: My guess"
JMcCarthy Member since:
2005-08-12

Xgl/Compiz has never given me any trouble. At all.

I would never have it turned on by default for someone where reputation / first experience mattered, though. Just in case.

Reply Score: 1

RE[2]: My guess
by segedunum on Tue 20th Jun 2006 17:53 UTC in reply to "RE: My guess"
segedunum Member since:
2005-07-06

They're probably delaying to try to make Xgl suck less.

XGL is the absolute rock-bottom least of their worries. They need a working package manager, to build up ZENworks, and to make sure that the technology they build all that stuff with is complete and stable enough, all at the same time. No mean feat.

Reply Score: 1

Delayed?
by thebluesgnr on Tue 20th Jun 2006 12:08 UTC
thebluesgnr
Member since:
2005-11-14

According to the article, it's supposed to come out on July 22, and Novell always said it would come out in the summer, so I don't see where the problem is.

Of course, had they promised it for 2003 and we were still waiting, then I'd see your point. ;)

Reply Score: 2

SUSE problem
by hraq on Tue 20th Jun 2006 13:14 UTC
hraq
Member since:
2005-07-06

I have been testing SUSE since v7 and have never liked any of its released versions. One simple reason: quality is not a concern with this company; all their software was rushed to market while still in beta, though they marked it as a gold release.

I know that there are tons of packages on the SUSE DVD, but what should I do with them if they are unstable and buggy, and the X server crashes from time to time?

I have tried all the GUI interfaces only to discover that the same problem persists: on KDE, GNOME, all window managers... and other exotic ones.

This move from SUSE might be the first turn towards quality rather than features for marketing reasons! I hope they wake up and solidify Linux's appeal.

Reply Score: 2

Releases
by Bobmeister on Tue 20th Jun 2006 13:31 UTC
Bobmeister
Member since:
2005-07-06

Hi all...

Interesting... but I think it's the right approach... this is an enterprise release... it HAS to be right or they will completely blow their reputation.

I have been VERY HAPPY with all of the SUSE releases since 9.1, as I guess I have been lucky NOT to have all of the "problems" that a lot of other people have... maybe I don't push the distros as hard, as I tend to be somewhat conservative... but I do compile apps and use the libraries etc. a lot and don't seem to have trouble.

But I think it has to be understood... that the release philosophy of SUSE vs. SUSE ENTERPRISE is going to be different. In my view, the early release of the standard community version of SUSE is fine being a little buggy... as it's pretty much experimental anyway... much like Fedora Core is. I'm not bothered by this philosophy. The release philosophy of Enterprise MUST be different... as it must work damn near perfectly right out of the box... I personally think that they are trying to do this. Much like the Red Hat Enterprise version... older, more tested software... but stable as all heck (my only experience is with CentOS and Scientific Linux, but I consider these pretty much the same as RHEL).

So... the last comment about XGL I agree with... I personally think it's pretty much stupid to advertise EARLY BETA software as a "feature", and they would be well advised to tone that part down a bit if they release it with SUSE Enterprise... as it's too unstable for enterprise use anyway. This was dumb, letting that cat out of the bag so early.

OK... that's it for now... thanks for letting me chime in.

Reply Score: 2

me too
by deanlinkous on Tue 20th Jun 2006 14:36 UTC
deanlinkous
Member since:
2006-06-19

Hey, what can I say - they took leaps and bounds and made a really cool product... now they need to go back and clean up the details. At least the big work has been hammered out; now they need to go tweak everything. Sounds reasonable.

OK, yeah, they should have done this before rolling with it...

Reply Score: 1

Problem with Novell
by Don T. Bothers on Tue 20th Jun 2006 15:00 UTC
Don T. Bothers
Member since:
2006-03-15

Linux distributions work well only through collaboration. I have not seen Novell do that much lately. Rather than using/enhancing yum or apt, they wrote something different. Rather than using/enhancing SELinux, they wrote something different. Rather than using/enhancing gcj, they pushed Mono. Rather than working with the Xorg committee, they went and wrote XGL from scratch. The problem with Novell is that they are trying to do too much all by themselves. They just need to start working with the community again.

Reply Score: 4

RE: Problem with Novell
by deanlinkous on Tue 20th Jun 2006 17:20 UTC in reply to "Problem with Novell"
deanlinkous Member since:
2006-06-19

You might have a point!
I would like to see more "togetherness" from a lot of distros and players in the Linux field, but I am afraid it may never happen. I can't fault JUST Novell on this, and at least they kicked some ass with all the stuff you mentioned and got something accomplished in a short period of time. But if wishes were horses, I would like to see more "working together" instead of so much "working apart" nowadays.

Reply Score: 1

RE[2]: Problem with Novell
by calberto on Tue 20th Jun 2006 17:52 UTC in reply to "Problem with Novell"
calberto Member since:
2006-01-30

You are forgetting that the problems are attacked from different approaches. When Red Hat created rpm and then yum, you could have argued that they didn't cooperate with apt and instead created a new tool. But it's a matter of design and all that, and it's not as easy as you say to cooperate.

Keeping with your line, we could say that the GNOME folks didn't cooperate with KDE, and you could also say that the Gentoo people didn't cooperate with the Debian world; and the examples continue.

It's very easy to say that one group or another isn't cooperating. But the fact is, you don't know the reasons behind it.

Please, take a deeper look before saying things like these.

(For example, Novell didn't cooperate with gcj and instead created Mono, for a huge number of reasons that have been debated here long enough.)

Carlos.

Reply Score: 1

RE[3]: Problem with Novell
by danieldk on Tue 20th Jun 2006 21:40 UTC in reply to "RE[2]: Problem with Novell"
danieldk Member since:
2005-11-18

You are forgetting that the problems are attacked from different approaches. When Red Hat created rpm and then yum, you could have argued that they didn't cooperate with apt and instead created a new tool. But it's a matter of design and all that, and it's not as easy as you say to cooperate.

Yum was not written by Red Hat. It was written by some good folks at Duke University, and was based on the Yellow Dog updater (hence its name: Yellow dog Updater, Modified).

It's very easy to say that one group or another isn't cooperating. But the fact is, you don't know the reasons behind it.

That's a crucial point. E.g. AppArmor has been around for a while, and was developed by Immunix. Novell bought the assets of Immunix, including AppArmor.

And in the end it's not too bad. A good Mandatory Access Control framework or X server is not something that you write in two weeks, or fork and maintain without sufficient manpower. So we'll end up with two major MAC frameworks, giving room for some fair competition (and choice).

Reply Score: 2

RE[3]: Problem with Novell
by Don T. Bothers on Wed 21st Jun 2006 00:32 UTC in reply to "RE[2]: Problem with Novell"
Don T. Bothers Member since:
2006-03-15

"When Red Hat created rpm and then yum, you could have argued that they didn't cooperate with apt and instead created a new tool."

You just made the point for me. RPM was created by RedHat while Yum (YellowDog Update Manager) was not. RPM was also a small project that was much needed and one that Red Hat chose to take on, and one that everyone now shares in maintaining.

"Keeping your line, we could say that Gnome folks didn't cooperate with Kde"

But again, you are comparing the Gnome community to the Novell company. Gnome is a community project and has contributors from IBM, Sun, Red Hat, and many, many more companies and individual open source contributors.
Competition among the open source community is not a bad thing. However, one company trying to compete against the entire open source community is going to be difficult regardless of whether or not they keep their source code open or closed.

"Novell didn't cooperate with gcj and instead created Mono, because a huge number of reasons that have been debated here long enough"

And again, Novell has taken full responsibility in getting the bloated mess to work with a limited number of their employees and a complete lack of community. On the other hand, while RedHat is fairly involved in developing gcj, they are just merely one company out of an entire community of developers working to fix it.

The reason for the success of open source projects is not due to open source itself. It is the collaboration and mutual work of an entire community in sharing the burden of development and then reaping the benefits. If the community does not get involved, it does not really matter whether the project is open source or closed source.

Reply Score: 1

RE: Problem with Novell
by segedunum on Tue 20th Jun 2006 18:04 UTC in reply to "Problem with Novell"
segedunum Member since:
2005-07-06

The problem with Novell is that they are trying to do too much all by themselves. They just need to start working with the community again.

You have a fairly major point there. Novell are looking like the proprietary company they always were by trying to take software in-house and work on it there. The problem is that the nature of open source software is very ill-suited to this, and they'll end up working mostly by themselves on things that give them no payback for their effort at all.

Reply Score: 1

RE: Problem with Novell
by abraxas on Tue 20th Jun 2006 23:58 UTC in reply to "Problem with Novell"
abraxas Member since:
2005-07-07

Linux distributions work well only through collaboration. I have not seen Novell do that much lately. Rather than using/enhancing yum or apt, they wrote something different. Rather than using/enhancing SELinux, they wrote something different. Rather than using/enhancing gcj, they pushed Mono. Rather than working with the Xorg committee, they went and wrote XGL from scratch. The problem with Novell is that they are trying to do too much all by themselves. They just need to start working with the community again.

Novell has actually been better to the community than most commercial distros. When they acquired SUSE they GPLed YaST. They did a great job with XGL and released a much better (IMO) solution than AIGLX. XGL had been in the works for a long time and no one really contributed, so I give Novell praise for continuing its development. The best part about XGL is that it is much more graphics-card agnostic than AIGLX. As for gcj and Mono: Novell bought Ximian, the developers of Mono, and continued its development because Mono is a good technology and it fits in with Novell's attempts to be compatible with Microsoft. Mono applications tend to fit in better than Java apps on a Linux distro anyway. The only point you made that I halfway agree with is SELinux. I would like to see SELinux mainstream, but AppArmor is a simpler and easier-to-implement security technology. In the end the decision to develop AppArmor might have been a good one, considering the work that must be done to SELinux to make it viable on widely varying setups.

Reply Score: 1

RE[2]: Problem with Novell
by miguel on Wed 21st Jun 2006 18:45 UTC in reply to "RE: Problem with Novell"
miguel Member since:
2005-07-27


The only point you made that I halfway agree with is SELinux. I would like to see SELinux mainstream, but AppArmor is a simpler and easier-to-implement security technology. In the end the decision to develop AppArmor might have been a good one, considering the work that must be done to SELinux to make it viable on widely varying setups.


AppArmor was a technology developed by a third party (Immunix) that used the same kernel infrastructure that SELinux does.

As you point out, AppArmor is simpler to use for most people, and it's simpler to create arbitrary sandboxes for applications; it does not require a lot of expertise to create new profiles.

When Novell bought AppArmor, they open sourced the technology, which before was proprietary.

Reply Score: 1

Novell
by poohgee on Tue 20th Jun 2006 15:13 UTC
poohgee
Member since:
2005-08-13

With lots of things here, there is community on one side & business attitude, interests & politics, etc. on the other.

But IMO it's good they are fixing things up, because zen is still not completely happy on my system.

Just IMO ;)

Reply Score: 1

RE[4]: My guess
by Shaman on Tue 20th Jun 2006 15:33 UTC
Shaman
Member since:
2005-11-15

>any of those distros you just listed being touted
>as enterprise-class software and sold with a 5-year
>support promise? Errrr, no.

So you advocate instead that they go with older packages of all the software, despite the fact that the feature sets and bug fixes of the latest packages, such as KOffice, tend to be considerable in comparison with previous revisions? You sound like the maintainers of Debian Stable, who make the OS obsolete before it is even stabilized.

I don't advocate going bleeding-edge with *.0 releases of things, mind you. Hey, isn't XGL not even a full x.0 release yet?

>And that last point is also why there will be no
>last-minute changes of the magnitude you suggest;
>third-party software support can make or break an
>enterprise platform.

3rd-party software almost always ships with its own libraries or with statically compiled binaries. Distributions generally frown on the practice anyway.

>You should think about the business implications of
>what you suggest.

I've been using 4.0.x and 4.1.x compilers for some time now, and I've considered what I am suggesting quite a bit. The latest GLibC and GCC are extremely valuable contributions to the future of Linux, as are the latest versions of KDE and KOffice, if that is your favoured environment. If Novell isn't planning that far ahead, I'll be concerned for them, especially given the focus on Java and the improvements in GCJ/AWT.

>Oh, and while you talk about Debian UNSTABLE: Debian
>STABLE (3.1) was released in June 2005, so if the
>previous release schedule is anything to go by, GCC,
>glibc and the other packages won't go mainstream
>until 2008, when their stability is assured.

Debian is probably the most conservative distribution around. Yet their Testing distribution is using 4.1.x for new binaries, and they already have 4.1.2-pre available in Testing. I strongly suspect that they will deprecate 4.0 within days. Talk to most Debian users and they will tell you that Testing is the distribution they use, because Stable is obsolete. Unstable is a gamble, but they have already standardized on GCC 4.1.x and the latest GLibC (Testing is still at GLibC 2.3.6).

In Gentoo, my issues with the new (to Gentoo) compiler have been next to zero, and the benefits in memory usage, software startup, etc. have been noticeable compared to the 3.4.x toolset they were using before. Give it a shot with some of your code, if you haven't.

Edited 2006-06-20 15:38

Reply Score: 1

RE[5]: My guess
by Organic_Info on Tue 20th Jun 2006 17:48 UTC in reply to "RE[4]: My guess"
Organic_Info Member since:
2006-02-28

"So you advocate instead that they go with older packages of all the software"

Look, let's clear things up a bit. I don't disagree that some of the packages will be a bit old when SLES 10 finally ships. Assuming SLES 10 is derived from openSUSE 10.1, the packages were, what, 2-6 months old when they were frozen for Release Candidate testing.

Let's face it: when SLES 10 ships, it will take most third-party vendors 3 months to announce certification and support for it. A lot of places (which favour support and stability) will not introduce SLES 10 until their middleware vendor (a SLES third-party vendor) announces that certification... which means that by the time SLES 10 can be integrated/upgraded into an existing environment, the packages will be quite old.

So from all that, I agree with you. Right or wrong, that is their preferred test and release schedule, through which they attempt stability.

The point you raised that I don't agree with: your suggested changes cannot be made this close to the release date. It would invalidate a lot of testing that has taken place, not only by Novell but by ISVs. As stable as the latest gcc/glibc/packages may be, Novell cannot take the risk of releasing an unstable SLES.

Enterprises are slow-moving, risk-averse places, and they are who Novell (and Red Hat) are targeting.

The main reason the company I work for uses SLES9 is that we MUST use a certified platform to get support from our middleware supplier. No certified platform == no support, and that is NOT an option.

So old packages are not preferred, but stability is the name of the game with SLES.

"In Gentoo,...."
I like Gentoo, but most third-party software vendors don't like a fast-changing platform - it's a support nightmare. And that's why SLES and RHEL are the preferred distros for most businesses (suppliers and users). I/we don't have to like it, but that's the way it is.

Reply Score: 1

RE[5]: My guess
by segedunum on Tue 20th Jun 2006 17:49 UTC in reply to "RE[4]: My guess"
segedunum Member since:
2005-07-06

So you advocate instead that they go with older packages of all the software, despite the fact that the feature sets and bug fixes of the latest packages, such as KOffice, tend to be considerable in comparison with previous revisions? You sound like the maintainers of Debian Stable, who make the OS obsolete before it is even stabilized.

I suppose it's a balance thing, but people always equate older with more stable. In the open source world this isn't really the case, as more and more bugs get fixed in later versions and improvements are made, especially between minor versions. Debian Stable isn't actually more stable or more bug-free in terms of the software it uses; it's just that more of the bugs tend to be known about.

Reply Score: 1

one big advantage
by DirtyHarry on Tue 20th Jun 2006 17:12 UTC
DirtyHarry
Member since:
2006-01-31

There's one big advantage in the whole "package management sucks in 10.1" issue:

Smart gets a huge boost! Personally, I believe that Smart (especially considering its age) can become a major player in package management across distributions. The fact that it can handle (and does, beautifully!) many different repo types is really very cool!

SUSE 10.1 rocks, it really, really rocks, if you forget the pm issue and install Smart.

Reply Score: 1

RE[6]: My guess
by Shaman on Tue 20th Jun 2006 18:27 UTC
Shaman
Member since:
2005-11-15

>Assuming SLES 10 is derived from openSUSE 10.1, the
>packages were, what, 2-6 months old when they were
>frozen for Release Candidate testing.

Therein lies the problem, in my mind. Linux is all about pushing the boundaries of computing and providing features in a linear series of updates, rather than setting a feature freeze and catching up later. What they should be doing is setting a list of software they want to include, the package manager they want to include, and some extra value-added features that set their distribution apart.

But freezing code at obsolete revisions is pointless, to my thinking. When the OS ships, it should ship with the latest non-point-zero revisions of the software minus any packages that are *known* to cause problems (or packages that would cause a major distribution upheaval in terms of configuration).

I'm OK with them not shipping GLibC 2.4.0 but not shipping GCC 4.1.1 would be a big mistake. Linux is also largely about development, and 4.1.1 has features that cannot be overlooked - stack protection, C++ performance, dramatic GCJ improvements, much better non-x86 architecture optimization support, et al.
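(To make the stack-protection point concrete: in 4.1 it surfaces as new compiler flags that a build can opt into; a minimal illustration, with a made-up source file name:

gcc -fstack-protector -o demo demo.c
gcc -fstack-protector-all -o demo demo.c

The first guards only functions with vulnerable-looking buffers; the second guards every function.)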

Anyway, the other side of that argument is something I'm not blind to as well. If they want SUSE to be known as the distribution that is stable at the cost of feature set and are not concerned with known bugs which exist in obsolete software, then so be it. They won't have me as a customer, and I guess that isn't a concern if that's where they are headed. Remember, as well, that you don't have to ship binaries compiled with the newer compiler in order to ship the compiler.

>So from all that I agree with you. Right or wrong
>that is their preferred test and release schedule to
>which they attempt stability.

I understand, and I can see that point of view. I just hoped for better from Novell, especially given some of the more important projects they have funded.

>As stable as the latest gcc/glibc/packages may be
>Novell can not take the risk of releasing an
>unstable SLES.

Maybe. While I think otherwise, I'm not going to debate this point, since I don't know what sort of facilities they have for testing. But... Debian and Gentoo have done the testing and released 4.1.x for some time now (in Gentoo 4.1.1 is now the ~arch compiler of choice).

>Enterprises are slow moving risk averse places and
>they are who Novell (and Red Hat) are targeting.

Understood, but somehow I doubt the enterprises will be using applications that are 100% GPL. If I'm wrong, I am entirely happy to admit ignorance!

>The main reason the company I work for use SLES9 is
>because we MUST use a certified platform for support
>from our middleware supplier. No certified platform
>== no support, and that is NOT an option.

I feel your pain. We have a RH9 system that we have to support, and it's a nightmare. Luckily there is a project out there to support RH9 with bugfixed packages, but RedHat has abandoned its children. RedHat is a menace to the Linux way of things, IMHO - they have done to customers exactly what people went to Linux to escape - planned obsolescence.

> So old packages are not preferred, but stable is
> the name of the game with SLES.

I suspect this will get them left behind in enterprises with IT staff that are really on the ball. But time will tell, and perhaps my perspective isn't as good as yours - I'm used to dealing with the latest/greatest and cleaning up as I go, but I realise not everyone has the same drive or the same time budget.

Reply Score: 1

RE[7]: My guess
by DrillSgt on Tue 20th Jun 2006 19:13 UTC in reply to "RE[6]: My guess"
DrillSgt Member since:
2005-12-02

"I'm OK with them not shipping GLibC 2.4.0 but not shipping GCC 4.1.1 would be a big mistake. Linux is also largely about development, and 4.1.1 has features that cannot be overlooked - stack protection, C++ performance, dramatic GCJ improvements, much better non-x86 architecture optimization support, et al."

In a way, yes. However, I know for a fact that in the industry I am in, simulation software, we require GCC 3.4, due to high-profile applications that use it and do NOT work with GCC 4. We still use SUSE 9.3 for that very reason, as it is much easier than setting the machine up to use multiple GCC versions. Until the other software catches up, we are stuck down here in order to do what we do.
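(For what it's worth, running two compiler versions side by side usually just means pointing each build at the right binaries; a rough sketch, assuming parallel compiler packages that install versioned executables such as gcc-3.4 alongside the default gcc:

export CC=gcc-3.4
export CXX=g++-3.4
./configure && make

Whether that is "easy" depends a lot on how well-behaved the application's build system is, of course.)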

Reply Score: 2

RE[7]: My guess
by danieldk on Tue 20th Jun 2006 20:21 UTC in reply to "RE[6]: My guess"
danieldk Member since:
2005-11-18

Therein lies the problem, in my mind. Linux is all about pushing the boundaries of computing and providing features in a linear series of updates, rather than setting a feature freeze and catching up later.

It just doesn't cut it. When you deploy an enterprise application you want:

1. The APIs to be stable.
2. The system to be supported for many years.

And that is what RHEL and SLES are all about: stable APIs and long release/support cycles. Maybe it doesn't fit for you, but when it comes to supporting customers it is ideal. You can focus on two platforms that are not moving, and that's it.

I'm OK with them not shipping GLibC 2.4.0 but not shipping GCC 4.1.1 would be a big mistake. Linux is also largely about development, and 4.1.1 has features that cannot be overlooked - stack protection, C++ performance, dramatic GCJ improvements, much better non-x86 architecture optimization support, et al.

Quite often, new features are already backported to the stable enterprise version. E.g. RHEL4 has stack-protection techniques integrated into gcc and glibc. And sometimes new features are backported (especially in the kernel) if doing so does not result in API and ABI changes.

Maybe. While I think otherwise, I'm not going to debate this point, since I don't know what sort of facilities they have for testing. But... Debian and Gentoo have done the testing and released 4.1.x for some time now (in Gentoo 4.1.1 is now the ~arch compiler of choice).

I can't comment on Gentoo; I have never used it. But Debian Testing is not Debian Stable, and Debian usually does long freezes before renaming the current testing branch to "stable". So the fact that it is in Debian Testing now is not really relevant. In the Red Hat and SUSE scheme of things, it is like saying gcc so-and-so is in Fedora Core 6 Alpha or SUSE Linux 10.2 Alpha.

I feel your pain. We have a RH9 system that we have to support, and it's a nightmare. Luckily there is a project out there to support RH9 with bugfixed packages, but RedHat has abandoned its children. RedHat is a menace to the Linux way of things, IMHO - they have done to customers exactly what people went to Linux to escape - planned obsolescence.

Red Hat 9 is not an enterprise version. If your company used RHEL3 (which was originally based on RH9) they would still get updates through their subscription, and would continue to get them until October 2010. Or if you don't want to pay: CentOS 3 will do the same thing.

Reply Score: 3

RE[8]: My guess
by Shaman on Wed 21st Jun 2006 01:09 UTC in reply to "RE[7]: My guess"
Shaman Member since:
2005-11-15

> 1. The APIs to be stable.

The = key in dselect will solve that for Debian. ;)
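(For non-dselect users: the same effect, pinning a package at its current version, is available from the command line via dpkg's holds; the package name here is just an example, and the commands need root:

echo "libstdc++6 hold" | dpkg --set-selections
dpkg --get-selections | grep hold

The held package is then skipped by upgrades until the hold is removed.)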

Rarely will a kernel or C compiler upgrade break that, unless your application needs to address the kernel directly, but C++ etc. with their runtime libraries certainly can. Luckily, Linux, like most *nixen, will support multiple versions of a library, and problems can be solved that way. Or the apps can be statically compiled, which contrary to popular thought is *not* a performance/memory loss in every case.
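(Concretely, the side-by-side versioning shows up in the library sonames; an illustrative listing, with typical but not universal paths:

/usr/lib/libstdc++.so.5    # C++ runtime used by gcc 3.2/3.3 binaries
/usr/lib/libstdc++.so.6    # C++ runtime used by gcc 3.4 and later

Both can be installed at once, and the dynamic linker hands each binary whichever soname it was linked against.)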

GLibC can be an issue.

Ironically, one of GCC 4.x's biggest features is that it is much more strict and complete in terms of supporting the C++ ABI/specs. Yet another reason to recommend it, since the runtime libraries are unlikely to change greatly from this point onwards.

>Quite often new features are already backported to the
>stable enterprise version. E.g. RHEL4 has stack
>protection techniques integrated in gcc and glibc.

Yes, I have seen that in many cases. I think it's kind of ugly to do that, but what the hey...

> But Debian Testing is not Debian Stable.

Most people who use Debian are using Testing, because Stable is obsolete. Not, mind, as obsolete as the old Stable was. If your application is vertical in nature and requires Stable, more power to you.

> Red Hat 9 is not an enterprise version.

Tell that to the application vendor. In this case, they also require kernel 2.4, which doesn't run worth a fiddler's damn on current-generation machines (little or no support for many new chipsets).

Overall, I guess what I was trying to say is that I hope Novell is progressive with their new distribution instead of becoming another RedHat, with planned obsolescence and a required forklift upgrade (or a nail-biting upgrade process) between major revisions rather than continuous improvement. Ubuntu was supposed to provide us with that, but my experience is that the stability of Dapper is dismal even in comparison with my Gentoo boxen running unstable arch profiles.

Edited 2006-06-21 01:12

Reply Score: 1

RE[7]: My guess
by segedunum on Tue 20th Jun 2006 21:47 UTC in reply to "RE[6]: My guess"
segedunum Member since:
2005-07-06

Luckily there is a project out there to support RH9 with bugfixed packages, but RedHat has abandoned its children. RedHat is a menace to the Linux way of things, IMHO - they have done to customers exactly what people went to Linux to escape - planned obsolescence.

You're spot on there.

Reply Score: 1

RE[6]: My guess
by Shaman on Tue 20th Jun 2006 18:28 UTC
Shaman
Member since:
2005-11-15

>people always equate older with more stable. In the
>open source world this isn't really the case, as more
>and more bugs get fixed in later versions and
>improvements are made

You've made the thrust of my argument clearer than I could. Thanks for that.

Reply Score: 1

Even Linux server adoption is slowing
by NotParker on Tue 20th Jun 2006 19:03 UTC
NotParker
Member since:
2006-06-01

2005 - Windows Server Unit Sales up 12.9%
2005 - Linux Server Unit Sales up 14.3%

Linux used to grow at 50% or more, then it was 30%; now it's just above Windows.

In 2006 it will be half that of Windows. Cherry-picking Unix-on-x86 upgrades only works for so long.

(Yeah yeah, your servers are bought without an OS and you downloaded Linux later. Me too. 200+ Dell servers with no OS. We buy our Windows Server OS from a reseller because we get an academic discount.)

Reply Score: 1

But in a production environment...
by IanSVT on Tue 20th Jun 2006 20:38 UTC
IanSVT
Member since:
2005-07-06

Working in a Novell environment, I just want them to "get it right". I don't care about bleeding edge technology. I want to be able to get actual work done on my Novell technology based network. If it takes six more months, no worries here, as long as it's high quality.

Edited 2006-06-20 20:39

Reply Score: 1

Ban
by Thom_Holwerda on Tue 20th Jun 2006 21:26 UTC
Thom_Holwerda
Member since:
2005-06-29

Ok Mr Slate, that's been enough insults for now. Now, where's that drop-down ban menu?

Reply Score: 1