Linked by Bob Marr on Thu 10th Jun 2004 05:48 UTC
Linux Consider these memory requirements for Fedora Core 2, as specified by Red Hat: minimum for graphical: 192MB; recommended for graphical: 256MB. Does that ring any alarm bells with you? 192MB minimum? I've been running Linux for five years (and am a huge supporter), and have plenty of experience with Windows, Mac OS X and others. And those numbers are shocking -- severely so. No other general-purpose OS in existence has such high requirements. Linux is getting very fat.

Funnily enough, I did try Fedora Core 2 on a 128MB Linux-certified machine (that was before I upgraded it to 384MB recently). I knew that FC2 required 192MB minimum for graphical, but I didn't want to nuke my FC1 on my other machine that has 512MB of RAM, so I decided to give it a quick TEST shot on that Duron machine with 128MB. The result:

FC2 *was unusable*. And I mean, *unusable* with either KDE or Gnome. Things would load ages later or wouldn't load at all. I could only *kinda* use FC2 at 128 MB when I switched to XFce.

On the same machine, with 128MB memory, I also tried Xandros, Mandrake 10, Arch Linux and Linare Linux. From the bunch, Mandrake was the one that was "a bit" heavy, but all in all, the machine remained usable (NOT comfortable by any means, but usable if you wanted to do a quick job with it). But FC2 was really not usable at 128MB, and because that gave me a glimpse of what's coming soon, I actually decided to upgrade that machine (I didn't have any incentive to upgrade it before; it is not my primary machine, I just use it for some tests).

New X might help
by Ashleigh Gordon on Thu 10th Jun 2004 05:56 UTC

Now that X is being developed again, things might improve; the weakest part of Linux has always been the GUI and X. KDE and GNOME have added a lot of polish over the years and now look really slick, but this comes at a performance cost, which improvements to X might fix.

Still, RAM isn't exactly expensive anymore, and running WinXP on less than 256MB of RAM is pretty bad too.

It's about choice
by Daniel de Kok on Thu 10th Jun 2004 05:58 UTC

It is not all that bad, it is just about choice. For example, I installed Libranet (which is quite user-friendly) on 128MB and 64MB 400MHz machines. With IceWM and Opera that works quite well, and it isn't really more difficult to use than e.g. Win9x.

Yep, I agree that KDE and Gnome are bloated these days.

by therandthem on Thu 10th Jun 2004 05:59 UTC

BeOS wasn't cool for no reason: it can make six-year-old hardware feel fast.

What is the solution for Linux? Copy everything that BeOS does. Run the legacy kernel on top of the L4 micro kernel. Have all the desktop features use the micro kernel and multi-threading directly.

Just a thought. Oh, and before you say it, micro kernels will make a difference in this case. Why? Look at BeOS driver management. Drag and drop.

hmm... Kinda true.
by Josh on Thu 10th Jun 2004 06:00 UTC

With every new version, the requirements for any OS go up. GNOME, however, is known to be slow right now compared to the latest KDE, so I'd expect that. I wonder if the author has swap on, as it works really well. My experience is that the new 2.6 kernel isn't as friendly to older hardware as the 2.4 series, so for things like Mandrake 10 I'd install 2.4, which is what I in fact did on a relative's PC, and it ran pretty well. It was a PII at 400MHz. The killer here is the RAM; as long as you have something like 192MB you're good -- hell, even 128MB will do. It's like my friend who had a Celeron 500MHz: it ran Windows 2000 slower than the PII, my guess is exactly because it only had 64MB of RAM!

Anyway, the people installing OSes generally aren't Joe Six-pack; it's usually their geek friend or the office tech, so installing Vector Linux or any low-end thing should be fine. There is used RAM out there on eBay these days, since the market artificially inflates SDRAM prices when they technically should be worth dirt by now. Actually, if one has a PII, I'd tell them to buy a new computer for 500-600 dollars. They can get an Athlon XP cheap, as well as a graphics card for a good price.

What a silly rant.
by Anonymous on Thu 10th Jun 2004 06:04 UTC

Running XP with less than 256MB of RAM (if you're going to do more than play solitaire) is a disk-thrashing nightmare. 512MB is comfortable.

Microsoft recommends 128MB minimum and claims that it'll run, albeit badly, with a mere 64. I'd rather use an abacus than try that. The point is, 256MB is really not so much considering XP was released in 2001 and FC2 was released in the middle of 2004, YEARS LATER. Why should an OS that has been evolving over the last three years (since XP was released) be expected to conform to the system requirements of an OS that's been out for years? Following that logic, the XP system requirements (which dwarfed 98's, and came out roughly as far apart as XP and FC2) were just as horrible and alarming as this person seems to think Fedora's are.
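For what it's worth, the gap can be put into rough numbers. A back-of-envelope sketch (the Win98 and XP figures below are commonly cited recommended amounts, taken here as assumptions rather than exact quotes):

```python
# Hypothetical comparison of recommended RAM across releases roughly
# three years apart. The Win98 and XP figures are assumed, not quoted.
recommended_mb = {
    "Win98 (1998)": 24,
    "WinXP (2001)": 128,
    "FC2 (2004)": 256,
}

xp_vs_98 = recommended_mb["WinXP (2001)"] / recommended_mb["Win98 (1998)"]
fc2_vs_xp = recommended_mb["FC2 (2004)"] / recommended_mb["WinXP (2001)"]

# By these figures, the XP-over-98 jump was steeper than FC2-over-XP.
print(f"XP vs 98: {xp_vs_98:.1f}x, FC2 vs XP: {fc2_vs_xp:.1f}x")
```

On those assumed numbers, the XP jump was the larger one, which is exactly the poster's point.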

I'm forced to wonder if this guy was clutching his Pentium 200 to his chest and screaming, "It's not fair! It's just not FAIR!" when XP came out... I mean, that P200 Classic would run 98 just FINE; how DARE Microsoft make 300MHz the recommended spec for XP!

Or maybe I just don't get it.

Keep things in perspective.
by Mark on Thu 10th Jun 2004 06:05 UTC

Linux is not getting fat. Fedora, or any other distro with those requirements, is. Keep the word Linux in context with the kernel and we are a lot less troubled. If you choose to run KDE/GNOME 2 and then add GDM and all the bells and whistles (gDesklets, for example)... expect to use some RAM up.

Secondly, why should users have to install Slackware, Debian or Gentoo just to get adequate speed? Those distros are primarily targeted at experienced users -- the kind of people who know how to tweak for performance anyway. The distros geared towards newcomers don't pay any attention to speed, and it's giving a lot of people a very bad impression.

I still don't understand what the giant obsession with pleasing everyone is. I think there is a distinction between making a distro for 'newcomers' and just making a user-friendly distribution. Just because Mandrake is easy to install doesn't mean it is made for newcomers. Windows isn't the Microsoft OS for newcomers; it's just damn easy to use, period. I think the same goes for easy-to-run Linux.

RE: What a silly rant.
by Eugenia on Thu 10th Jun 2004 06:08 UTC

>Running XP with less than 256MB of RAM is a disk-thrashing nightmare

I am running XP with 256 MB of RAM daily. That's my PRIMARY machine, a dual Celeron 533 and 256 MB RAM running XP PRO.

I run IE, OE, Winamp, Notepad and Trillian at the same time with no problems at all. Things only get a bit tighter when I need to use an IDE or Paint Shop Pro, but overall, 98% of the time I only use the five apps mentioned above, and 256MB is more than enough to run those. At least for my needs, it runs great at 256MB.

by opa on Thu 10th Jun 2004 06:08 UTC

Something that has needed to be said for a while. Look, I love GNOME, but on my eMac with Debian it's slower and takes more memory than Mac OS X while doing less. That is pretty lame, and I definitely hear you on GNOME Terminal; it is VERY slow. I see the GNU/Linux niche being in performing well on low-end systems, with benefit to third-world countries and budget-conscious companies, but at the current rate it just isn't happening. Windows is *so* much more responsive than GNU/Linux at the moment, and I don't care whether the blame lies with X or GTK or the language; all I care about is that it is, and so does every other consumer. Look at an old NeXT system, or look at the Contiki OS and how much they do with so little; it's embarrassing. Yes, times have changed and the level of complexity has increased, but people shouldn't need a 1.5GHz system with 512MB RAM to decently browse the web, e-mail, type up letters and listen to music.

Linux speed
by Sami on Thu 10th Jun 2004 06:12 UTC

I did try to run Red Hat 8.0 on my Toshiba 64MB 433MHz Celeron laptop, and it was totally unusable. I was a n00b back then (well, I'm still a newbie, but at least I know the basics); after reading comments on how good Linux was compared to Windows, I thought RH would run on my laptop. That experiment brought me back to reality, so to speak.

I didn't switch back to Windows, though. I installed Debian with Xfce, and it runs like a charm. My 3GHz 1GB RAM P4 box is running Mandrake 10.0. My friend, new to Linux, needed an OS for his old AMD K6-2 box, which had Win98 before it got totally trashed, so I offered him old Red Hat 6.2, which runs quite well. At least if our hardware can't run the latest Linux, we can get old versions and still use our computers.

But Linux apps could use better coding; I've got apps that get totally slow even on a P4!

Let me get this straight
by df on Thu 10th Jun 2004 06:15 UTC

Fedora has some steep requirements, and suddenly the "Linux platform is getting fat?"

Guess I'm hallucinating the memory footprint of my Gentoo installation.

Even at 3GHz, 1 gig of RAM
by Lumbergh on Thu 10th Jun 2004 06:19 UTC

...which is almost the spec of my dual-boot system, you start noticing how much slower GNOME 2.6 is than XP Pro.

When I first got this system back in January, I was sitting in Linux most of the time, for the mere fact that getting everything up and running on Gentoo, including wireless, and both desktops just the way you want them, is a part-time week in itself. I thought GNOME looked good. The fonts were all right after I spent some time tweaking.

Well, for the past month and a half or so I've been in my XP Pro partition, mostly working in Eclipse. Once I enabled ClearType, things looked about 1000% better in Windows, and Eclipse just tends to run better in Windows; plus, with Firebird, what the hell.

The other night I decided to play a very old Blade Runner DVD that Windows Media Player just doesn't handle for whatever reason, but I knew that a program I had on Gentoo would handle it.

Well, I hadn't been in Linux for quite a while, and man, I just didn't like what I saw. The fonts just look like crap compared to what I'm used to with ClearType, and GNOME 2.6 (even on a P4 3.2GHz, 1GB of RAM, and an ATI 9600 Pro card) just seemed sluggish compared to Windows.

Yeah, yeah, I know I can run Fluxbox or whatever, but why should I? With a firewall, a router and Firefox I'm not getting viruses. I know how to keep my system clean, so why should I even mess with Linux?

A 256MB RAM DIMM is the smallest I can get today
by Andrew on Thu 10th Jun 2004 06:19 UTC

and it costs less than $40.

So, your point is FC2 won't run on your old Duron box. OK, that's a point. It won't, and I think the Fedora people intended it that way.

Then you claim "Linux is getting fat" ...

Linux is a kernel. It runs on machines with 2MB of RAM quite well, depending on kernel version.

GNU/Linux is an OS. It runs on anything from an embedded system with 2MB of RAM to an IBM 390 with GBs of core.

This post is a troll...

RE: RE: What a silly rant.
by marshall on Thu 10th Jun 2004 06:20 UTC

I agree 256MB is realistic

Have a look on any major PC manufacturer's website. Until recently (I haven't looked in a while, but maybe even still now) most laptops and PCs were coming with 256MB RAM as standard. Though in my experience, having a standard set of apps (office suite, mail program and a browser) on that setup will begin to swap like hell on Windows after a couple of weeks of daily use.
Linux, it seems, will at least stay at the same level of performance without degrading over time, but XP on 256MB is fine to begin with.

things in perspective
by simon on Thu 10th Jun 2004 06:28 UTC


You can't keep "Linux" meaning just the kernel: normal users don't know what a kernel is! They point at their screen and say "This is my Linux" or "This is my Windows."

They don't care what makes it slow.

And normal users don't like to hack around to speed up an OS. And I agree that Linux needs better coding and usability. Some examples:

When I copy text in OpenOffice via the context menu, why can't I paste it in Mozilla Composer via the context menu?

Why don't I have a universal installer service? I don't want to have to bother about 24 missing libraries.

I use Mandrake 8.2 for my webserver at home, and I tried to upgrade to Mandrake 9.2... but KDE 3 was so buggy that I returned to Mandrake 8.2 with KDE 2...

by Jeremy Friesner on Thu 10th Jun 2004 06:31 UTC

This seems like a good place to ask -- my company might be distributing our new show control app as part of a custom Linux install CD. We'd like to have a Linux distro that (a) can be installed by someone who doesn't know a thing about Linux, other than "put the CD in the drawer, reboot, click Next until it's done", (b) auto-recognizes all reasonably recent (<5 years old) hardware and auto-configures it (including networking), and (c) runs as snappily as possible -- an ugly, fast GUI would be preferable to a pretty, sluggish one. (Our customers previously ran our app under BeOS, and they put a premium on responsiveness.) It would also be nice (but not strictly required) if it had the capability to run directly from the CD, and if it didn't install a bunch of esoteric extra stuff that won't be needed. Any recommendations regarding distros to try for this?

Re: Let me get this straight
by Richard S on Thu 10th Jun 2004 06:32 UTC

Dude, Gentoo Linux running KDE 3.2.2 and Mozilla is slow as fsck too on 128MB RAM. Do you really dare deny that?

I know, because I use Gentoo on a 366MHz box with 160MB of RAM. I would not DARE to run KDE on it. I use IceWM instead, but it's still painful to run Firefox and aMSN together.

However, the blame isn't just on Linux. It's the apps. KDE, Firefox... just to name a few. They are terribly huge. But no one likes to optimize for free, so you won't see that changing.

You can say a lot about Microsoft, but Office and MSIE start pretty damn fast and use less RAM than their open-source counterparts. Unless you like to compare lynx to MSIE and AbiWord to MS Office, of course.

Just for the record, I have just booted into KDE, and started only aMSN, Firefox, konsole and kdict. Memory footprint:

774680 TOTAL
263724 USED
124584 CACHED

That's 260MB used already.
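One caveat about readings like this: on Linux, the "used" column of free(1) includes the page cache, which the kernel hands back to applications on demand. A small sketch of the adjustment, using the figures quoted above (buffers, not shown in the quote, are reclaimable too, so even this is an overestimate):

```python
# Figures in KB, taken from the free(1) output quoted above.
total_kb = 774680
used_kb = 263724    # includes reclaimable page cache
cached_kb = 124584  # handed back to applications under memory pressure

# Memory genuinely held by the kernel and running applications:
held_kb = used_kb - cached_kb
print(f"{held_kb / 1024:.0f} MB actually held")
```

That works out to roughly 136MB held, not the full 260MB, which makes the KDE-plus-four-apps footprint look less dramatic.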

Install Slackware and change Life
by Enrico on Thu 10th Jun 2004 06:33 UTC

At that time I ran Mandrake 7.1 on a PIII 450MHz with 64MB. I switched to Slackware (8.0 or 8.1). It was another world. Try Slackware. It's another way. I haven't yet tried Gentoo or Debian; I'm still on Slackware (though I've since moved to a P4 1700MHz with 256MB).

I also tried BeOS R5. I was impressed; it's fast, fast, fast (but W98 is fast too on a P4). It's great. But it's dead, more or less, and Zeta is something strange. I think they don't have the BeOS code, so they hack and hack here and there without the possibility to really improve the kernel code, etc.
Another problem with BeOS: it's not multiuser. Sadly.

Office suites
by Daniel de Kok on Thu 10th Jun 2004 06:39 UTC

BTW, on the office front, TextMaker and PlanMaker are good light (commercial) alternatives to OpenOffice.

RE: What a silly rant.
by johnny from UBC on Thu 10th Jun 2004 06:39 UTC

>Running XP with less than 256MB of RAM is a disk-thrashing nightmare

What are you talking about?
I had a Celeron 400 with 128MB RAM, and Win XP Pro and Office XP were usable on that machine. Not fast, true, but not unbearably slow.

Now I've upgraded the machine to a Celeron 533 + 256MB.
It's actually pretty fast (not fast enough to play games, but fast enough for IE + Winamp + a chat program).

Another point
by Lumbergh on Thu 10th Jun 2004 06:40 UTC

Things will probably continue to get worse as far as desktop bloat goes when you consider that, say, you're primarily a GNOME user but like to use that one KDE app. Well, by using that one KDE app you're probably bringing in three-quarters of the KDE desktop libraries as well.

I guess that's the price you pay for the "freedom to choose".

Bottom line is...
by Rodrigo on Thu 10th Jun 2004 06:44 UTC

...You can't have your cake and eat it too.

If one expects the system to have all the bells and whistles, a beautiful interface with lots of themes and decorations, everything plug and play, support for all types of multimedia formats, etc., one should expect higher requirements.

RE: Bottom line is...
by Jon on Thu 10th Jun 2004 06:48 UTC

I agree. However, XP and Windows 2003 Server do that better than Fedora/SuSE/Mandrake in terms of memory requirements and CPU needed. So there will always be some comparison going on.

by Anonymous on Thu 10th Jun 2004 06:48 UTC

This article is right; apps for Linux are getting slower and slower. Let's take my notebook for example: a Sony Vaio 450MHz PIII with 320MB of RAM. I had Mandrake installed before -- it was cool but sloooow -- so I switched to Gentoo: fast and light, very kewl.

I use it for mail, web browsing, and working (I code web apps). And I can't find a decent editor which does three things: code highlighting, tabs for multiple documents, and customizable shortcuts the way I want.

I've tried them all; to name a few: gedit, Screem, Quanta, Bluefish, Eclipse, Kate, Anjuta... and you know what? Except for Kate, which I use now, they were all unusably slow when editing a file with about 1000-2000 lines of code -- after pressing Enter I had to wait something like 10-15 seconds for the editor to become usable again, and when you code you press Enter quite often.

To spice things up, I can say that running EditPlus under Wine was faster than using the apps I mentioned above, and that is very pathetic...

Why do applications written natively for Linux run slower than a program running in an emulator?
Bad code? Bad design? I don't know, but this indeed is alarming...

Not happen to me...
by Noer on Thu 10th Jun 2004 06:49 UTC

I use Slack 9.1 on my Celeron 434MHz with 256MB RAM. It was installed with Slack 9.1's default GNOME 2.4 and X(still)Free 4.3. It used to use around 150MB of RAM right after the whole system loaded up with X and GNOME.

Surprisingly, after upgrading to Dropline GNOME 2.6 (with X.Org bundled), the RAM used decreased to about 90MB. I noticed that both the previous XFree 4.3 and X.Org take 11MB of RAM. Everything feels fast, faster than previous versions. Bloat? Not happening to me...

@Andrew, @Mark
by Foo Bar on Thu 10th Jun 2004 06:53 UTC

>Linux is a kernel. It runs on machines with 2MB of RAM quite well, depending on kernel version.
>
>GNU/Linux is an OS. It runs on anything from an embedded system with 2MB of RAM to an IBM 390 with GBs of core.
>
>This post is a troll...

Uh, no, Andrew/Mark. You're barking up the wrong tree. If you exclude the Linux kernel from what's "fat", then you also have to exclude the Windows XP kernel. It's pretty small, too.

The things that make an OS fat are the things that users interact with most commonly: shells, apps, etc. Not the kernel. But regardless of how you want to characterize "Linux", it is judged by what's included by default when you set up a distribution. People don't install Linux and say, "Wow, I really like the speed of the kernel -- but KDE really blows chunks perf-wise." They blame the entire stack because (a) most don't know what a kernel is, and (b) even if they did, they don't have the visibility into the kernel to differentiate between bloat there and in the apps.

the reason i don't care....
by some guy on Thu 10th Jun 2004 06:56 UTC

Computers are getting pretty fat too!

RE: A 256MB RAM dimm is the smallest I can get today
by Jon on Thu 10th Jun 2004 06:58 UTC

>Linux is a kernel. It runs on machines with 2MB
>of RAM quite well, depending on kernel version.

The title of the article is about the "Linux platform", meaning the desktop and surrounding apps, NOT just the kernel.

I wish Linux supporters would stop using the same argument over and over when someone says something negative about the *platform* and happens to use the word "Linux" simply because it is generic enough and convenient. We all know what the author meant, so there was no reason for the trivia.

by Foo Bar on Thu 10th Jun 2004 06:59 UTC

>Why do applications written natively for Linux run slower than a program running in an emulator?
>Bad code? Bad design? I don't know, but this indeed is alarming...

There's an old rule-of-thumb that, given a set of resources (CPU, GPU, memory, FPU, I/O devices, etc), applications will grow to consume all possible resources. This is so incredibly true. It is our nature (as human beings) to never be satisfied with what we have -- and add more features. Over time, Linux is going to continue to bloat and leave old hardware behind. This isn't a bad thing, in itself. There's a price to be paid for progress. But none of us should have the unrealistic expectation of being able to load Linux upgrade-after-upgrade on the same hardware year-after-year and expect that the perf will be the same or better. Just doesn't happen. Software developers get used to setting new hardware baselines, just as politicians get used to setting new tax baselines. It's inherent.

I don't believe you...
by J.F. on Thu 10th Jun 2004 07:00 UTC

>I am running XP with 256MB of RAM daily. That's my PRIMARY machine, a dual Celeron 533 and 256MB RAM running XP Pro.
>
>I run IE, OE, Winamp, Notepad and Trillian at the same time with no problems at all. Things only get a bit tighter when I need to use an IDE or Paint Shop Pro, but overall, 98% of the time I only use the five apps mentioned above, and 256MB is more than enough to run those. At least for my needs, it runs great at 256MB.

I don't believe that for a moment. I run XP Pro on an Opteron with 512MB. If I'm running more than one program, it can take as much as a minute just to flip between programs. From my experience, XP needs at least 1GB of RAM to run comfortably with multiple programs.

I'm not talking monster programs either. I'm talking about Firefox, Total Commander, and maybe something like Azureus. The disk thrashing on 512MB is HORRENDOUS in XP Pro. By comparison, FC2 on the same machine is many times faster and more responsive.

The article is just FUD, and so are some of the responses. Let's hear a little truth for a change instead of blind astroturfing.

Some perspective here ...
by JH on Thu 10th Jun 2004 07:03 UTC

I agree in general with this article, but it's a bit overblown. Why not just say "no real desktop OS runs the latest apps without at least 256MB RAM" and be done with it? Some perspectives:

- I recently watched the latest Knoppix fail to even start KDE on a new 2.4GHz Dell Dimension with 128MB RAM (since with Knoppix there's no swap). That's annoying.

- XP or OS X *will* run on 128MB RAM... but they churn horribly because they're dependent on swap/virtual memory in that case too. For our clients (nearly all Windows shops) we insist on 512MB minimum for all new XP desktops. We've been doing this for at least the last year. The increased productivity is more than worth the measly $50 in RAM!

- I run the latest SuSE 9.1 quite comfortably on a Pentium II 400MHz with 384MB RAM. The same machine ran Win2K quite snappily as well. But take away the RAM and it probably wouldn't boot KDE either.

- And we run dev servers on the latest Mandrake and Trustix distributions... running LAMP + Samba + Postfix + a few others only requires about 90-100MB of RAM, leaving plenty of room on a 128MB machine for multiple httpd processes and a PHP bytecode shared-memory cache like mmcache or PHP Accelerator. These are Pentium-class machines and they respond nicely... let's see Windows Server 2003 even try to boot on one of those!
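A rough way to sanity-check a figure like that 90-100MB is to sum the resident set sizes of each service's processes. A sketch (RSS double-counts pages shared between processes, so this tends to overestimate; `httpd` here just stands in for whichever daemon you're measuring):

```shell
# Sum resident memory (RSS, reported in KB) of all httpd processes.
ps -C httpd -o rss= | awk '{sum += $1} END {printf "%.0f MB\n", sum / 1024}'
```

Repeat per daemon (smbd, the Postfix master, and so on) to build up the total.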

Are Gnome and KDE bloated? Well, haven't they always been? And consider that most of the big distros like Suse are basically running them both all the time, because most people want both KDE and GTK apps ... and they're separate from the window server, which is separate from the kernel and so on. All of which are cross-platform code. Compare that to XP or a hobby OS like BeOS or SkyOS which vertically integrates all those components and is written for a specific chip architecture (x86 only) and of course a Linux desktop is going to be more bloated.

But if I can run the latest desktop distro on a 6-year-old Pentium II with 384MB RAM, who cares?

RE: I don't believe you...
by Jon on Thu 10th Jun 2004 07:04 UTC

> I run XP Pro on an Opteron with 512M. If I'm running more
>than one program, it can take as much as a minute just to
>flip windows between programs. From my experience, XP needs
>at least 1G of RAM to run comfortably with multiple programs.

I don't believe a word of what you say, not one bit. I use XP on a 256MB machine, as do many others, on an Athlon XP 1.3GHz, and it runs great. Either your installation is hosed, or you're blatantly lying.

by Nice on Thu 10th Jun 2004 07:06 UTC

Recently, having to give up my big boxes, I was forced to recover my old K6 200MHz box with 60MB RAM (and 4MB integrated graphics, yuck!) from the garden shed. Now, I only have a W2K license, so I installed that. Then, as I was beginning to install Mandrake, the CD drive died. The Mandrake install CD must still be in there... Maybe it was a hardware detection probe, maybe it just died through neglect.

So, W2K with 60MB RAM and 200MHz: dog slow! I think I have to consider myself lucky that I didn't install Mandrake before I lost my CD drive!

Re: Indeed
by Anonymous on Thu 10th Jun 2004 07:09 UTC

Give VIM a try, it's likely to suit your needs.

Re: Bottom line is...
by Vanders on Thu 10th Jun 2004 07:09 UTC

>If one expects the system to have all the bells and whistles, a beautiful interface with lots of themes and decorations, everything plug and play, support for all types of multimedia formats, etc., one should expect higher requirements.

I'd have to disagree. The author mentioned Syllable, among others, so I'll pick up on it now. Syllable can boot from power-on to login window in around 16 seconds, even on machines as slow as, e.g., an AMD K6 233. It is usable in 64MB (which is a lot, but we're hoping to actually bring that number down in future). A typical Syllable system is running the appserver, the media server, the Registrar and the Dock. With a setup like this, you can play media from WAVs to XviD MPEG-4 video.

Syllable has low overhead because we've tried to make it that way. We're mindful of increasing memory usage or anything that might slow the computer down. We don't introduce large dependency trees which require tens of additional libraries or applications to be loaded to support another application (which is quite possibly Linux's biggest problem). I fail to see why modern Linux distributions can't do the same things.

RE: golly
by Daniel de Kok on Thu 10th Jun 2004 07:10 UTC

Let me guess: it is an LG CD-ROM drive? The Mandrake hardware probe in 9.2 kills LG drives due to an LG firmware bug. This can be fixed; just search with Google. AFAIK LG released a firmware update that solves this...

RE: Recommendations?
by Devon on Thu 10th Jun 2004 07:17 UTC

There are a number of small, fast distros that boot right off a CD with great hardware detection and install easily, some of them surprisingly small and light! Search around on Google and you'll find them. I should warn you, though: you likely won't find a perfect fit, and may have to roll your own based on an existing one.

The article has a point
by NTWS01 on Thu 10th Jun 2004 07:17 UTC

I know I'm going to get martyred for this, but the article is right; sure, it's not Linux itself but the included apps that are the problem. As much as I like Linux, I will not deny that ever since I started using it I kept wondering where I was supposed to find all that extra speed everyone was talking about.

In my case, I've found KDE 3 starts apps faster than Windows XP Home, but in KDE I use Konqueror and KMail, and in Windows I use the Mozilla suite, which is heavier (more features) than Konqueror and KMail, so an exception can be made for the extra few seconds it takes to load.

I can't say anything about Microsoft Office, because it's been a long time since I've used it, but Corel WordPerfect Office in Windows has a lot more features than OO.o and starts up faster (albeit OO.o is also available on Windows, Linux and Mac OS).

I know how miserable it is to try to install any Linux distribution (except for the antiquated Debian Woody) on old hardware, never mind run it, because the minimum requirements have been increasing so fast. Using an old distribution isn't always an option, because old distributions don't meet the software requirements for running new apps any more, so the only two options now are either to use source-based distributions or to buy a new computer every two years.

IMO Linux will survive for a long time to come because of the $0 price tag on most distributions, and the free developer tools, and KDE of course, but something does need to be done to resolve the minimum-requirements issue, or the next free OS with free developer tools that comes around is going to outperform Linux and take all its users.

I get the impression that a lot of the people who commented either didn't read the entire article or didn't read it at all. There is mention of source-based Linux distributions being a possible solution, but as the article said, how is a newbie supposed to manage installing a distribution like Gentoo? (Yes, newbies do end up having to do their own installs; they don't all have a seasoned Linux veteran to turn to.)

It's getting to the point now where it would be more worth people's time to buy a used copy of Windows 95/98 off eBay and use that with free tools like ZoneAlarm, Grisoft AVG and Spybot Search & Destroy, rather than use one of the latest Linux distributions, even if a lot of them are free.

>I'm not talking monster programs either. I'm talking about Firefox, Total Commander, and maybe something like Azureus. The disk thrashing on 512MB is HORRENDOUS in XP Pro. By comparison, FC2 on the same machine is many times faster and more responsive.

My XP box is just fine with the apps you mentioned, except Azureus. Azureus is my favourite BT app, but it bogs down on either platform due to Java.

But this article hits what I've found right on the head.

My box is 1.6GHz but has only 128MB of RAM, and most things run like shit, to put it bluntly. I'm running Debian unstable and using Xfce4. Xfce4 is light, but things seem to pile up quickly. I switched to Xfce4 from IceWM, as I wanted to try to have a uniform desktop environment using mainly GTK2 apps.

Before, I used to just go for speed exclusively, but I ended up with a mishmash of apps with all these different toolkits, looking ugly as sin, and with interoperability issues with copy/paste, etc. So I went GTK2, and things aren't much better. With Mozilla and a few other apps open, it's not responsive at all. And as someone else pointed out, you can't seem to get it all with one toolkit, so I use K3b instead of the GTK2 offerings, and that brings with it a lot of KDE's bloat.

My girlfriend tells me every time she uses Linux that "LINUX IS SLOOOOWWWW". I respond that I'll tweak it, but I can never seem to get good performance from this box. I compile my own kernel with just the bare minimum I need for this hardware platform. I'm on the latest 2.4 series; I've tried the 2.6 series several times and always run into swap issues with it. I don't think 2.6 handles minimal RAM well at all. I've exchanged several emails with Andrew Morton on the swap issue, but nothing is resolved so far.

I would be happy if most developers went into a feature freeze for six months and just optimized the shit out of their apps. Maybe not the most exciting thing for a programmer, and it might make Linux look a little dated on some fronts, but I think it would be worth the effort. Besides, the next Windows has been delayed for a while yet, so there is a good window of opportunity.

Think about it: three big selling points for Linux (ignoring open source, of course) were speed, stability and security.

XP gives people the speed, and now the stability, that previous versions of Windows didn't have, and Microsoft is working heavily on security for the next version. We can laugh off Windows and security, but they don't stop on something until they have it. They might be slow as hell getting there, but they will approach a much higher level of security than they have today, all the while still providing their EASE OF USE, which is sorely lacking in some areas of Linux. And don't give me the crap that it's just what people are initially used to. Because with millions of people out there, the desktop is what counts, and if they have to go to an xterm once, you've failed.

Look, I'm not a Windows fanboy. Far from it. I'd actually like to see them wither away, but I gotta call it like I see it.

Re: I don't believe you...
by NanoBaka on Thu 10th Jun 2004 07:22 UTC

I don't know exactly what your definition of "comfortable" is. I have a PIII 700MHz laptop with 128MB RAM, and XP Pro runs at acceptable speed. This laptop also has Slackware 9.1 installed (with a 2.6.6 kernel), and it's quite a bit slower. It's still usable, just noticeably slower. With a couple of Firefox windows open along with a konsole (or gnome-terminal), I see lots of disk swapping.

KDE does deserve some credit, because the upgrade to 3.2 makes things a lot faster, although it still eats up a lot of RAM. But Firefox is getting annoying. The Windows version is acceptable, but it's pretty slow on Linux.

Maybe you should try running XP and Linux on a low-end machine before saying people are spreading FUD just because they have a different experience.

(Btw, if your Opteron with 512MB RAM takes a minute just to flip between windows, maybe you should check whether there's something wrong with your hardware or your XP installation. Even XP Pro on my laptop can do better than that.)

by Smurf on Thu 10th Jun 2004 07:22 UTC

I use Mepis on a PIII 700, 384MB RAM, 32MB Viper AGP card, dual-booting Mepis and Win2000. 256MB memory is the MINIMUM in today's world; if you don't want to add a little memory, stay with what you have. On this machine Mepis runs at 99% of the speed of 2000 in loading programs, stability, etc. When I had a Permedia2 8MB AGP video card, 2000 ran great and Mepis ran pretty slowly. ALL the distros I have tried in the last three years have run much better with a better video card. Built-in graphics suck, PCI was much better, but the AGP slot sped everything up in both 2000 and Mepis. As for bloat: 2000 + WordPerfect + MediaPlayer 9 + dBpoweramp + ZoneAlarm + AVG AntiVirus + Ad-Aware = 3.74GB on my HDD. Mepis, which includes everything I need: 1.93GB. To me Win2000 is bloated, and a security mess to boot. I'll take my bloated Mepis any day. If people want to switch, they will just have to learn, just like they did when they started using Windows.

by Twiztid015 on Thu 10th Jun 2004 07:24 UTC

Actually, I have to disagree with this article. I guess Bob Marr has never tried VectorLinux. It is VERY lightweight; it doesn't require 128MB+ of memory. People have used Vector on machines with less than 128MB and run GNOME/KDE on it just fine. It may seem weird that it's based on Slackware and still EASY to use. Check it out sometime. http:// It even boots far faster than any distro I have tried, and I have tried more than my share.

@J.F. (XP performance)
by Gawron on Thu 10th Jun 2004 07:26 UTC

Sorry, saying that XP Pro needs at least 1GB is simply not true. I was forced to use it on an old notebook (ThinkPad 600) with 128MB and a 233MHz PII processor. It was not super fast, but definitely usable, with applications such as MS Office, Outlook etc. On my current machine (a PIII 900 notebook with 256MB), XP performance is better (the difference is not astounding, but noticeable) than Fedora Core 2 running KDE, for the same applications, for example Firefox; GNOME is much slower. And XP Pro is way faster at booting: 3-4 times faster, in fact.

I must agree with the article's author, but I also think that speed is not the only thing becoming a problem: the general quality of applications is getting worse (perhaps because apps are getting more complex; gone are the days of simple, text-mode-only apps not depending on complex libraries). The OS kernel is probably still more stable than, say, the XP kernel, but I would say that the entire GNU/Linux OS as perceived by a user (including the desktop environment, applications etc.) is much less stable than Windows XP, speaking of "standard" distributions such as Fedora, Mandrake, SuSE etc. And this is *very* frightening...

Pointless ranting
by jbmadsen on Thu 10th Jun 2004 07:27 UTC

Blah blah blah, lots of anecdotal evidence which doesn't amount to anything.

Programs do more than they did X years ago. They require resources to do so. So if you want to run some program today at the same speed the same program ran X years ago, you need more resources (this obviously only holds for mature programs).

No, you can't run the latest and greatest with all the fancy stuff on your ten year old Pentium 90MHz with 24 MB RAM. The latest and greatest will always require more resources than what some people have.

Try interpolating between the requirements for Windows XP and Longhorn and then plot those requirements listed for FC2 on the same graph. I don't think it looks even unreasonable.
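The interpolation suggested above can be sketched in a few lines. A minimal sketch, assuming XP's 128MB recommended figure at its 2001 release (Microsoft's published number) and a 512MB Longhorn figure around 2006, which was only rumor at the time, so the output is illustrative rather than authoritative:

```python
# Back-of-envelope version of the poster's interpolation idea.
# Assumed anchor points: XP (2001, 128MB recommended) and a rumored
# Longhorn target (~2006, 512MB). Treat the result as illustrative.

def interpolate_ram(year, y0=2001, ram0=128, y1=2006, ram1=512):
    """Linearly interpolate a 'recommended RAM' trend line, in MB."""
    return ram0 + (ram1 - ram0) * (year - y0) / (y1 - y0)

fc2_recommended = 256  # MB, Red Hat's figure for Fedora Core 2

trend_2004 = interpolate_ram(2004)
print(f"Windows trend line at 2004: {trend_2004:.0f}MB")  # ~358MB
print(f"FC2 recommended: {fc2_recommended}MB")
```

FC2's 256MB lands well under the interpolated Windows figure, which is the commenter's point: the requirements are not out of line with the industry trend.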

This deserves....
by John Blink on Thu 10th Jun 2004 07:27 UTC

...to be slashdotted and OSNewsed (200 posts by tomorrow morning), and Google Newsed ;)

This is exactly the problem with the Linux desktop: developers mean well, but I don't feel they have considered how many CPU cycles their programs take. Maybe it is also a synchronization problem in their programs.

by Julia Partens on Thu 10th Jun 2004 07:27 UTC

Bob Marr wrote: "Why should a 1 GHz box with Fedora be so much slower than a 7 MHz Amiga? Sure, the PC does more - a lot more - but not over 1000 times more (taking into account RAM and HD power too). It doesn't make you 1000 times more productive."

Couldn't agree more. Plus, you could buy three books for the Amiga platform and know everything about it. The free software movement needs to organize; this is getting nowhere.

Anyone with solutions for advanced users at least?
by Anonymous on Thu 10th Jun 2004 07:27 UTC

Any tips people have for optimizing Linux (and I mean the platform and all that entails, not just the kernel)?
Don't bother saying "go the Gentoo way": I was excited about that, but I've read many comparisons in which a Gentoo install wasn't any faster than many other popular distros.

So any tweaks anyone knows of post em.

RE: I don't believe you...
by J.F. on Thu 10th Jun 2004 07:28 UTC

I don't believe all what you say, not one bit. I use XP on a 256 MB machine as well as many others on an Athlon-XP 1.3 GHz and it runs great. Either your installation is hosed, or you blatantly lie.

Typical response of the astroturfer: "You must be doing something wrong." It's not Windows, it's the user. Your other choice is just downright insulting. If you can't defend the product, attack the consumer. The system is as described, it is properly installed, and I don't blatantly lie.

Anticipating another attack: the drive is a 7200RPM 120GB ATA133 drive with an 8MB buffer. It's been recently defragmented. I didn't say it ALWAYS takes a minute to flip windows, but that it CAN take that long. I've found that rebooting the computer every other day clears that up pretty well. The longer the computer runs without rebooting, the worse the thrashing gets, until it takes more than a minute just to pull up menus. XP has bad memory fragmentation issues that get worse as the system is used, particularly if you use multiple programs.

It is an unarguable fact that hardware continues to improve at a torrid pace, and that the minimum hardware shipped on systems, RAM for example, continues to increase. To argue that this should create acceptance for inefficiency in software, that people should just expect the requirements for running software to increase drastically, is completely illogical. Good programmers like Steve Gibson, Robert Szeleney and the people at .theprodukkt, among others, prove that good programming practices result not just in excellent functionality and pleasing appearance, but do so without prohibitive performance hits.

It is also true that today's computer users expect more from their computing experience than when the P200 was the standard. More features and abilities added to a program, done well, and given the number-crunching abilities of today's computers, should justifiably increase the footprint of software, but only barely compared to what we're seeing. This should ESPECIALLY be true of an OS, which is meant to be the middleman between the user and the hardware. It is pure marketing hype, and an attempt to keep technology sales up, to suggest otherwise.

If everyone applied the same standard to software as they do to hardware, there would be far more accountability for poorly written software and security failures. If a machine, or a part of a machine, breaks down, people take it back under warranty; they not only complain but expect something to be done about it. If software fails or causes serious problems, unless it affects the hardware, there is much complaining but less action demanded, because we are indoctrinated to expect problems, bloat and inefficiency.

This article addresses a gradual trend that IS a problem, and not just for Linux, but for software in general.

Underlining the author's points
by Bonkeroo Buzzeye on Thu 10th Jun 2004 07:36 UTC

My main machine is a 1.1GHz Athlon with 512 MB RAM running IceWM on Slackware 9.1. I like it this way.

I also have a 1.2GHz Celeron with 256 MB RAM running Windows XP.

I find most of the 'I have (tiny box) and it runs GREAT' and 'I have (monster box) and it SUCKS' to be a little hard to believe. I have a mediocre box and XP runs in a mediocre way.

Before the drive died, I had a second install of Slack on the Celeron and it easily outperformed XP.

But almost everybody seems to be missing the point: the author specifically states that Joe User probably *isn't* going to want Slack and Ice. He wants a GUI distro and GNOME and/or KDE. In other words, MS makes one system; ipso facto, it's their best system (allowing for differences between 'home' and 'pro' and 'server edition' and blah blah that Joe User doesn't care about). So Joe User also wants the quote-unquote best Linux system, which he takes to mean the latest and greatest most user-friendly distro with the big desktop environments.

No kidding, storage and core are cheap. To many citizens of industrialized nations, that is. But if you're a dude in a third-world country who can't afford to *feed himself*, upgrading hardware is *not* cheap.

The author's point was that Linux is blowing an opportunity to put first class systems on second class boxes in third world countries (or on poor Americans' boxes or whatever).

If the reaction is defensiveness and excuse-making, Linux is truly screwed. If 'bloat' isn't a problem, why are so many Linux users so dismissive of Mozilla and hyped about Firefox? (I use Mozilla, thank you; you have to pick your battles, and Mozilla is just too cool to mess around with any LightningPanda.) We know bloat is a problem, but when somebody else points it out, and for far better reasons than 'my FPS in CS sucks', he gets insulted for it? Weird.

Even if there were no other reason than pride in clean, efficient code, that should be enough to want to keep things as slim as possible.

Hi there!

I have a Celery 900MHz Desknote with 256MB RAM and a 10GB 5400RPM IDE HD, and a dual PIII 450 with 384MB, a 36GB SCSI LVD drive and a Matrox G400.

Galeon, Evolution and gnome-terminal (GNOME 2.6, Debian Sid) load on login on separate virtual desktops, and both machines are quite perky. All the apps are GNOME apps and use GNOME shared libraries, thus reducing RAM use. OpenOffice takes a while to load, but once there it's pretty fast. Totem plays back DVDs flawlessly on the dual PIII, without even a skip, and is so easy to use! Couldn't do that under Windows on that hardware!

Nautilus in 2.6 is FAST, and I like the new browsing modes. Much like Mac OS 9, which I have played with and like.

Just my 2c. Mandrake is definitely slower, due to a whole lot of plug-and-play smarts, it seems.

I disagree
by abdulla on Thu 10th Jun 2004 07:41 UTC

RAM, as pointed out, is cheap. I run Fedora Core 2 on the slowest computer in the house, a K6-2 350, BUT it does have 448MB of RAM. I noticed quite a performance boost between Core 1 and 2, and that's due to good programming, i.e. GNOME 2 is now very fast and KDE 3 remains as fast as ever; both are only getting better. Comparatively, I used to run Windows 2000 on this machine and it was sluggish. I finally get the speed I used to get from Windows 98, in Linux. All you need is more RAM; you can keep your old computer.

Gnome is going to get slower?
by Anonymous on Thu 10th Jun 2004 07:46 UTC

Well, with all this discussion about rewriting GNOME in Java or C# to make it a more developer-friendly platform, it's definitely going to take a performance hit.

hit it right on the head
by andre on Thu 10th Jun 2004 07:48 UTC

Linux is getting bloated, just like everybody else.

I remember clearly the days of Red Hat 4.2, the very first Linux I tried. It ran comfortably in 32MB, serving about 70 simultaneous FTP users (MP3 downloaders!) over an E1 (2.048Mbps) line. I remember upgrading to 64MB, and things were still fast even with 200 simultaneous FTP users on board. And it was a cacheless Pentium 133MHz. To think that Slackware users then were telling me that RH4.2 was "fat". ;)

I've just noticed that with around 128MB of RAM or less, Windows (Win2K; haven't tried XP with this little RAM) performs better with its GUI and Office than Linux does. Of course, once the RAM reaches 512MB or so, Linux performs better than Windows, even on machines with slower CPUs.

A Plea for Lean Software
by Marko on Thu 10th Jun 2004 07:49 UTC

Nothing new. In 1995 Niklaus Wirth wrote this article, which explains a lot:


one thing .....
by raver31 on Thu 10th Jun 2004 07:49 UTC

Everyone here is moaning about not being able to run GNOME or KDE on really old, memory-constrained systems...

Installing Linux on an AMD 233 with 64MB RAM, but KDE runs too slowly?

Emm, did the PC not have Win9x installed before?

Why do you not want to use IceWM? It is basically a Linux version of the Win9x interface, and it will run faster than the Win9x interface. Oh, and OO will run on it too.

No one with even a bit of sense would try to install Win2000 or XP on that machine, so why would anyone try a full DE?

by rzakaria on Thu 10th Jun 2004 07:50 UTC

I totally agree with the writer. The huge potential of spreading Linux to the Win98/WinNT machines out there is real; we need to grab it.
There is no excuse for creating fancy apps that consume a lot of RAM.
Developers, please make efficiency as important a goal for your apps as functionality.

I also disagree
by Brad Griffith on Thu 10th Jun 2004 07:52 UTC

Up until about a year ago, I had a friend set up on a PII 300MHz with 192MB of RAM. Unfortunately the hard drive died, but while the machine was still alive it ran SuSE 8.2, and subsequently Fedora Core 1, very well. This was with KDE in SuSE 8.2 and GNOME in Fedora Core 1. It was left on for a whole semester, basically. My friend wrote papers, chatted, browsed the web, even did some basic GIMPing, very comfortably. I don't think that 300MHz with 192MB of RAM is outrageous for a distribution made in 2003.

On my computer right now (which is a very nice computer: AMD 2600+, 512MB of RAM, 7200RPM hard drive), I have FC2 with GNOME 2.6. I have eight virtual desktops filled to the gills with applications, including the GIMP, OpenOffice, Inkscape (several windows with 1.5MB SVGs in them), Scribus, Epiphany, Gaim, gedit, Evolution, Muine, shiny Crystal icons, Straw, and about a dozen Nautilus windows. I can flip through the virtual desktops as quickly as I want and not feel a bit of slowdown. In XP, however, clicking the Start menu typically results in a 3-4 second wait, subsequently hovering over "All Programs" causes another long wait, and opening more than 4-5 programs brings the system to its knees and an inevitable crash. Linux makes me far more productive.

Getting better
by BenRoe on Thu 10th Jun 2004 07:53 UTC

I think the memory footprint and CPU requirements of a lot of FOSS are a problem at the moment, but I think it's starting to get better. Optimisation is hard, takes time, and shouldn't be done during the main part of development; it's something you do afterwards.
For example, KDE now has a kde-optimize mailing list dedicated to speeding up KDE. A large amount of work is going into profiling and optimising the environment. That's why 3.2 is so much faster than 3.1. With the next release of Qt, things will get better still.

Linux not an OS
by Anonymous on Thu 10th Jun 2004 07:56 UTC
RE: Better hardware should never justify inefficient software
by jbmadsen on Thu 10th Jun 2004 07:57 UTC

Decius raises a few interesting points.

First of all, I agree completely about accountability and the poor quality of software. I never understood why a software company can't be held responsible if its product causes damage.

Yes, programs were written to be more efficient previously. There is a reason for this: computer time was more expensive than human time. This meant that it made sense to spend lots of manhours making something run faster or use less memory.

But that is no longer the case. Computers are dirt cheap compared to human resources. Today it makes good business sense to increase the productivity of the programmer by letting him write in higher level (and slower) languages at the expense of requiring more computer time.

I for one am not going to sit around and hand-optimize assembly code to make it run faster. A clever programmer can always do this, but it takes a lot of time, it will likely introduce new bugs, and it hurts portability.

Programs can be written much faster today than they could only a few years back. They may also require more resources and that is exactly the tradeoff we're seeing.

Re: J.F. (IP:
by andre on Thu 10th Jun 2004 07:58 UTC

Really? Your 512MB Opteron cannot run Firefox on WXP Pro fast enough?

Either you've got the wrong or unoptimized drivers, or maybe your system is loaded down with spyware and viruses ;)

I tried using a Pentium III 500 (the one with 512K L2 cache that runs at half the CPU speed) with 256MB and WXP Pro, and it was fast and very, very usable.

RE: Gnome is going to get slower?
by jbmadsen on Thu 10th Jun 2004 07:59 UTC

There is no discussion about rewriting GNOME in C# or Java. There is discussion about allowing core components to be written in C# or Java. Big difference.

by Maciek on Thu 10th Jun 2004 08:00 UTC

M$ is evil and engages in bad business practices, but they surely know how to make a polished user experience. I can't help it, but XP feels somehow faster than Linux + KDE. It's more responsive, and it's easier on XP to turn off the goddamn eye candy.

And I've noticed that Linux software developers tend not to give a shit about backward compatibility, something that has always seemed to be a primary focus for M$ developers.

Also, concerning GNOME
by Brad Griffith on Thu 10th Jun 2004 08:01 UTC

There are known places to cut out bloat. The way stock icons are handled is inefficient, causing apps that use libgnomeui to load the icons into memory several times. Metacity has some "low-hanging fruit" optimizations that Havoc Pennington (who is a very talented coder; that was a lame troll thrown in by the author) has recently published for those interested in optimizing the WM. The reason GNOME Terminal is slow on some machines is Pango, the text renderer, which does not receive very good acceleration from X at the moment. The biggest GUI speedups are going to come from the new X technologies, most of which are already in CVS. For true legacy machines, however, the author paints too bleak a picture. There are usable options that use far less RAM, the best being XFCE. OpenOffice, in my opinion, is the biggest bloat problem at the moment. Abiword and Gnumeric are great alternatives for most tasks, but there is a need for a lightweight presentation app. Perhaps OpenOffice 2.0 will help to solve this problem.

Oh, and about booting times
by Brad Griffith on Thu 10th Jun 2004 08:09 UTC

XP, in my experience, only appears to boot faster. I have a very reasonable boot time with FC2, and more importantly, when the boot process looks done, it actually is. Once I see those panels in GNOME, I know it's ready to use. In XP, sure, I see the desktop pretty damned quickly, but then the system tray is (often invisibly) loading who knows what for another minute or so before I can actually use my machine. While this way of doing things may seem better at first, it often confuses users and compels them to open an app too early, making the boot process even longer as the system struggles with all the demands placed upon it.

Mine :)
by Dawnrider on Thu 10th Jun 2004 08:10 UTC

Well, I'm running two boxes at home at the moment;

Main: XP, Athlon 2.8, 1GB DDR 400, 200GB HDD (ATA133 2x 7200rpm disks), Geforce 4 4800.

Secondary: Mandrake 10, Duron 1.2, 256MB ram, Geforce 3, 1 x 80GB 7200rpm disk.

Mandrake 10 is dramatically faster than the 9.2 I was running on there before, and in general it is faster than WinXP for normal applications. Internet Explorer is an exception, because it is pre-loaded, of course, but pre-loaded Konqy is not too far off. One thing I do find is that Mandrake is faster to respond to activity than XP in many situations, because XP is always doing stuff in the background, even while idle. This can lead to pauses of several seconds between a click and its registration. Sometimes the thing just gets plain busy and takes a while to redraw or sort itself out after a large memory-grabbing application like Photoshop closes.

Honestly, it is faster for me, but these things do vary between systems. My Athlon 2100+ laptop feels a lot slower under XP for some reason (still 512MB DDR and dedicated graphics), which I assume is the hard drive spindle speed coming into play.

Oh, and it isn't fair to suggest that KDE is becoming slower or more bloated with each release; it is actually becoming faster and more lightweight in memory terms, thanks to a lot of optimisation work being done, ultimately thanks to the joys of valgrind ;) KDE 3.2 will substantially outperform 2.2 on the same system, which is a good thing ;)

it's about choices
by dillee1 on Thu 10th Jun 2004 08:10 UTC

Both WinXP and Linux + GNOME/Mozilla are really bloated for low-end machines.

With Linux you can choose a lighter window manager. If you go deeper, you can turn off unnecessary daemons and trim down your kernel as well. In the extreme case you can just f@ck the GUI altogether.

With WinXP you are bloated all the time, and there is not much you can do about it except turn off some services.

I use Fedora 2 on a P166 + 128MB RAM, and I am pretty happy with it. WinXP on such a box is pretty much useless, even too slow to play Solitaire.

LoL here it goes
by Anonymous on Thu 10th Jun 2004 08:13 UTC

1. A couple of years ago we heard all this about GNU/Linux being secure...

Now we all know that ain't true; that's what the BSDs are for (and in particular OpenBSD).

2. GNU/Linux is user-friendly.

This one is actually heard every now and then; however, we all know that it simply isn't AS user-friendly as XP, BeOS, SkyOS etc...

3. GNU/Linux is so efficient.

Well, it's getting fat; speed is the province of BeOS and its like, slim desktop systems...

4. Linux is Free

LOL, yah right!

Re: I hate to bring this up again.
by Brian on Thu 10th Jun 2004 08:14 UTC

I hate to bring this back up, but it has to be done. I can't remember where I read this, perhaps in Tanenbaum's operating systems book, or perhaps online, but what he preaches seems to be right on the money: "Perfection is reached not when there is no longer anything to add, but when there is no longer anything to take away." I think a lot of things in Linux suffer from this problem. KDE 3.2 has taken a step in the right direction and admitted that bloat had been a problem in the past. I think a lot of the problems with Linux today have occurred because of the addition of new features, and I question how useful they truly are. I remember using Linux five years ago on the same machine I have today, and I remember being able to do everything I can do today, without my machine lagging behind trying to keep up. As more and more code gets written, more and more code has to be audited, optimized and reviewed. The easiest way to speed things up isn't faster hardware, it's code review. The level of complexity Linux is reaching is on par with Windows systems, which is extremely hard to deal with.

RE: I don't believe you...
by John Blink on Thu 10th Jun 2004 08:14 UTC

Well, I don't believe you. You said:
"I'm not talking monster programs either. I'm talking about Firefox, Total Commander, and maybe something like Azureus. The disk thrashing on 512M is HORRENDOUS in XP Pro. By comparison, FC2 on the same machine is many times faster and more responsive."

I get the exact opposite behaviour on my AMD K6-2 450MHz with 384MB RAM.

Try the following registry tweak, although I should say mine didn't thrash even before the tweak. The tweak keeps more stuff in RAM, so there is less paging to the paging file.

[HKEY_LOCAL_MACHINE\SYSTEM\CurrentControlSet\Control\Session Manager\Memory Management]
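The post names the Memory Management key but the actual value to set under it got lost. A value commonly cited with this tweak is DisablePagingExecutive, which keeps kernel-mode code resident in RAM instead of paging it out; whether that is the value the poster meant is an assumption, so the fragment below is a hypothetical reconstruction, not the original advice:

```
Windows Registry Editor Version 5.00

; Hypothetical reconstruction -- the original post shows only the key.
; DisablePagingExecutive=1 keeps kernel and driver code in RAM rather
; than paging it to disk (costs some RAM; can reduce thrashing).
[HKEY_LOCAL_MACHINE\SYSTEM\CurrentControlSet\Control\Session Manager\Memory Management]
"DisablePagingExecutive"=dword:00000001
```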

by dukeinlondon on Thu 10th Jun 2004 08:18 UTC

I've noticed that XP has much higher swappiness than Linux. It starts using its page file immediately on my machine (512MB RAM), whereas it is a real struggle to get Linux with kernel 2.6 to actually start using its swap partition.

That probably partly explains why XP is faster on low-memory machines.
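For what it's worth, the 2.6 kernel exposes this swap bias as a tunable, so the behaviour described above can be adjusted. A minimal sketch (the value 10 is an arbitrary illustration, not a figure from this thread):

```
# /etc/sysctl.conf fragment, 2.6 kernels only.
# vm.swappiness runs 0-100; lower values make the kernel prefer
# shrinking the page cache over swapping out application pages.
vm.swappiness = 10

# Or change it at runtime:
#   sysctl -w vm.swappiness=10
#   echo 10 > /proc/sys/vm/swappiness
```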

I don't know if many of you are aware of the recent trend Red Hat has taken in compiling their distribution. The reason the memory requirements are up, yet reviews say FC2 is really snappy (which I agree with), is that the main and biggest software packages, such as OO and Moz, are compiled with prelinking support; this means the libraries are loaded when the machine starts up, so the requirements are quite steep, especially since many unneeded services are also fired up on a default install. I might recommend doing some homework before whining about RAM requirements, because as end users you just see the requirements go up and don't understand what is happening in the backend. At least that's what I understand the situation to be. Anyone feel like correcting me?

Re: I hate to bring this up again.
by Brad Griffith on Thu 10th Jun 2004 08:23 UTC

As far as interface design goes, I think GNOME has your minimalist attitude pegged ;) Seriously, though, we're always looking for optimizations. I can think of several proposals for major optimizations floating around the GTK community right now. The community is definitely looking to improve performance. One of your other statements, though, that you could do everything five years ago that you can now, is ridiculous. There is now near-100% MS Office compatibility, for all major applications. There is real desktop integration. There are high-quality raster and vector graphics. There are advanced music-playing applications like Muine and Rhythmbox. There are photo management apps like F-Spot and gThumb. The GNOME project was started only seven years ago and wasn't usable for a year or two after that. Things have progressed a lot. In fact, I've found the rate of progress stunning.

software performance not a focus
by Jeremy Ginsburg on Thu 10th Jun 2004 08:24 UTC

The author is right. The problem is that, for most developers, software performance is not a focus of their efforts; so long as that is the case the problems described in the article are only going to continue.

But they don't have to. When developers put their efforts into optimizing for performance real results are possible: for example, Apple's efforts have made Panther (10.3) noticeably faster than Jaguar (10.2) on the same hardware. This article talks about how Apple did it:

I'm afraid the open source model may have more trouble getting people focused on software performance than, say, Apple or another corporate developer. I'm not trying to knock open source here -- I'm all for it -- but it isn't clear to me that anyone (or any group) has the incentives to take responsibility for optimizing overall performance in many open source development projects. I think it's probably fair to say that about bigger projects like KDE or GNOME. Anyone disagree?

by Edward on Thu 10th Jun 2004 08:27 UTC

The biggest cause of bloat in Linux seems to stem simply from sloppy programming.

* Memory leaks. Because the kernel reclaims a process's memory when it exits, you can get away with leaks to a limited extent in desktop applications that are opened and closed frequently.

* Scripting languages. I'm getting sick of these. A single gdesklets applet takes up 36MB of RAM (taking X server memory into account) to display a simple weather status display. Rather than being 'glue' to hold applications together, these languages are being used for entire applications. Do it properly, or don't do it at all.

Nice article, however...
by Anonymous on Thu 10th Jun 2004 08:30 UTC

You could've told him to switch to AbiWord, Firefox and XFCE4 if he wanted speed.

... Or Gentoo. ;)

RE:software performance not a focus
by Brad Griffith on Thu 10th Jun 2004 08:30 UTC

I disagree. Firstly, the core developers aren't all volunteers anymore, so if the premise of your argument was that volunteers don't want to hack on something as boring as optimization, you're off. In fact, many developers are paid and full-time. I know of several instances where performance has been the primary focus of development. The 2.6 kernel saw preemption added, the CFQ scheduler, etc., all improvements to desktop performance. I remember recently seeing that Miguel de Icaza was pushing Larry Ewing to find out why the icon view in F-Spot was slower than in some other app. And there are the other recent optimization discussions I've referred to. Secondly, volunteers in many cases care more about optimization than employees do. A lot of open source hackers take great pride in writing efficient code. That will never change.

You're ALL missing the point.
by Anonymous Coward on Thu 10th Jun 2004 08:32 UTC

What the author is referring to is the ones bringing Linux to the world.
The world doesn't care if some hippie down the street can run OpenOffice with just 32MB RAM (exaggerating here). The world knows Linux as the enterprises portray it: IBM, Red Hat, Novell etc.

Preloading, not linking
by BenRoe on Thu 10th Jun 2004 08:33 UTC

PastyHermit: You mean preloading, not prelinking. Prelinking is the process of pre-resolving all the symbols in shared libraries, which can improve startup performance for programs using lots of shared libraries.

Preloading seems a pretty poor way to improve subjective loading time to me. Instead of waiting longer the first time a program loads, the user is forced to wait longer at boot, even if he never uses the app.
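A rough way to see what prelinking is working around, as a sketch: it assumes a glibc-based system with ldd available, and the prelink commands are shown as illustration only (they need the prelink package installed).

```shell
# Each shared library a binary links against means relocation and
# symbol-resolution work for the dynamic linker at every startup.
ldd /bin/sh

# Prelinking performs that resolution once, ahead of time (illustrative,
# not run here; requires the prelink package):
#   prelink -a              # prelink everything in the configured paths
#   prelink --verify /bin/sh
# Preloading, by contrast, only warms the page cache so the first
# launch feels fast, e.g.:
#   cat /usr/bin/some-big-app > /dev/null   # hypothetical path
```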

RE: Scripts
by jbmadsen on Thu 10th Jun 2004 08:36 UTC

At Edward:

Oh, so if people can't code applications that conform to your high standards, they shouldn't make applications at all?

I am sure lots of people enjoy gdesklets regardless of whether they're written in Python. I don't use them myself, as I don't look at my desktop background much. Please don't assume that just because you don't like something, nobody will.

It's not like you're losing something because someone codes something you don't want. If people code free software, you can only gain from it.

re:software performance not a focus
by dukeinlondon on Thu 10th Jun 2004 08:36 UTC

I don't think so. Whilst fixing problems with my 2.6.6 kernel, I had the opportunity to boot back and forth between 2.4 and 2.6, and kernel 2.6 is really faster! The next thing I will do is use Kolivas's patches to autoregulate swappiness and use the staircase scheduler.

KDE 3.2 also brought speed improvements, as did 1.0 and 1.1. Whilst taking part in cooker testing, I immediately noticed that Mandrake 10 was faster than 9.2.

It is true that the priority of the last few years for Linux apps was to do a features catch-up, but performance has definitely improved, and now that KDE and GNOME are quite feature-rich, I am sure these guys will continue optimising.

And lastly, it's always been my experience that Linux remains usable whatever the load (kernel compile, CD burning and so on), whereas Windows (XP included) privileges the heavier apps, making the system hard to use whilst ripping CDs, for example. I'd rather have applications take 6 seconds more to start but then be usable with heavy background activity, rather than the other way round.

The problem is...
by Anonymous on Thu 10th Jun 2004 08:46 UTC

All these full-time developers at Novell, Red Hat etc. have top-of-the-line machines and don't notice the slowdown.

RE: RE: Scripts @ By jbmadsen
by Edward on Thu 10th Jun 2004 08:54 UTC

It's not like you're losing something because someone codes something you don't want. If people code free software, you can only gain from it.

True. However, I'm very worried about the way Perl/Python is becoming a general programming language for applications that are meant to run all the time. I've always felt that scripting languages are for binding two separate programs together, or simple programming of uncommon tasks (say, gTweakui), rather than writing a text editor.


Interesting note: I just fiddled with the themes in my GNOME install. Switching from a Pixmap theme to an XFCE theme made a huge difference in the 'usability' speed of the system*, and dropped RAM usage from 289 MB by around 100 MB for a fairly small app load (Epiphany, System Monitor, Nautilus and beep-media-player). I wonder how much of Fedora's speed issues are caused by an over-abundance of eye-candy.

(* For reference, the system is a K7 XP 2000 with 768 MB DDR on GNOME 2.6/Debian)

by Seo Sanghyeon on Thu 10th Jun 2004 08:55 UTC

Brad Griffith made a good point. The reason that GNOME terminal is so slow at drawing characters is Pango, and Pango in turn uses the RENDER extension. Interesting quote:

A big bottleneck right now in GTK+ performance is the poor performance of the RENDER extension drawing anti-aliased text. Even without hardware acceleration, it could be tens of times faster than it is now. I'm hopeful that the X server work currently ongoing will result in that being fixed.

So they are aware of it. And it is not Pango's fault. Pango may look bloated if you don't care about internationalization, but it does a much better job of placing East Asian AA text than (say) Qt. And that is very important to me.

RE:The problem is...
by Brad Griffith on Thu 10th Jun 2004 09:02 UTC

And the miracle-working Apple optimizers mentioned just a little bit ago are using Lisas, right?
What the employees use doesn't affect a corporation's need (or lack thereof) for optimizations. And, to emphasize that volunteers are most certainly interested in optimization, I know of many independent GNOME developers who run very modest machines. Ross Burton (developer of Sound Juicer for GNOME) needed a hard drive donated to him so that he would have enough space to do a build of CVS GNOME, for instance.

The good news is
by Lumbergh on Thu 10th Jun 2004 09:03 UTC

Unlike Windows XP, you can run something like fluxbox and gtk+1.x apps and still have a snappy desktop on a P166 with 80 meg of ram.

Really, the only missing component for the minimal hardware desktop is for dillo to be just a little bit more complete.

As others have pointed out, it's not linux that is the problem, but the desktops and their associated monster libraries, along with X which is quite speedy but still takes a sizeable chunk of ram.

I could see some of these newer hobbyist OSs that have a stricter development path taking a chunk out of the low-end linux market, which would be pretty ironic.

meaningful article
by santhosh on Thu 10th Jun 2004 09:05 UTC

I totally agree with this article. Even I have a story about the speed of applications on Linux (the Linux desktop).

My story is like this:
I have a Celeron 433MHz with 32 MB RAM at my house. I run Mandrake Linux 9.2 and Windows 98 on it. I am happy using Linux on it, but not as happy as with Windows 98 when it comes to speed.

I was using Blackbox as my window manager before, but was quite happy when I saw XFce and started using it. Still, all the GNOME/KDE applications are very slow.

Mozilla and OpenOffice are unusable, whereas I use IE and MS Office very happily on Windows 98. On XFce I have only 2 desktops; it becomes very slow if I go beyond 2.

It's very painful using the GNOME terminal; I still stick to rxvt. I even tried the 2.6 kernel, but found that it performed very poorly compared to 2.4.22.

At my workplace I have a 1.4 GHz, 256 MB machine running Debian (Sarge). All applications work slightly better here, but I still have problems with application speed. When I log in to my machine in the morning, it takes about 2 minutes for all the windows to redraw. If I keep running Mozilla continuously for a week or two, the machine slows down. It takes about half a minute to see the splash screen of OpenOffice after starting it from the command prompt.

I hope the application developers give importance to the performance of the applications, rather than keep on adding features. I agree that Linux means the kernel, not the desktop environment. But a normal computer user does not know anything about the kernel or the desktop environment. He/she would just like to use it to get their work done quickly. I hope importance will be given to the speed and efficiency of the applications in the near future.

Linux ain't fat your PC is just anorexic
by Anonymous on Thu 10th Jun 2004 09:06 UTC

I think the author is trying to say what Linux distros need to strive for so they could take over some Windows installs in the business arena. I think he just fails to realize that it isn't possible. He says how Linux used to be so fast and everything; the reason for this was that it didn't have nearly the features it does now. Before, your only options were the very slim apps that he even mentions, but now there are better things out there. It's called progress, so get over it. If he wants blazing fast speeds like those of the old Linux days, then just stick to the command line and no GUI at all. Everything you need can be worked from a console, from IM to web browsing to document editing.

Also, there is no reason why a user can't upgrade his hardware anymore. Just think about it: a user could buy Windows XP Professional for around $250 and have it work decently on his older PC, or he could buy 1 gig of RAM for his computer for about $100 and then use Fedora Core 2 at a cost of $0. You will have $150 more in your pocket, plus 1 gig more of RAM in your PC and a far better OS on your hard drive. Also, if you were going to use XP you would have to buy a hardware firewall plus antivirus software just to have a usable system. And believe me, you do need a hardware firewall. A software-based solution is as weak as the OS it runs on; case closed.

A question about linkers
by Lumbergh on Thu 10th Jun 2004 09:07 UTC

I'm curious as to how smart the Linux linker is; that is, how smart it is when bringing in code from libraries. I remember an article some time back about how certain linkers were very aggressive in taking out just the exact code that was needed from a library for an application.

Re: Preloading, not linking
by John Blink on Thu 10th Jun 2004 09:08 UTC

No, he meant prelinking.

Beginning with Fedora Core 1, Red Hat introduced prelinking in their distro. This is the reason you may sometimes see CPU usage go high while your machine is idle.

Although it's said that C++ apps like KDE actually benefit from this, whereas GNOME does not.

I am just going by stuff read here at OSNEWS. I am not an expert in the matter.


Havoc Pennington
by Seo Sanghyeon on Thu 10th Jun 2004 09:08 UTC

Don't be silly. Havoc writes great code. It is just that he is a little ahead of his time. :-) Pango and Metacity use X extensions that are either under-optimized or not very well supported over the network. And X will improve.

by Rick on Thu 10th Jun 2004 09:08 UTC

Amen to that! Remember the C64 and the astounding things developers could achieve with so few resources by today's standards. The limitations of the platform forced developers to be careful about what they wrote; ignoring memory and CPU limitations usually meant your creation would simply crawl.

By contrast, today's bigger, faster machines impose no such limitations. Developers are free to write any old slop and get away with it, and it really shows! If it runs too slow, you have to get a faster machine.

As an ex-BeOS user myself, I am all too aware of what a well-designed OS should feel like.

IMO the biggest problem with Linux is Linux itself. Its monolithic design is coming back to haunt us, and as time goes on it's only going to get worse. You can only speed-hack it so much, not to mention the usability issues that seriously impact its widespread acceptance. (Can you say drivers?)

When it comes to the perception of speed, a better X would certainly make a world of difference, but that will do nothing to address the deeper underlying problem: lack of developer concern for tight, light code.

by Lumbergh on Thu 10th Jun 2004 09:11 UTC

Your 433 Celery isn't a speed demon, but it isn't exactly chopped liver either. But that 32 meg of RAM is killing you. Since you mentioned that you have a job, I would advise you to spend $10 and pop in another 64 meg stick at the very minimum.

RE: AMEN! @ Rick
by Edward on Thu 10th Jun 2004 09:15 UTC

IMO the biggest problem with Linux is Linux itself. Its monolithic design is coming back to haunt us, and as time goes on it's only going to get worse. You can only speed-hack it so much, not to mention the usability issues that seriously impact its widespread acceptance. (Can you say drivers?)

You are clearly talking out your ass. The structure of the kernel has sod all to do with the way the applications are structured.

(If having a monolithic kernel was slower, then I wouldn't get better fps in Enemy Territory in Linux than in Windows would I?)

I though it was just me
by kaiwai on Thu 10th Jun 2004 09:17 UTC

I'm running FreeBSD 4.10 loaded with GNOME 2.6.1 (with some 2.6.2 components) and I could never understand why the terminal was so crappy. This is on a K6 550MHz, SiS540 graphics chip, 256MB RAM and a 60 gig 7200rpm hard disk, and yet the responsiveness of the GNOME terminal is as though I have 1000 users logged onto my machine leeching resources left, right and centre.

Please, someone fix it up. I don't see it in KDE on the same machine, so why does it exist in the GNOME terminal? Please, my kingdom for a responsive GNOME terminal!

by timh-rack64 on Thu 10th Jun 2004 09:18 UTC

I have a machine with 64MB RAM, a 233MHz CPU and a 2GB HD running Windows XP... it runs fine, just as slow as Windows 98 or 95... but a GNOME desktop on Red Hat 8 couldn't keep up with it. Very incredibly slow, and it kept freezing.

by Brad Griffith on Thu 10th Jun 2004 09:19 UTC

Funny, I haven't heard many complaints about the speed of the Linux kernel. In fact, it seems the server market is drooling over it. And the Apache web server is not exactly known for bloat and loose, heavy code. The platform is developing really well. A faster X will make a huge difference. Other improvements have helped desktop performance a lot: the CFQ scheduler, kernel preemption. Linux adoption is increasing exponentially. And the platform is improving faster than any other as far as I can tell.

by Lumbergh on Thu 10th Jun 2004 09:19 UTC

(If having a monolithic kernel was slower, then I wouldn't get better fps in Enemy Territory in Linux than in Windows would I?)

99% of the dual-boot rigs out there get better fps in windows than in linux. Linux has closed the gap, but windows invariably runs games better.

RE:I though it was just me
by Brad Griffith on Thu 10th Jun 2004 09:22 UTC

Just curious - what graphics driver are you using?

by Ernst Persson on Thu 10th Jun 2004 09:25 UTC

KDE 3.2 is nice and snappy on my 266 MHz with 160 MB RAM.

Of course, it's running Gentoo...

RE: Gnome Terminal
by Lumbergh on Thu 10th Jun 2004 09:26 UTC

It's pretty amazing that something as heavily used as Gnome terminal could be so sluggish, even in comparison to the already slow Gnome platform. And these guys are using straight C. Just think if they went to C# or Java. Oh, the horror.

They won't save time and money...
by n0dez on Thu 10th Jun 2004 09:27 UTC

...unless they already know Unix and/or Linux.

by Brad Griffith on Thu 10th Jun 2004 09:29 UTC

Have you been paying attention at all? The GNOME terminal performance issues occur only with certain drivers that have very poor RENDER acceleration. If a better X is developed (as is happening very quickly now), those problems will be gone. Many key components in GNOME (Metacity, Pango, etc.) rely heavily on the RENDER extension of X, as they should. It has nothing to do with the language or the code. It's the architecture, which, unfortunately, until now has not developed in order to support the technologies in GNOME.

RE: Linux ain't fat your PC is just anorexic
by Andrea on Thu 10th Jun 2004 09:29 UTC

> Also there is no reason why a user cant upgrade his hardware anymore.
Oh yes, there is.
My Toshiba Satellite 4090 can't take more than 128MB of RAM...
It's a nice notebook and it rocks.
I never had a hardware issue with it.
Yes, it's old (1999), but the LCD is very fine and it works fine with my NT4.
Trying to use Thunderbird and Firefox under SuSE 9.1, my fan runs often (continuously, I'd say); under NT4, rarely.

What am I supposed to do? Keep using the unsupported NT4, or send it to the trash because modern GNU/Linux distros can't run on it at an acceptable speed?

@ Lumbergh
by Edward on Thu 10th Jun 2004 09:33 UTC

99% of the dual-boot rigs out there get better fps in windows than in linux. Linux has closed the gap, but windows invariably runs games better.

I call major bullshit on you. If we discount crap like ATI drivers (because the drivers just plain suck; they suck on Windows, but they suck even worse on Linux), and talk about properly ported games (Quake and UTx engines, rather than Wine(X) emulation), it's pretty common to get 10% to 30% better fps in Linux.

Hell, my Windows drive has the bare minimum of hardware drivers and Direct X on it, just for playing games, and yet, my overly messed with, half broken, more than slightly borked Linux drive still gets better frames in UT2K4 and ET.

Just curious - what graphics driver are you using?

Just the standard SiS driver that comes with XFree86 4.3.0 (XFree86 4.4/X.Org 6.7.0 isn't in the ports tree yet). I have 16MB dedicated to it (it's a crappy onboard video card). Even in all its crappiness, it shouldn't be *that* crappy when it comes to graphical processing.

Calling in the desert
by PP on Thu 10th Jun 2004 09:37 UTC

This is an important call to arms for those who are able to make those lightweight boxes run again, on Linux. We know it is possible, but we need those LiveCDs to quickly install a GNU/Linux environment with the needed (Open?)Office, browser, email and media player apps. Possibly with XFCE, or IceWM?
But this article is not an attack on 'Linux', and there is no need to defend it. We do want those old machines (so many of them are still around) to be usable under Linux in an easy-to-install way. It's a challenge for those who can; the author is hitting the nail on the head here, so let's listen.

by Brad Griffith on Thu 10th Jun 2004 09:39 UTC

It's not "graphical processing" in general that is the issue; rather, the question is whether that SiS driver has decent RENDER acceleration. I'm not personally certain about that specific driver. Maybe someone here has more info on it (?). If not, hopefully the soon-to-be-released improvements will help you out. The current situation with Pango and RENDER does irk me too, but it is being worked on. Thankfully, with my Nvidia card and the nv drivers, I get very good performance.

Lame comment about Havoc
by Mike Hearn on Thu 10th Jun 2004 09:42 UTC

Dude, if you're going to flame somebody at least be informed.

Gnome-terminal is not slow; VTE, the terminal emulator widget itself, is slow, and it was written by Nalin at Red Hat, not Havoc. And VTE is mostly slow because it's doing a lot of work processing Unicode (Pango, written by Owen Taylor) and rendering text to the screen using XRENDER (written by Keith Packard).

So really, with a comment like that you just make yourself sound like you've done no research at all.

About system requirements: people have the code. People know where the bottlenecks are. They aren't getting fixed fast because most people don't seem to care; it's fast enough, and those who want it to go faster are bitching about it on OSNews instead of writing the damn code.

by santhosh on Thu 10th Jun 2004 09:42 UTC

It's not that I don't want to upgrade the memory of my machine; I am unable to find RAM for it. My machine requires 100MHz SDRAM, and it's not available in the market. If I have to upgrade one component, I have to upgrade my whole machine. I think this is what the author of this article is trying to convey: we need applications which give better performance on a low-configuration machine like mine.

hey guys
by bitterman on Thu 10th Jun 2004 09:43 UTC

Windows 98 (1998): min req 32MB RAM, right?
Windows XP (2001): min req 128MB RAM, right?
Fedora Core (2004): min req 256MB RAM, right?

You mean Fedora has higher requirements than XP 4 years later? Say it isn't so!

Anyone who has run XP on a 300MHz machine with 128MB of RAM knows that spec is full of it. You can open apps, sure, but if you use it for more than e-mail and a browser you're not going to like it. Everyone knows 5-600MHz and 256MB RAM is the recommended requirement on XP for a mostly functional system.

Microsoft Office's min requirements are just as high as Fedora's; if Windows bundled Office as Linux does, then Linux would still run on par with XP, which was made four years ago.

Anyway, the point is that by Windows standards Fedora should double XP's specs every 4 years, and it hasn't even done that yet. There is no story here unless you went to sleep in 2001 and just woke up.

Anecdotal evidence --> meaningless conclusion
by Andrew on Thu 10th Jun 2004 09:46 UTC

Summary of the arguments presented in this thread:

- My X-year-old computer with Y MB RAM is slow with the latest Z Linux distribution,

where 3 < X < 6,
and 64 < Y < 256,
and Z is an element of the set of full-fledged Linux distributions like Fedora, Mandrake, SuSE, you name it.

The meaningless conclusion is: "Linux is getting very fat".

How the author jumps from his anecdotal evidence to his meaningless conclusion is clearly fuel for a long thread, seeing as this thread is growing fast...

If RAM was expensive, he'd have a point...
by Anonymous on Thu 10th Jun 2004 09:48 UTC

I believe that all PCs sold in the last 5 years can take at least 256MB RAM, and that currently costs no more than $60 (less than the cost of pretty well any commercial XP piece of software).

Hence, what this article is about is someone trying to run the full Linux desktop (GNOME/KDE + several large apps) on a machine that's either more than 5 years old or came with 128MB RAM, and they won't spend a small amount of money on a RAM upgrade.

In other words, the article is an utter waste of space. You cannot *buy* a PC now with under 256MB RAM (and 512MB RAM is becoming the norm); clearly, if you insist on running Linux on less than that, then you will have to adjust what you run (don't run GNOME/KDE; pick a lightweight window manager, and try some of the more lightweight package alternatives as well, e.g. AbiWord instead of OpenOffice and so on).

Apparently, this author thinks that apps shouldn't be large and shouldn't require more than a few MB of RAM to work with. And yet the "baseline PC" is increasing rapidly: more hard drive space, more RAM and much faster CPUs. You might think this encourages bloatware, but in fact it just improves the user experience and allows more complex packages to be made available that need more resources to run.

Developer decisions catching up to everyone
by Gabe on Thu 10th Jun 2004 09:52 UTC

IMHO, things are starting to catch up with users in this community for userland applications. A friend always tells me that just because computers are getting so much faster doesn't mean we should ignore the memory usage of an application. These issues are still very important.

Also, I think part of the problem is developers using the wrong tool for the job. For example, OpenOffice was mentioned a lot here. OpenOffice is a Java-based application, and I'm sorry to say it, but Java is *not* a good language for GUI work. Don't get me wrong, it is a good and powerful language, but it is not suited for these things. People will think I'm trolling or whatnot, but I do know what I am talking about, and when I set up a system I take into consideration what type of application it is and how it was written.

The projects that do pay attention to these finer points are the ones that are getting through just fine even now.

I have to say, it saddens me when an application gets tied too closely to one of the desktop environments, because all that means to me is bloat.

I'm not 100% sure about this, but I believe the Java parts of OpenOffice are optional. The version with Fedora Core 2, for instance, does not use Java at all, as far as I know.

I agree but give some time to volunteers
by Henri Chapelle on Thu 10th Jun 2004 09:54 UTC

I like the comparison of current computers with the Amiga, but as you said, a well-tuned Linux distro runs fast, just as the tuned Workbench ran fast on its specialised hardware.
The only device in which I recognize the Amiga's usable speed is my Palm PDA :-)

Computers are a lot more complicated than before: we have AGP, PCI, USB(2), FireWire, ATA, Serial ATA, SCSI, WiFi...
We are used to processing 1GB files, and instead of 880k floppies on my Amiga, I use 4.5GB DVD-Rs to store my data. That's 5000 times more, but a P4 3GHz is *only* 500 times faster (if you want to count in MHz, which is not a good measure).
So, OK, Linux is slower than the Amiga Workbench, but it does a lot more.

Anyway, I use Gentoo on an AMD XP 3200+ with 1GB RAM. KDE and GNOME are very usable, but booting is slow. When I switch to the beloved Window Maker, things are way faster and memory usage is down.
Did you ever try to play DivX on an old AMD K6-3D at 300MHz with 32 or 64MB on Windows? But with MoviX on the same hardware it works; no need to tune, no need to install, just burn the CD.
Linux + MPlayer is the lightest thing that can be run on an old PC for playing movies.

The *free* Linux OS can be used as a desktop OS, a generic server OS, a grid OS, a home multimedia OS (MoviX and so on).
It has the potential to be very fast. Your article is right for KDE, GNOME, OO and distros using those tools, but I disagree for Linux *THE* OS.

Maybe the developers need to make a choice here: quickly provide applications, or provide fast applications in a slower cycle.
This is not a Linux particularity; it's true for the whole software industry. It is a major issue in the software and hardware world, and it's why most people are "afraid" of electronics in cars or washing machines.
But if we do not want Linux to follow the BeOS way, we need working applications ASAP more than hyper-tuned applications tomorrow. Even if it's not optimal, the code can be tuned later; we can already see this in the latest KDE release.
I like to tune code at work; maybe it is time for me and other code tuners to help the Linux community..

kill your darlings
by stilus on Thu 10th Jun 2004 09:55 UTC

"Perfection is reached not when there is no longer anything to add, but when there is no longer anything to take away".

With this quote Brian hits the nail on the head. I personally like Linux because I can tweak it. Most of the people the `big' Linux desktops/distros are meant for don't even know what "tweaking" implies. They should not have to, because tweaking something to perfection should be the Sein of the coders/hackers, not the users.

Stability, Usability, Speed and Security aren't just slogans, they are what makes an OS or piece of software better than good. I think these four will forever be on the horizon, but shouldn't we at least try to attain them?

Someone here wrote: "I would be happy if most developers went into a feature freeze for 6 months and just optimize the shit out of their apps." I couldn't agree more. Kill your darlings, dear hackers! Slim and trim down, think again, find The Beautiful Way*. A laurel Crown to her/him who creates the slickest, leanest piece of code! Good luck and happy hacking!

*) "When I am working on a problem I never think about beauty. I only think about how to solve the problem. But when I have finished, if the solution is not beautiful, I know it is wrong." -- Buckminster Fuller

P.S. As for comparisons between XP/Win98, Linux, BeOS or whatever, not to mention the specs flying around in this thread: who cares? Arguments from example are almost never useful, because one can always find a counter-example. Your teachers should have told you that.

It's just too personal
by Jack Malmostoso on Thu 10th Jun 2004 09:57 UTC

Hi there,
I run FC2 on my PIII550MHz, 512MB RAM, GeFo2 without any problem.
I know I'm a patient person, so I don't mind waiting a few seconds for my apps to open, and I understand this might be an issue, but let's face it: running a 2004 distro on a 2000 computer needs a hardware boost. If I were to run FC2 on the original 128MB RAM, TNT2 and horribly slow Seagate HD, I would probably have the same bad experience.
Oh, and I must say also that I make a strong use of gdesklets and bulls**t like that, so I guess my system could be much faster!
Win2000 on my current machine just runs slower, even slower than KDE 3.2 booted from Knoppix!
It's just a matter of personal experience, but sure some Linux apps are just getting out of hand. Anyway, I still like it a lot more than Windows.

@ ALL of you
by Phonetic on Thu 10th Jun 2004 09:58 UTC

Why swap is important:

Swap is virtual memory on secondary storage where applications and modules go when they are sleeping, to free up main memory as necessary (simplistic, but it covers the basic points). Swap effectively increases your available memory at the expense of extra IO operations to page data in and out of main memory.

Windows uses swap; Linux distros use swap.

The actual physical memory required to run an application is the total max active memory required at any time during the life of that application. Mostly, this represents the maximum memory required to load the application and its associated libraries.

An example: FC2 with KDE, GNOME and Xorg plus basic system services can run in 512MB of main memory with no swap. The largest application is Xorg itself, at present consuming approximately 30-50MB for its image and data. So let's say that after starting Xorg, approximately 40MB of main memory is used; Xorg tends to reside in main memory, as its libraries are shared and always in use by the various applications that may be running. On a 128MB system, this leaves approximately 88MB of main memory for applications. Of course, a slice of this is used for caching (something most distros do to increase system performance); this cache will shrink as less and less memory is available until it doesn't exist, a hit on performance naturally, but nothing unexpected. It's used for caching all sorts of IO, libraries, etc. If this cache is 20MB on a 128MB system after loading X, then approximately 68MB of main memory remains. But that doesn't include the kernel, so we'll take 2MB off for that: 66MB.

At any given time, most of the applications and services on a computer are sleeping. Linux distros swap these out to disk completely when more main memory is required; in general, the less active a process or library, the more likely it is to be swapped out.

What this means is that you can load any application that does not exceed 66MB in image and data at any time.
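The accounting above can be sketched as a quick back-of-envelope calculation. To be clear, the figures (40MB for Xorg, 20MB for cache, 2MB for the kernel) are the rough estimates from the description above, not measurements:

```python
# Back-of-envelope sketch of the memory accounting described above.
# All figures are rough approximations from the post, not measurements.

def app_headroom_mb(total_mb, xorg_mb=40, cache_mb=20, kernel_mb=2):
    """Main memory left for applications after X is up."""
    return total_mb - xorg_mb - cache_mb - kernel_mb

# On a 128MB machine, roughly 66MB remains for the largest
# application's image + data before swapping kicks in.
print(app_headroom_mb(128))  # 66
```

Plug in your own machine's numbers; the point is simply that the fixed overhead (X, cache, kernel) eats a much larger fraction of a 128MB machine than of a 512MB one.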

Most applications in modern distros use shared objects and dynamic linking, meaning that if a library is already loaded, it can be used by any other application that requires it, without loading a new instance of that library. This really optimizes memory usage. Windows uses DLLs for something similar, but I'm not sure whether they are shared; anyway, I'm concentrating on Linux distros at the moment to explain system performance.

When a large amount of memory is required for an application, a lot of pages need to be swapped out to make room for it. If the application requires so much memory that the libraries it uses won't fit in as well, something known as "thrashing" occurs: the application itself is paged out, the library call is made, then the application is paged in and the library paged out; the application then makes another library call, gets paged out again, etc. This is very detrimental to system performance, as you are probably aware.

Thrashing can only be prevented by choosing applications with smaller maximum memory requirements, or by increasing available main memory. Low-end machines will often thrash on 3D games, due to the size of texture and sound files and complex scene hierarchies.

This is why dynamic libraries are such a good thing, as static linking increases the actual image size of the application.
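A toy illustration of that point, using made-up sizes (the application and library figures below are hypothetical, chosen only to show the accounting): with shared objects, one resident copy of each library serves every app, while static linking folds a copy into every application image:

```python
# Toy comparison of static vs. shared linking, with made-up sizes in MB.
# With shared objects, one resident copy of each library serves all apps;
# with static linking, each application's image carries its own copy.

apps = {"editor": 4, "terminal": 3, "browser": 12}  # app-only code + data
libs = {"toolkit": 10, "libc": 2}                   # common libraries

static_total = sum(apps.values()) + len(apps) * sum(libs.values())
shared_total = sum(apps.values()) + sum(libs.values())

print(static_total, shared_total)  # 55 31
```

The gap widens with every additional application linking the same libraries, which is why shared linking matters most on the low-memory machines this thread is about.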

There are 2 main bottlenecks in any IO operation:

1) The speed at which the CPU can transfer data over the system bus

2) The speed at which the IO device can process the data

Possible fixes for 1) - faster cpu, faster bus, direct memory access

Possible fixes for 2) - faster IO devices

DMA basically means that the device is told what memory needs to be transferred and can access that memory itself, freeing the CPU up for other work; otherwise the CPU has to manually move the data from main memory to the device. PCI video cards, for example, do not to my knowledge support DMA; AGP cards do (AGP = Accelerated Graphics Port, which basically means DMA for video, with a higher bus speed). Likewise, IO operations on hard drives without bus-mastering DMA will be inherently slower than on drives that have it.

If you are going to do a large amount of paging, you should optimize your paging system by having fast, efficient drives with DMA access and a good bus speed on your motherboard (newer motherboards are obviously better).

What it comes down to is that in any OS, memory is not the only factor in system speed; in fact, if you have 128MB you can run pretty much any major application without slowdown, provided you aren't running multiple large applications (this goes for ANY OS, not just Windows or Linux distros). Other things that play a part are the size of your swap partition, the speed of your system bus, drives, CPU and main memory, and whether you are using DMA-enabled devices or not. These really can play a crucial part in overall system performance.

FC2 with Xorg, KDE, GNOME, a handful of system services, Konqueror, Sylpheed, Kopete, OOWriter, konsole, sshd and a number of other services WILL run without any swap at all in just 512MB of memory. I did this for weeks before I realised that my swap partition wasn't being mounted.

What this means is that all those applications fitted together in just 512MB of memory total. At any given time, only two or three applications or services will actually be awake; the majority are sleeping (inactive, pending being woken by some event). If those sleeping applications and associated libraries were paged out to swap, FC2 would run nicely on 128MB of RAM with about 400MB of swap. There would be some slowdown when switching between applications if they had been paged out, but that's to be expected.
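
Incidentally, the quickest way to catch the mistake I made is to check swap explicitly. A small sketch:

```shell
# Is any swap actually active? All three report configured swap areas.
swapon -s                      # lists swap devices/files; empty = no swap
free -m                        # the Swap: row shows total/used/free in MB
grep SwapTotal /proc/meminfo   # kernel's total swap in kB
```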

The reason Gentoo is so slick is that, as a source-based distribution, you can compile everything as dynamic/shared without worrying about which libraries you have on your system.

by timh-rack64 on Thu 10th Jun 2004 09:59 UTC

Once upon a time I thought Linux people paid more attention to the number of CPUs it could run on rather than to optimization. When I installed Fedora I reversed that.

It's chaos reign
by BrownJenkin on Thu 10th Jun 2004 10:00 UTC

I believe that the great problem can be summed up in a word: fragmentation. Linux has great potential, given by the fact that it's free and anyone can work on it or discover & create apps. But taking one part from one place, another part from another place, and so on adds a huge weight to the distros. That reign of chaos can be a serious problem for Linux and can bring it down.

I'm running SuSE 9.1 Prof on a Compaq Presario 2700 (256MB RAM) and sometimes it gets really slow, especially when compiling or tar-zipping, so I sadly agree with the article. Don't tell me "hey, SuSE sucks, use xxxx", 'cause a distro war is a war of the poor, extremely dangerous to the entire Linux community.

My personal hope is that major distros will concentrate on putting the various apps all together, linking them well and tweaking them, to build a solid OS with integrated, solid apps.

RE: Developer decisions catching up to everyone
by Gabe on Thu 10th Jun 2004 10:02 UTC

"I'm not 100% sure about this, but I believe the Java parts of OO are optional. The version with Fedora Core 2, for instance, does not use Java at all, as far as I know."

It's a recommended dependency, but I've never tried it myself. Then again, the point is OO isn't the only one, just a good example ;) Another is Eclipse (the development tool). Yada, yada.

The new wave of emerging tools, namely C# and Mono, won't help the speed issue either.

I still maintain that it is essential for developers to use the right tool for the right job. But, then there is the problem of knowing the tools that exist!

Using the right tool for the right job is very important, I agree. But I disagree with your example of Mono and C#. I've been very impressed with the responsiveness of mono-based apps - I'm primarily thinking of Muine and MonoDevelop. Load times are fine, responsiveness is good - nothing like

RE: It's chaos reign
by Gabe on Thu 10th Jun 2004 10:08 UTC

"My personal hope is that major distros will concentrate on putting the various apps all together, linking them well and tweaking them, to build a solid OS with integrated, solid apps."

This is a good statement, but I have to say again that it is a difficult process. There are those who do work on these things, but it takes time and there are so many applications. It is not trivial to put all the pieces together.

When people say that Linux (distros or kernel) is kept together by hacks and patches, I cringe and only wish they really knew how things get done within most of the community. [Not directed at anyone, just a general comment.]

Oh, and I forgot
by Brad Griffith on Thu 10th Jun 2004 10:08 UTC

The Mono devs haven't even begun to work on optimization. I've heard that they expect significant performance boosts shortly after the 1.0 release, when they hunker down and optimize everything. Apparently there is a lot of room for improvement still.

RE: Oh, and I forgot
by Gabe on Thu 10th Jun 2004 10:10 UTC

"I've heard that they expect significant performance boosts shortly after the 1.0 release when they hunker down and optimize everything"

In comparison to what, though? I have dealt with it only a little, so I can't offer any hard data, but I hope that you keep an open mind and make sure that it doesn't take you in the wrong direction. I know I will ;) And I love to test out new tools!

@Brad Griffith
by Lumbergh on Thu 10th Jun 2004 10:11 UTC

If a better X is developed - as is happening very quickly now - those problems will be gone.

That was good for a laugh. But KDE never seems to have these problems.

But the future....some years from now..if we only had a better X. Any more comedic relief?

@ Brad Griffith
by John Blink on Thu 10th Jun 2004 10:12 UTC

You said,
The GNOME terminal performance issues - which occur only with certain drivers that have very poor RENDER acceleration. If a better X is developed - as is happening very quickly now - those problems will be gone. Many key components in GNOME - metacity, pango, etc. - rely heavily on the RENDER extension of X, as they should. It has nothing to do with the language or the code. It's the architecture, which unfortunately, until now, has not developed in order to support the technologies in GNOME.

What hardware config would give me better speed?

RE: Developer decisions catching up to everyone
by Gabe on Thu 10th Jun 2004 10:12 UTC

"But I disagree with your example of Mono and C#. I've been very impressed with the responsiveness of mono-based apps - I'm primarily thinking of Muine and MonoDevelop. Load times are fine, responsiveness is good - nothing like"

Doh, working backwards here... sorry. The last time I used it I was not impressed, but I will say that I have not heard good things as of yet. Again, I'll wait until the day I can judge for myself, and hopefully I will gain another option for my development platform.

RE: Oh, and I forgot
by Brad Griffith on Thu 10th Jun 2004 10:14 UTC

In comparison to what already seems like a very nice platform for GNOME application development to me. As I said, I'm very impressed with the responsiveness and stability of apps like Muine and MonoDevelop. If you're referring to the patent issues... I think we will make it through with the ECMA core just fine. The cool part of Mono - for developers like me at least - is the Mono-specific stack, not ASP.NET, Windows.Forms, and friends.

by Lumbergh on Thu 10th Jun 2004 10:14 UTC

Well, either you're the typical fanboy who will lie through his teeth to defend Linux with his last dying breath, or you're about the only person in the world with a dual-boot rig whose games get better fps in Linux. Take your pick.

by Brad Griffith on Thu 10th Jun 2004 10:19 UTC

I don't play commercial games, so I don't really have experience with this, but do you have any numbers to back up what you're saying? If not, maybe you shouldn't be so rude.

A few points
by nonamenobody on Thu 10th Jun 2004 10:22 UTC

Firstly, Red Hat is not Linux; i.e. it is a Linux distro, not the Linux distro.

Now, I'm not saying that modern desktop distros should work on a 286 with 1MB of RAM, or anything like that. I'm just being realistic -- they should still run decently on hardware that's a mere three years old, like my friend's machine. (from the article)

My machine is very nearly three years old, was mid-range at best, and is twice the spec of your friend's machine (256MB RAM, 1.4GHz Athlon). IMHO 256MB of RAM has been the norm on new machines for about four years (except for bargain-basement machines, which are never good value for money).

So when people talk about 10 GHz CPUs with so much hope and optimism, I cringe. We WON'T have the lightning-fast apps. We won't have near-instant startup. We thought this would happen when chips hit 100 MHz, and 500 MHz, and 1 GHz, and 3 GHz, and Linux is just bloating itself out to fill it. You see, computers aren't getting any faster. CPUs, hard drives and RAM may be improving, but the machines themselves are pretty much static.

That is the age-old paradox. Users and developers generally don't want faster software; they want more features. Additionally, if people never needed to upgrade, they never would; hardware sales would drop and either the pace of development would slow or prices would increase (or both).

IMHO fattening software could eventually enable the fast startup and instantaneous response the author wants: it encourages the hardware to fatten along with it, and then the software can be put on a diet.

@Brad Griffith
by Lumbergh on Thu 10th Jun 2004 10:26 UTC

I don't play commercial games, so I don't really have experience....

Thanks for the laugh again there, Brad. I mean, you don't play any commercial games, but you thought to chime in anyway.

Well, I have played Quake I, II, III and UT2003, all on Linux and Windows, and invariably they run faster on Windows. Most of the time it's not even close, with the closest being Quake III. Of course, I know that either Edward has seriously screwed-up Windows drivers or he's just plain lying, because the UT engines always run faster on Windows with proper drivers on the same rig. UT is especially optimized for Windows.

Anyway, you and Edward can go ahead and defend linux, Gnome whatever all you want. I'll continue to laugh.

yes bloated but it works
by Rodney McDonell on Thu 10th Jun 2004 10:27 UTC

If you ask me, Linux is more prone to bloat, as an application developer has a wide range of APIs to help him do what he wants to do. Also, there is no one desktop to produce applications for. So when a good application is created it can use interfaces such as ncurses, Qt, GTK or custom ones. Let's look at my installation. The apps I use most are:

firefox (custom interface + GTK)
thunderbird (as above)
dselect (ncurses)
xmms (GTK)
mplayer (GTK)
centericq (GTK)
xpdf (GTK)
blender (custom?)
worker (custom)

You can see that on Windows or Mac OS X all versions of these applications were created using standard APIs for those platforms, but there is no standard for Linux yet and there may never be. There are so many choices, but then, that's what makes it so great.

I recently purchased a new machine, and I don't really like the bloat of GNOME (most things can be done with a good shell), but I knew Firefox, Thunderbird and the games I'd now be able to play with a good gfx card might take up a bit of RAM. So I bought 1GB. And I'm very happy ;)

BTW - I didn't RTFA

RE: @Brad Griffith
by Brad Griffith on Thu 10th Jun 2004 10:30 UTC

Oh, so you've been basing this off the experience you've had on your box alone? What kind of graphics card do you have? I don't play commercial games, but I've done some OpenGL development. I'm chiming in because you seem to be arrogantly applying your personal experience to disparage other people on the forum - and you always seem to leave out actual helpful info like your hardware/software during these "tests." Just thought I'd double check to see if you had any basis for your rudeness.

Finding Leaner Faster Alternatives
by Rick on Thu 10th Jun 2004 10:41 UTC

It takes about 10 seconds to start OpenOffice and Mozilla on either of my computers. I prefer the leaner, faster TextMaker word processor instead. It is full featured and starts up in only about a second. The same company also sells the PlanMaker spreadsheet, but neither is free. For my browser, I use Mozilla Firefox instead of Mozilla because it is much leaner and faster. Firefox does not include an email program, so I use a separate e-mail program.

Slackware 9.1 recommends 64MB of RAM for X Windows. That is not bad, but Slackware does come with a wide choice of kernels, window managers and other options. I wonder if they expect the user to make lightweight choices when using more minimal hardware? Anyway, Slack would not be the best choice for a newbie unless perhaps a more experienced user installed it for them.

Vector Linux and Gentoo Linux probably also have more minimal hardware requirements, but I have never used either. What should we recommend for a newbie with an older computer? I hope that bloat is not a problem with all distros and all Linux applications. The Linux community needs to find or create a good, leaner, faster distro for those who need it.

@ Lumbergh & By kaiwai
by Edward on Thu 10th Jun 2004 10:43 UTC

Drivers are whatever the latest reference VIA, SBLive and nVidia drivers were as of about a month ago. DirectX is also the latest (9.0b IIRC). If those are 'screwed up' then I have even less respect for Windows than I did before.

Given that people report results similar to mine (Linux faster than Windows for native games with Q3A/UTx engines) almost all the time, I'm going to guess that Mandrake/Red Hat/SuSE etc. have default configurations that aren't receptive to playing games. Given that I run Debian, and most people who report such numbers are running Debian or Slackware (with a handful of FreeBSD and clueful Gentoo users as well), this wouldn't surprise me.

kaiwai: The SiS driver 'works', but it's never going to be great. Unless SiS changed their tune very recently, they don't and won't release the info needed to write drivers. From what I've gathered, it's a miracle of reverse engineering that you can use even vaguely accelerated X with SiS chipsets at all. Sorry. See the following link for more info:

Re: Re @Brad Griffith
by phonetic on Thu 10th Jun 2004 10:44 UTC

Brad, I'm with you on this one.

Personal experience on my side: 3d games run about 8-13 fps faster in linux than windows.

Of course, I'm using a 2.6-series kernel with preemption, plenty of swap, and 512MB of RAM. All my HDDs are on Ultra DMA 100 and have the correct cables too. The card is a GeForce2 GTS Pro; the soundcard is an emu10k/SB Live 5.1.

TMK the only major 3D game that was developed on Windows is UT/Unreal, which was developed with VC++; Quake I, II and III were all developed under Linux and ported to Windows. I believe UT2k4 was also developed under Linux and ported to Windows. BF1942 was developed on Windows and ported to Linux, I think.

But anyway, what generally defines the speed of a game is not whether you play on Windows or Linux, but whether you have good drivers, a decent supported card and a good motherboard. It's all about bottlenecks and streamlining.

IME Linux memory management is a lot more efficient, and the memory caching runs faster. I wouldn't recommend running 3D apps with less than 512MB of RAM on either Linux or Windows, due to textures and sound files.

Lemmingburgh seems to me to be a simple, run-of-the-mill troll.

no one will actually read this but
by Peragrin on Thu 10th Jun 2004 10:44 UTC

Mandrake 10.0 is slow, though not unbearably so. I have a 550MHz machine with 128MB of RAM. I can run KDE 3.2.2 with a dozen apps open, and several servers. Granted, it can take a minute to load OpenOffice, but it is very usable.

When I ran Mandrake 10 it did seem slow. My solution? Use the latest Knoppix to do a hard drive install of Debian, configure it, and make sure Synaptic is installed as well. Very easy to do.

@Brad Griffith
by Lumbergh on Thu 10th Jun 2004 10:46 UTC

It's not just me, it's about everybody else out there that plays games except for the fanboys that think that linux is the holy grail. But since you don't play commercial games, you're the expert on how well they play on windows vs. linux, so by all means chime in again.

But hey, you're the graphics expert with your OpenGL experience. Maybe you should be working on that non-existent X server that needs to catch up with the GNOME guys' work.

Or better yet, work on the replacement for the disaster that is Bonobo.

I'm sure Edward appreciates you sticking up for him, though. Fanboys must stick together.

Re: Re @Brad Griffith
by phonetic on Thu 10th Jun 2004 10:48 UTC

Oh, forgot to mention that I'm using NVidia's binary driver and GLX modules, not the distro-supplied ones.

1GB swap partition.

I run both Windows XP and Linux (2.6 + KDE etc.) on many PCs:
PII 266MHz, 256MB
Crusoe 800MHz, 128MB
Via 600MHz, 256MB
AthlonXP 2500+, 512MB
P4 3GHz, 256MB
Linux is pretty slow on lower-end PCs, but it is not painful: I know exactly that it takes X seconds to start app Y, and Z seconds to boot; I can live with that. Windows can be very responsive, but also extremely slow, for reasons that are totally unknown to me. (It's not fragmentation, nor an antivirus, nor fonts.)
Also, with Linux, as you get better hardware you get proportionally better performance. I cannot figure out why, but Windows gets slow even on high-end PCs.
Also, you can easily use Linux as a text-mode server on anything.

RE:@Brad Griffith
by Brad Griffith on Thu 10th Jun 2004 10:50 UTC

Man, all I asked for was some numbers and info about what hardware and drivers you're using. Two linux users have provided that info now, which makes them seem a lot more credible to me. We still don't know what version of Linux or Windows you're using either. Just calm down a bit and maybe we can figure out why you're getting different results than these other people. Bonobo seems a little off-topic at this point, but so you know - the XServer work is well under way and a lot of good fixes are already in CVS. Driver support does suck for some graphics cards right now in Linux - the best drivers coming from NVidia, in my opinion. Things are improving though. Just calm down a bit.

..well, do something about it instead..
by Jinx on Thu 10th Jun 2004 10:51 UTC

*sigh* Some ppl in here complain about programs being poorly written. How easy do they think optimizing is? While I can understand what they're trying to say, it's just not like that. If ppl haven't tried programming to some extent, the word 'optimizing' is more or less analogous to someone magically pushing a button labelled "Optimize program", after which it's super-fast and uses -5MB of RAM.

It's just not like that. As jbmadsen pointed out in #60 (or thereabouts), he hit the problem 110% correctly. Today we use higher-level languages to build applications faster. We do all the exciting parts (adding functionality, being creative and innovative) and leave the more mundane jobs to the compiler/IDE. So efficiency naturally suffers for the sake of productivity. I know one(!) guy who has a fairly decent understanding of assembly-level coding, and he can indeed write small, efficient programs. The sad thing is that he just might end up taking 10 or maybe 100x as much time to write what someone else can write using a higher-level language.

You also have to remember most ppl writing these apps do it for _fun_. Not for you, your grandmother or your dog. Try telling a guy he's writing bad code and informing him he should optimize it so it runs better, when the code was written mostly for himself and then released to the public so _you_ can use it freely.

Simply, if u don't like where Linux/GNU/apps are going, write your own damn code, optimize existing code, or use DOS. There is no stopping evolution here; it'll progress like it always has. So either join the ride, do something active about it (instead of just b*tching), or use/do something else.

@ lemmingburgh
by phonetic on Thu 10th Jun 2004 10:55 UTC

Hmm, are you talking about games that have been ported to Linux, or just ones that are Win9x/XP-based and have to be run under WineX or the like?

See, I know StarCraft runs brilliantly under Linux using WineX, Wine or the StarCraft fork of the Wine codebase. I also know there is no problem whatsoever with any of the UT/UT2k4/Quake I-II-III etc. games (they run fine for me, as previously stated, running about 7-13 fps faster).

I have to boot Windows XP to play The Sims though, as it uses solely Windows APIs and DLLs that aren't quite complete in wine/winex yet. I bet Windows doesn't play TuxRacer real well though.

Technically, the Linux kernel's memory management and scheduling seem to be superior to Windows XP's, but then, they're also a lot newer.

by Lumbergh on Thu 10th Jun 2004 10:57 UTC

Quake I, II and III all developed under Linux and ported to Windows. I believe UT2k4 was also developed under Linux and ported to Windows

Well, since you are clueless about how these games were developed, I guess we should definitely take your performance claims as credible.

Carmack and Sweeney both use VC++ and have for a long time: since at least Quake II for Carmack, who has never developed on Linux, and since forever for Sweeney. They were not "ported" to Windows from Linux. In fact, porting to Linux is always a money loser, and the only reason the Quake or Unreal series were ported was for the good karma.

You fanboys can keep up your linux myths though. I'll continue to laugh.

by Lumbergh on Thu 10th Jun 2004 11:00 UTC

I bet Windows doesn't play TuxRacer real well though.

Haha, TuxRacer, the jewel of linux gaming. Good one.

by garapheane on Thu 10th Jun 2004 11:04 UTC

This is from my own machines. Take it however you like. The Windows installations were tweaked moderately for performance.

Machine 1:
P4 2.8E (3.265GHz OC on stock HS)/512MB PC3200/Maxtor 160GB 8MB Cache/ATI Radeon 9000 Pro 128

Machine 2:
Cel 2.2E/256MB PC2700/Maxtor 120GB 8MB Cache/ATI Radeon 9000 Pro 128

Machine 3:
P3 1Ghz/512MB/40GB HDD/nVidia GeForce2Go 16MB VRAM

Boot Time (OS: Machine 1 / 2 / 3)[s]:
Boots to usable GUI (logins are skipped on WinXP)
WinXPPro(JPN): 19 / 25 / 35
WinXPPro(EN): 14 / 22 / X
BeOS5PEMax3.0: X / 16 / 17

*NIX OS Boot Time (Boot to shell + startx)
FC1: 24 + 11(GNOME/Bluecurve default) / 26 + 10(GNOME/Bluecurve default) / X
Slackware9.1: 30 + 4(fluxbox) / X / 46 + 6(fluxbox)
FreeBSD4.10: X / X / 34 + 6(fluxbox)
FreeBSD4.9: X / X / 33 + 6(fluxbox)

There. Come on flame me.

Re: ..well, do something about it instead..
by inflagranti on Thu 10th Jun 2004 11:04 UTC

Optimizing is not as hard as you claim it to be, especially not for extremely bloated programs.

It's hard to optimize to a certain goal, but it's quite easy to optimize a program to be, say, about 20% faster. When you have written your code without optimization in mind, you should be able to get that 20% just by rechecking your code for bottlenecks and then optimizing those. The more you optimize the code, the harder it gets to optimize it further.

So optimizing code for the first time is very easy and should result in a huge performance gain, whereas it is of course hard to squeeze more performance from already heavily optimized code.

agree wholeheartedly - it is worrying
by tech_user on Thu 10th Jun 2004 11:12 UTC

I agree wholeheartedly with the author. It is silly when astoundingly fast and big hardware is not enough to do simple tasks such as word processing or extracting tar files without wasting resources. Remember five years ago: how big were machines then? Tiny compared to today, and the OSes ran better. WinNT4 deserves credit for its ability to run out of a few hundred MB on disk and in 32/64MB of RAM.

Most software which is bundled into Linux distros is bloatware. I used to be able to just load up a light window manager and use the tools I wanted, but alas, the fat is seeping down into the lower-level tools now: kernel, modified standard commands, etc. This is why I prefer BSD now, after many years of Linux distros. You are guaranteed a base install which is fully functional, and it's usually within a few hundred MB on disk, if not less.

Keep it small, keep it focussed, do it well. Let the users build big things out of small tools. Do not make the users try to break big tools into smaller ones.

by Seo Sanghyeon on Thu 10th Jun 2004 11:15 UTC

inflagranti, you are correct. But the point is that, usually, OPTIMIZING IS NOT FUN. I, as a free software user, have no right to dictate where hobby developers should spend their free time.

by Anonymous on Thu 10th Jun 2004 11:16 UTC

The recent release of the GNUstep LiveCD uses Morphix hardware auto-detection and contains lightweight apps that are good for low-end computers. WMaker is a pretty and fast X environment; it's just very different from Microsoft-style GUIs, so one needs some time to learn it. Once you become familiar with it, the GNUstep/Debian combination is, IMO, one of the best available solutions for low-end machines: just what this article is calling for.

My main box is a dual PIII with 384MiB of ram and SCSI disks. I bought that box in 2000 and haven't felt the need to upgrade it yet.

Before my SGI-branded Sony monitor died, I used my trusty Indy as an X terminal to connect to my box. WindowMaker ran fine, KDE ran fine. GNOME 2.x was completely unusable. There's something in the way GTK 2.x and Nautilus do their stuff that prevents them from being used comfortably over a network connection.

Now I have a 17" monitor and use my PC with a local display. I've had Gnome 2.6 on both FreeBSD and NetBSD, and finally gave up and returned to WindowMaker. It takes 1/10th of the time to load and I didn't use any gnome app anyway, so it's not a big loss. I'm running Arch Linux now since I wanted my 3d acceleration back and didn't want to compile programs (moz and OO.o for example).

GTK 1.x was fast, incredibly fast. GTK 2.x is awfully slow. Hopefully it will get better, but it makes me wonder if I should shop for alternative toolkits, even though I do love the GTK+ API.

One thing I'd like to comment on is why the reviewer thinks Evolution is any better than Sylpheed. I know it has the calendar thing, but for basic e-mail/news use, Sylpheed is way faster and works very well.

by Anonymous on Thu 10th Jun 2004 11:17 UTC

Yup, that's the major annoyance. That's the reason why I have 1/4GB of RAM on a 200MHz Cyrix and 1/2GB of RAM on a 733MHz VIA C3, and use Woody almost everywhere (one remaining heavily modified Zip-Slackware installation).

This thread is <i>getting very fat</i>.
by Andrew on Thu 10th Jun 2004 11:21 UTC


Fat is bad?
by Duncan on Thu 10th Jun 2004 11:22 UTC

Yes, Linux distributions are getting fatter, but so is modern PC hardware. If getting fatter means taking advantage of this ever-increasing-in-power modern hardware, then does this necessarily have to be seen as a bad thing?

@Brad Griffith
by Lumbergh on Thu 10th Jun 2004 11:23 UTC

My specs are P4 3.2 ghz, 1 Gig of ram, ATI 9600 Pro.

I'm running Gentoo and XP Pro.

Now, on games up to UT2k4 (Quake I, II, III, UT2k3) the difference is meaningless in the sense that the game runs fast enough that you don't care, but invariably every one of those games runs faster on Windows than Linux, especially something like UT2k3. The Unreal engines have historically been "let's get the Windows version done right and then maybe worry about Linux/Mac", whereas the Quake engines have historically had cross-platform in mind. Until recently, mouse control has always been somewhat of a problem under X without a ton of tweaking of X parameters, and even then it never quite felt right.

Note, phonetic is totally wrong about what platforms the Quake and Unreal engines were initially developed on. They've never been developed on Linux first. Sweeney has always developed on Windows, and Carmack actually did use NeXT machines at one time (I think maybe Doom and possibly Quake I), but from Quake II onward he's developed using VC++.

I've played some of the older quake engines on older machines and the difference between the windows and linux version was much larger, but linux has improved in recent years.

Listen, linux can be a good gaming machine, but when I see people saying that they're getting better fps on their linux partition than their windows partition I'm going to get suspicious because it starts reeking of fanboyism.

by bitterman on Thu 10th Jun 2004 11:24 UTC

Are you sure bootup is a good measurement? Linux doesn't care about fast boot as much, because it's not supposed to be turned off that much. It waits for things like the network; it doesn't skip them and load them in the background while something else takes the CPU cycles.
The network usually takes about 8-10 seconds for me, sometimes more, and it has next to nothing to do with CPU speed.

Windows has a fast bootup because after a system lockup or a new program install we're told to reboot, resulting in us silently muttering to ourselves about how much Windows sucks until we see our desktop again.

We could digress and say that fat is bad for your heart...
by Andrew on Thu 10th Jun 2004 11:26 UTC

but sticking to the subject,

1) The Linux kernel is not especially fat.

2) Some everything-but-the-kitchen-sink GNU/Linux 2004 vintage distributions could be called fat. They are designed and marketed as such.

3) Not happy with a fat 2004 Linux distro on your 4-year-old Duron with 128MB RAM? Good, stick to a slim one.

Pass on, people, nothing to see or read here...

Yes, kde/gnome are bloated *sigh*
by pepe on Thu 10th Jun 2004 11:27 UTC

Windows XP does *NOT* feel good in less than 256MB of RAM. I've been using GNOME in 256MB of RAM and it was not _that_ bad. Now, if you're running Evolution and/or OpenOffice... OpenOffice takes 60MB of RAM easily. That's half the RAM on a 128MB box. Yes, MS Office is *much* better and faster... and the Windows window manager is really light.

And yes, GNOME and KDE are bloated. That's no surprise: xchat usually takes 12MB of RSS on my box, whereas mIRC through Wine eats much less RAM. X doesn't take too much memory, IMHO; 23MB of RSS right now is not that much for the beast it is, though I guess it could get better. IceWM right now is eating 5MB of RSS, less than Fluxbox once you *really* start using it.
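
For reference, those RSS numbers are easy to check yourself on any Linux box. A quick sketch, using the current shell's PID ($$) so it always exists; substitute any PID, e.g. from pidof xchat:

```shell
# Resident set size of a process, two ways.
grep VmRSS /proc/$$/status     # kernel's view, in kB
ps -o rss=,comm= -p $$         # same figure via ps
```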

IMHO this can be fixed with a bit of tuning. It'd be worth it to have a set of kernel sysctls tuned for the desktop.
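
As a sketch of what such tuning might look like on a 2.6 kernel, a hypothetical /etc/sysctl.conf fragment (the value is only illustrative, not a recommendation):

```shell
# Hypothetical /etc/sysctl.conf fragment for a desktop box (2.6 kernels).
# Lower swappiness makes the kernel prefer keeping application pages
# in RAM over growing the page cache.
vm.swappiness = 30

# Apply without rebooting:
#   sysctl -p /etc/sysctl.conf
```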

by Anonymous on Thu 10th Jun 2004 11:28 UTC

I had a friend who wanted to load Linux on an old Pentium MMX machine with 48MB of RAM. Windows 98 ran perfectly content on it, but Debian with XFce4 was too much for it. I dual-boot Debian (with GNOME) and Windows on my Athlon XP 2400+ with 448MB of RAM (after 64MB is taken out for graphics) and it goes blow for blow with Windows, if not performing faster, but the minimum requirements keep creeping up. It seems strange to me. It doesn't feel any bit slower than Windows on my machine; in fact, most of the time it feels faster, but on an old machine it won't even run (a machine that can capably run Windows). I just don't get it. You'd think code that is too inefficient for an old computer would also run slowly on my computer. I'm sure the answer is something like "it uses more resources to make it faster" or something.

In the end, I have to disagree with the author. I don't think the hobby OSes are catching up to GNU/Linux. It takes a lot more than what they have to make an OS that competes with GNU/Linux. Sure, they're fast, but they don't do nearly as much as GNU/Linux does, and that makes a huge difference. The question is whether they will stay this fast. More importantly, maybe: are they faster than GNU/Linux was at the same stage in its development?

RE:@Brad Griffith
by Brad Griffith on Thu 10th Jun 2004 11:29 UTC

Wow, what a reasonable post. Thank you. I would like to point out, though, that ATI driver support isn't that great. It's quite possible that the guys with NVidia cards are getting better performance in Linux. Anyway, let's bury that argument, because no one seems to have any kind of authoritative benchmarking - I know I've seen some around somewhere. But, well, it's 6:30AM and I haven't gone to sleep yet. I'm done with my work. This thread is out of control. Good night/morning.

XP is really slow
by Zilu on Thu 10th Jun 2004 11:29 UTC

XP is soooo slow. We've installed it on 3 machines in our network, all P4s with 256MB of RAM. The first week it runs fast; after that, it becomes really slow. Even startup time is too long: I mean, after it shows the desktop and begins loading the start menu etc., it takes too long. No need to tell you how slow it gets after applying an SP or other patches.
So I just installed Win2003 and, guess what, with a few tweaking steps (performance settings, disabling unneeded services etc.) it really runs faster than XP.
But like all Windows systems, I have to reboot it every 24-48 hours... you know why: it gets tooo slow.

RE: Most of gnome's speed problems seem to come from GTK+
by Lumbergh on Thu 10th Jun 2004 11:35 UTC

GTK 1.x was fast, incredibly fast. GTK 2.x is awfully slow. Hopefully it will get better, but it makes me wonder if I should shop around for alternative toolkits, even though I do love the GTK+ API.

It's quite painful to run GTK 2.x apps on older hardware. Apologists will say that it's X's problem, but IMO that's no excuse to say, "hey, in 5 years X and the hardware will finally catch up with us".

This isn't Linux getting fat!
by slack boogie on Thu 10th Jun 2004 11:35 UTC

These are RatHeadish products.
Just check requirements for RH 9.0.
Then compare them with those for Slackware ;-)

@ lemmingburgh
by phonetic on Thu 10th Jun 2004 11:35 UTC

Hmm.. just poking about in the history of quake.. You are right on one thing, and wrong on another ;)

Yes, Quake (all incarnations as far as I can tell) were ported to linux from DOS at first, but then windows. So you were right there, thanks for the correction.

However, Quake is not specifically or exclusively optimized for Windows (in any of its incarnations). Carmack and id in general develop in pure ANSI C/C++ and assembler. Carmack in particular refuses to use Microsoft's DirectX offerings, instead coding his own rendering routines pretty much from scratch (he's a technological purist).

It's important to note that while id's games are not, as far as I can tell, developed on a Linux platform, they ARE developed FOR Linux platforms.

This is a trend that is increasing, and all I can say to you, ya immature sod, is SO NER!

Yes, I'm a Linux fanboy, and damn proud of it. I'm proud of the motivation and principles behind FOSS, and I support it wholeheartedly. Unlike the average Windows user, I keep myself aware of the developments and intentions of the proprietary software world. It's not most proprietary software companies that worry me, just a few; unfortunately, Windows fanboy, Microsoft is one of them.

Likewise, I support AMD over Intel, because I am aware of the relationship between Intel and Microsoft and the plans for the future. Things are going to get a lot worse, before they can start to get better.

That makes me an AMD fanboy too.

Sure, I'm not saying everything about Linux is better, nor is everything about AMD superior to Intel; I'm a fanboy just the same. On the other hand, I hate and despise Microsoft, who have done nothing for the good of anyone, and who continually and deliberately engage in dirty, underhanded, unethical and downright immoral activity for their sole benefit, harming developers, families and nations in the process.

Yup, I'm a Linux fanboy, and sometimes that means making sacrifices to support my principles. Don't let that worry you, though; at least you get to play more commercial games while supporting an evil and detestable software giant. I'm sure all the extra wasted hours soothe your troubled or nonexistent conscience just fine.

re: Fat is bad?
by Richard S on Thu 10th Jun 2004 11:41 UTC

Right, and what advantage are you talking about, then?

by pepe on Thu 10th Jun 2004 11:43 UTC

BTW - we *all* remember how *fat* (in terms of RSS, i.e. real RAM used) GNOME 2 got compared with 1.4. I'm wondering how a Debian woody default install (GNOME 1.4) feels compared with typical distros.

"Just for the record, I have just booted into KDE, and started only aMSN, Firefox, konsole and kdict. Memory footprint:

774680 TOTAL
263724 USED
124584 CACHED

That's 260MB used already.

Yes, but about 122MB of that is cache, which is not allocated to processes. Your processes actually occupy roughly 136MB of RAM.

by Lumbergh on Thu 10th Jun 2004 11:48 UTC

A couple things:

First off, Carmack does use DirectInput and DirectSound for win32. Download the quake I or quake II code and see how he just abstracts out the OS specific calls. He's always used Opengl for video hardware acceleration.

As for fanboyism. All I can say is that when I get involved in social activism it's a lot more meaningful than just software. The problem with you people is that you take software way too seriously. If you want to be an activist, be an activist in something that really counts.

Anyway, this is way off topic and it's very early on this part of the planet.

by Edward on Thu 10th Jun 2004 11:52 UTC

Brad Griffith is right. The ATi driver offerings for Linux really are complete pants when it comes to 3d. My brother has a laptop with a 'supported' chipset, and despite six months of on and off fiddling by both of us (he's just as much a 'fanboy' of Linux as me), we've completely failed to get anything even approaching decent OpenGL performance out of it. Crack-attack is playable, but that's about it.

by raver31 on Thu 10th Jun 2004 11:54 UTC

I also played all those games under XP and Linux, and I disagree with you.
It was a 50/50 split as to which had the better framerate. Like you said, UT2k3 is better on Windows, but what about the rest?
And the machine I tested on had one of them ATI cards... you know, the ones Linux support is supposed to be rubbish with?
But indeed, please refrain from spreading FUD unless you are prepared to back up what you say with proof.

BTW, what about Enemy Territory? Meet me there and I will kick yer ass into next week ;)

FreeBSD is the best alternative
by rycamor on Thu 10th Jun 2004 11:56 UTC

My Dell Inspiron laptop has only 128 MB Ram, and FreeBSD 5.2.1 runs like a champ. I have the heaviest possible install of KDE 3.2.2, with the Arts sound daemon, and many services running in the background, since this is a development machine: CUPS, Apache/PHP, PostgreSQL, Sendmail, plus standard system services. And I am running with ACPI for power-down and battery notification.

Now, I will admit that I have tweaked performance a little bit, with the following startup parameters:

sysctl kern.ipc.shmmax=67108864
sysctl kern.ipc.shmall=32768

And, I have reduced the standard number of ttys from 7 to 3.
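For anyone wanting tweaks like these to survive a reboot: FreeBSD reads /etc/sysctl.conf at boot, and the tty count lives in /etc/ttys. A sketch using the same values as above (needs root; paths per standard FreeBSD conventions):

```shell
# Persist the shared-memory tweaks across reboots; FreeBSD applies
# every line of /etc/sysctl.conf at boot time.
echo 'kern.ipc.shmmax=67108864' >> /etc/sysctl.conf
echo 'kern.ipc.shmall=32768'    >> /etc/sysctl.conf

# Reducing ttys: in /etc/ttys, change "on" to "off" for the
# unwanted ttyv entries, then tell init (PID 1) to reread the file:
# kill -HUP 1
```
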

But that's it. I am running the standard base kernel, not recompiled for efficiency, and all the cool desktop stuff. Very rarely do I get a windows lock-up, and I think that is because of the buggy IBM display driver (apparently, XFree86 3.4 is supposed to fix that, but packages are not yet available).

I will say though, that Slackware properly configured ran almost as well on this laptop. I wouldn't even start to try Fedora, Mandrake, or Suse on this system, though. Nor would I even begin to run Gnome... no thanks.

Yes, I would like more RAM, and sometimes I run WindowMaker, if I want more performance for a specific task (such as Gimp), but overall, it is quite useable; I often run Mozilla + many Konsole sessions + Kate + Gaim + Xmms + other various programs concurrently with no problem at all.

Windows XP vs. Fedora Core 2 vs. Longhorn
by burntime on Thu 10th Jun 2004 11:58 UTC

I do not know if someone already posted something comparable.

Windows XP was designed and optimized in 2000/2001 for the computers of those years and even earlier. Fedora Core 2 was released in 2004, and the computers of 2003 and 2004 usually have more memory and faster CPUs. In my opinion it is OK for developers to optimize their software for this hardware.

I have been on a MS conference recently (shame on me as a FreeBSD and Linux user/developer owning a Powerbook with Mac OS).
They gave an introduction to Longhorn. For best performance and all the details, Longhorn will need a 128MB 3d-accelerated graphics card - just for desktop use!
Longhorn will be released in 2006 (or probably later), so that seems to be a fair assumption about the consumer hardware widely in use by then.

To me it is one of the important benefits of Linux/FreeBSD and open source in general that I can have an up-to-date operating system and applications any time.

Speed comes in many ways
by Marcel on Thu 10th Jun 2004 11:58 UTC

Firstly, I was a bit amazed that the writer was surprised a modern-day OS requires at least 192MB of RAM.

256MB is the default for any XP machine, and you will need it when you run more than 2-3 apps at the same time. We order PCs with 512 now (Windows software is a resource hog too).

Quote: His box, a 600 MHz 128MB RAM system, ran Windows XP happily

I cannot quite imagine that, because if you start e.g. Word on a machine like that, XP starts swapping like hell on the hard drive.

What makes a machine snappy:

- Enough RAM
- speedy harddrive

A PC with an older CPU can feel pretty snappy when it contains a fast hard drive and enough RAM; a new PC, low on RAM and with a slow (5400RPM) hard drive, can feel like a 486.

RAM modules and hard drives are dirt cheap these days. Barebones aren't expensive either.

If you need performance, stick to older OSes and software, or buy a faster box. In the case of Linux you can always try one of those slimmed-down distros.

Beyond that, it's up to the user to decide what's fast and what's slow. Some don't even notice their machine is as slow as hell :-)

> bitterman
by garapheane on Thu 10th Jun 2004 11:59 UTC

Nah, boot time is not everything, but it is something to refer to. All systems were configured to achieve at least fast GUI response in normal use (nothing bigger than Firefox + OOo running back-to-back + alpha).

One point I should make is that FC1 with the default BlueCurve/GNOME and no boot-time optimizations takes a century to boot, and I'm not even close to being impressed by the GUI speed. My FreeBSD/Fluxbox setup works just fine and bloody fast on the 1GHz laptop; I'll stick with it for as long as it lasts. I'm really not that sensitive to GUI consistency, as long as I don't find it too counter-productive. And I've been in love with FreeBSD for some time now (almost 2 years).

I think the desktop-oriented hobby OSes will make a big change in computing in the near future. Free is not everything; the same applies to open source.

I knew it
by John Blink on Thu 10th Jun 2004 12:01 UTC

This is fast approaching 200 Posts.

Anyway remember this excellent OSNEWS interview.

Havoc says better profiling tools will help to speed things up.

Well KDE used valgrind, what about GNOME?

RE: Developer decisions catching up to everyone
by Kevin on Thu 10th Jun 2004 12:01 UTC

"OpenOffice is a Java based application, and I'm sorry to say it but Java is *not* a good language for GUI things."
No, it isn't. If you check the sources, you'll see it is C++.

not really a windows fan, but...
by garapheane on Thu 10th Jun 2004 12:08 UTC

And may I add that my WinXP machines see 30-40 days of uptime with no performance hit, except for one thing: Explorer sometimes eats up memory, 100MB+. The solution is to kill Explorer and restart it. I'm really happy with it, since I use it casually and for gaming (mainly Lineage2). It also runs Apache/PHP/MySQL and the Mercury mail server in the background all the time, and I still have an average of 400MB of free physical memory with those running. WinXP is stable and also flexible. I like Win2003 better, but can't afford it.

I'll just pray that BeOS will once again become feasible.

Choose a toolkit, and stick to it
by Saltwater on Thu 10th Jun 2004 12:08 UTC

Choosing a toolkit (GTK or Qt) and sticking as much as you can to applications that use it helps. I'm not surprised when the author says that the combination of KDE/Mozilla/OOo eats memory: they all use different toolkits, so more libraries have to be loaded into memory. I only occasionally use GTK apps (Gimp and Sodipodi) and try to use as many KDE (Qt) applications as possible. I've got 256MB of RAM and have no performance problems whatsoever.

Don't blame Linux
by panda on Thu 10th Jun 2004 12:10 UTC

Blaming Linux is useless, because the source of the problem is the bloated DEs.

I love BSD and I love Linux. Real "geeks", like myself and thousands of others, can use BSD or Linux without a GUI. I think the CLI is great, and talk about low memory footprints.



Take Into Consideration
by Alan on Thu 10th Jun 2004 12:13 UTC

I would ask that everyone take into consideration their demands before getting too worked up over RAM requirements.

Talk lately has for the most part been about features. Everyone wants features. They want translucent windows. They want wicked visualizations. They want desktop integration. They want hardware integration. They want little graphical thingies on the desktop that tell them how much hard disk they have left.

The developer community is, for the most part, pursuing adding these features that everyone wants. But that takes a lot of work. And it takes directing programmer focus on those tasks. So you have to take your pick, many times, between running the software on a faster computer, or having cooler features. Or of course, you can personally open up the source code and optimize it yourself.

>troy banther
Nice one! But watching movies in ASCII is no fun (well, not always ;) )

RE: take into consideration
by garapheane on Thu 10th Jun 2004 12:19 UTC

I think they should audit and optimize first, then add features. And the suggestion to have a periodic feature-freeze and optimize/cleanup is really cool.

speed is everything
by Anonymous on Thu 10th Jun 2004 12:21 UTC

Whether or not Linux is getting fat isn't the real issue here. Speedups are a good thing anyway, and need to be a priority.
There is indeed a good reason why I stick to Fluxbox: the bloaty DEs' loading time is longer than LOTR. Firefox is damn slow to load too. It's a shame.

RE: garapheane
by ponds on Thu 10th Jun 2004 12:22 UTC

You can watch movies in the CLI with full graphics.

snappy? BeOS!
by Julian on Thu 10th Jun 2004 12:25 UTC

Gentoo with GNOME 2.6, bmp, nicotine and Epiphany (10 tabs) takes 166MB here. It runs very smoothly, although there's no comparison to BeOS.
But I totally agree with the author: this is all getting way too heavy, and I will have big fun installing GNU/Linux on my brother's Celeron 433 with 96MB RAM (on an i810 motherboard, meaning 96MB shared between the OS, graphics and sound). I installed Mandrake 9 on a similar system (64MB) back when it was new (Mandrake 9, of course, not the computer), and it was crap slow: windows didn't redraw, some apps failed to start, and you couldn't even get continuous sound playback.
I love BeOS, and still use it when I only want to listen to some music or watch videos. That is what you call snappy. If you don't believe me, do a stress test: open 10 videos at the same time, then start moving windows around. If it didn't lack a decent word processor, I would already have migrated all my family and friends who want a computer that "just works".
And I will have to try FreeBSD. I installed 5.1 a year ago or so, but X refused to work on it, so I deleted it.

@ lemmingburgh
by phonetic on Thu 10th Jun 2004 12:25 UTC

Hmm, I'm an activist for many things, mostly while I'm not studying I'm working to raise funds for Kidsmart, a program run by the bluelight council to educate and protect young children.

If you think software is innocuous, you should think again, and learn about what is going on and how it will affect YOU in the not too distant future.

Anyway, most of this is moot. I gave my experiences, you called me a fanboy. You were right, but you meant it in a derogatory sense, which makes you the one in the wrong, just like deriding Islam over Christianity, or Christianity over Islam, is wrong. I'm a Christian, but everyone has to find their own path. Again, this is OT, but I hate trolls like you who mosey in, start a fight, then disclaim all responsibility.

FYI, I picked up that Linux was the development platform from a textbook on 3d programming, which actually was a waste of time. I didn't just jump to that conclusion.

Your claim that every game runs better on Windows has been refuted by 3 people now, and no one that I can see has stood up to support you. Perhaps they run better for you, for whatever reason.

But then, I don't even know why I bother responding to you, you are so obviously a troll.

It's not the linux
by sheedee on Thu 10th Jun 2004 12:26 UTC

I think that being afraid of another open OS (like BeOS) taking the place of Linux is foolish. The fact is that the Linux 2.6 kernel is a VERY fast thing. What makes a Linux box slow is X and Qt/GTK. If other systems like BeOS were to be faster, they would have to use their own GUI and develop their own GUI apps. What we need now is to make X faster, and then take a closer look at Qt and GTK.

But I think this has already begun: kernel 2.6 is MUCH faster than 2.4, and so is KDE 3.2 over KDE 3.1, so I guess we're on the right track here.

out to lunch
by LG on Thu 10th Jun 2004 12:27 UTC

"Typically, open source hackers, being interested in tech, have very powerful boxes; as a result, they never experience their apps running on moderate systems."

Bullsh*t... without any hard numbers my guess would be the exact OPPOSITE. When developers mention system specs in their blogs, they are often between 600MHz and 1GHz. Why? Because there are very few games that push someone to upgrade to the bleeding edge.

Oh, and complaints that XP works on a system but FC2 doesn't are out to lunch, my friend... XP is what, 3 or 4 years old? FC2 is weeks old.

Get a grip.

by garapheane on Thu 10th Jun 2004 12:29 UTC

How about this: have a performance preference at the OS level that apps can reference to optimize for the specific hardware. I.e., have numbers like 1 for sub-100MHz Pentiums, 2 for <250MHz P2s, 3 for <600MHz P2s, and so on, and have these numbers drive the default settings of apps (OS/DE/heavy-duty apps etc.). Then we would have common ground to compare on, or at least reasonable performance by default.
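The proposal above could be prototyped very simply. A hypothetical sketch in shell; the `perf_level` name and the tier boundaries are just the ones suggested in the comment, not any real interface:

```shell
#!/bin/sh
# Map a raw CPU speed (MHz) to a coarse "performance level" that
# apps could consult to choose sensible default settings.
# Tiers follow the comment's suggestion; names are made up.
perf_level() {
    mhz=$1
    if   [ "$mhz" -lt 100 ]; then echo 1   # sub-100MHz Pentiums
    elif [ "$mhz" -lt 250 ]; then echo 2   # <250MHz P2 class
    elif [ "$mhz" -lt 600 ]; then echo 3   # <600MHz P2/P3 class
    else                          echo 4   # anything faster
    fi
}

perf_level 450   # prints 3
```

An app (or DE) could then read this one number at startup instead of probing hardware itself.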

Pardon my English.

by Lennart Fridén on Thu 10th Jun 2004 12:33 UTC

Finally someone grasps the issues at hand. I've been saying and thinking what this article says for years now, and it's refreshing to see that there are other sober people out there.

not many problems here...
by Jared on Thu 10th Jun 2004 12:33 UTC

If anything, usage seems to be going down for some things.

My PowerComputing PowerBase PPC 603e Mac Clone with 48mb of ram could run KDE 1.1 very well. KDE 2.0 was dog slow. KDE 3.2 is also slow, but the applications are usable.

by Thom Holwerda on Thu 10th Jun 2004 12:37 UTC

Wow, I was just out for a couple of hours at university... I left when there were about 12 comments, and now I'm back and it's at 190 :S.

But anyway, I think the author has a very valid point. Of course Linux runs great on older systems, but hey, I get a snappier interface from my Windows Longhorn Build 4074, and that's alpha stuff! It's just a damn shame KDE/Gnome eat resources like hell. And of course I could install Xfce (love it, really), but all the major distros are Gnome and/or KDE based, and they all want the latest stuff.

Speed is no problem on my main machine. It is a problem on my laptop, though (PII 366 with 64MB RAM). I am forced to use Windows ME (*evil laughter*; as y'all know I ain't anti-MS, but ME was just, well, a joke, a bad joke), because MDK just won't run on it. And since it's "just" my laptop, I ain't gonna put hours and hours of tweaking into it.

Off-topic: anyone can recommend me a Xfce based distro that would perform snappy on this laptop?

Good, accurate article.
by scott on Thu 10th Jun 2004 12:42 UTC

Good, accurate article.

right point
by WhispSil on Thu 10th Jun 2004 12:50 UTC

I think your article is excellent and tries to open up some minds.

One piece of propaganda out there is that Linux runs on very, very old hardware. But all of the more business-centric distros are huge resource eaters: Red Hat, SuSE, MDK, Sun's JDS, ...

And if open source doesn't produce more efficient code, is it safe to say that this code is more secure than Windows?

v 200!
by garapheane on Thu 10th Jun 2004 12:54 UTC
Red Hat 3.0.3
by subhas on Thu 10th Jun 2004 12:57 UTC

I remember comfortably running Red Hat 3.0.3 (kernel 1.2.13) with X (fvwm) and using LaTeX to write papers.

Yelp rules!
by ShaunM fanboi on Thu 10th Jun 2004 12:57 UTC

Yelp is so much faster in Gnome 2.6 ShaunM must be a god!

OT: Games
by Riddic on Thu 10th Jun 2004 12:58 UTC

The games I've played so far which have both a Windows and a Linux port are Quake3, UT, UT2k3, Heroes 3 and Tribes2.
To be honest, especially in UT2k3 and Tribes2, I get not only better framerates, but usually a much more stable and smooth game connection-wise when I'm playing over a network (T2 singleplayer is a joke anyway). With Q3 and UT, the framerates in Windows and Linux have been pretty much equal.

I wouldn't go so far as to say that Linux ports are generally better, but saying that they are worse is bollox.
I'm hoping for more native Linux ports in the future, but only Doom3 and HL2 seem to be going that way :/

You should have helped him recompile
by Alan on Thu 10th Jun 2004 12:58 UTC

The beauty of Linux is that there are many ways to improve the speed. The default install of any distro will install a generic x86 kernel. If the author really wanted to help his friend, he should have helped him learn how to recompile the kernel for his specific platform. I have an old AMD K6-2 running Debian unstable with KDE. Ever since I recompiled the kernel for that specific processor, it has performed perfectly.
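For reference, on a 2.4-era system the recompile described above might look roughly like this. The exact steps vary by distro and kernel version, so treat it as a sketch, not a recipe:

```shell
# Configure the kernel for the actual CPU: in menuconfig, set
# "Processor family" to your exact chip so the compiler targets it.
cd /usr/src/linux
make menuconfig

# Build (the 'dep' step is required on 2.4.x kernels), then
# install the modules and copy the image into place.
make dep bzImage modules
make modules_install
cp arch/i386/boot/bzImage /boot/vmlinuz-custom

# Finally, point LILO or GRUB at /boot/vmlinuz-custom and reboot.
```

The gain comes mostly from the `-march` flag the processor-family setting implies, plus leaving out drivers the machine doesn't have.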

Its true...
by Christopher X on Thu 10th Jun 2004 12:58 UTC

When I first started using Linux, around 1998 with Red Hat 5.2, its system requirements were way lower than Windows NT's or 9x's. It was very comfortable on my Pentium 133MHz with 48 megs of RAM, while either Windows was less so. What the hell happened? GTK? Gnome and KDE going feature-crazy at the expense of RAM? It's nuts...

by The Daemon on Thu 10th Jun 2004 12:59 UTC

Why don't you send Linux to /dev/null and install FreeBSD instead? Stop crying about Linux getting fat. If you don't like the way things are going, switch to something else and shut up instead of wasting time writing about how Linux got fat.

by Anonymous on Thu 10th Jun 2004 13:09 UTC

I used to use Mandrake 9.1 on my old Pentium III 450MHz with 64MB RAM. It FLEW on it. It even supported 3d acceleration and I had a lot of fun with it. Here are some hints:

Disable any unnecessary services.
Don't install both KDE and GNOME; choose one.
Make sure you enable DMA on your drive.
Don't use Nautilus on GNOME versions before 2.6; it sucked in speed before then.
Get 1.1.1, the 1.0 version was extremely slow.
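Two of those hints (DMA and services) can be checked from a shell. A hedged sketch, assuming an IDE drive at /dev/hda and a Red Hat-style init with chkconfig; both are assumptions, and the commands need root:

```shell
# Check whether DMA is enabled for the first IDE disk; the output
# includes a line like "using_dma = 0 (off)" or "1 (on)".
hdparm -d /dev/hda

# Turn DMA on for that disk.
hdparm -d1 /dev/hda

# List which services start at boot on chkconfig-based systems,
# to spot candidates worth disabling.
chkconfig --list | grep ':on'
```

DMA in particular makes a dramatic difference on older machines, since without it every disk transfer burns CPU time.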

I now have an Athlon XP 2000 with 768MB RAM. It absolutely flies. It even outperforms my brother's 3GHz PC running XP.

I still run SuSE 9.0 on my Duron 800MHz laptop with 128MB. A hell of a lot faster than Windows XP on it, plus it fits nicely in a 6GB partition that was somehow "conveniently" left empty by SONY.

P.S. Look at the requirements for the longhorn betas, you will cry!

Strip down linux
by Eddie on Thu 10th Jun 2004 13:10 UTC

Who says you need 256MB or more to run Linux? Unlike Windows XP or Windows 2000, you can remove parts of Linux and install whatever you want.

Recently I built up both RH 9 and FC1 with bare minimum settings on a Pentium III 850MHZ machine with 384MB RAM and on an Athlon 2000XP+ with 256MB ram. Stripped it down to the essentials, upgraded the core components, then installed only what I needed or compiled it from source code.

I do agree that Linux is getting more and more disk intensive but you can still control it.

windows xp vs. any linux distro
by Ben Weaver on Thu 10th Jun 2004 13:13 UTC

My mother needed a machine to check email, chat with friends, and read her old Corel WordPerfect files. I bought her a "new" P2 350MHz with 128MB RAM, 4GB HD, and a 4MB Rage Pro card for $10. She already had a 15in monitor from her old computer. After reformatting the hard drive I installed XP and the latest WordPerfect, threw in some extra tweaks, and it runs great! I mean, I was surprised. I could never have done that with a Linux distro and expected her to know how to do anything. I wonder why everybody else is having all this "disk thrashing" they talk about with old systems and XP.

by Seo Sanghyeon on Thu 10th Jun 2004 13:13 UTC

>Don't use Nautilus on GNOME versions before 2.6, it sucked in speed before then.

This is again a good point. Pre-2.6 Nautilus did MIME sniffing instead of looking at file extensions. I am sure you could turn off MIME sniffing in pre-2.6 Nautilus too... but it was not the default.

The point is:
by Anthony on Thu 10th Jun 2004 13:15 UTC

This article is speaking from the newcomer's point of view. When you look at Windows XP, are you thinking of Internet Explorer? Likewise with Linux: a newcomer's point of view takes in the "total package", even when that's clearly not what Linux is. To adopt GNOME/KDE, you must adopt Linux in some way, shape or form. So in actuality, Linux IS getting FAT. I love the bells and whistles of KDE and Gnome, but let's face it, those two are pure hogs. For Linux to make it in the desktop realm, there really has to be work done on X and the big two. Or have an indie come out with a brand new environment that embraces all of those features and then some, without being such a pig. Or just break out de ole serial port and have some good console luvin'.

This article was well written, well paced, relatively typo-free and balanced. I am greatly pleased by it in both form and content. I myself have felt that Linux was slower than it should be (as is Mozilla), and I fully agree with and support everything this author said. I notice that there are over 200 comments posted already, so I assume there is an amazing amount of religious argument going on. I will have to check out the comments when I have time (I hope it isn't a waste of time; I sometimes find myself wondering why I bother).

I hope a great number of developers seriously consider this article's points because they are valid. Not just valid: right on target.

This thread is also getting fat
by fatty on Thu 10th Jun 2004 13:23 UTC

I bet this will be the first OSNews story ever to reach 1000 comments.

Are you telling me...
by mario on Thu 10th Jun 2004 13:27 UTC

...that all those swearing by Fedora Core have 256 MB of RAM? This makes me feel out of the loop. The older Linux distros work fine for me on the desktop (which is what FC is aiming at, apparently), and there are also tons of drivers for them. I see no reason for FC to be taken seriously for the corporate desktop; it has placed itself out of the market with such steep requirements.

I wrote a rant...
by Punk Walrus on Thu 10th Jun 2004 13:29 UTC

This made me so frustrated, because Linux != OS. Linux is the kernel, much as ntoskrnl.exe is the kernel of Windows.

The FC2 *distribution* is slow, yes, I agree, especially on older hardware. But that's because it loads GNOME and a ton of other things in the background. So I wrote a short counter-article.

When did a 128MB/256MB become standard for you?
by hksdu on Thu 10th Jun 2004 13:31 UTC

I know many of you have upgraded your hardware to 128MB or 256MB and a minimum 500MHz CPU, but wait, let's review the points the article makes. The author means there are still tons of people with 200MHz, 64MB boxes sitting at home that cannot be upgraded any more, because it isn't worth it. So what OS alternatives do they have, given that Windows doesn't work very efficiently on those boxes, even Win98 with modern apps? How can you be satisfied running only 4 apps in 128MB on an OS with nothing included? When did this RAM-hog revolution start? 4 apps and 128MB? Even on Linux. I am not saying Linux is bad; as someone in the posts said, it's the apps' problem, not Linux itself, and IMO he's so right. I was running Debian with a 2.4.22 kernel and KDE 3.0 on a Duron 600MHz with 128MB. Without KDE, running ICEWM and a few other things, it's lightning fast, and even KDE 2.1 is still quite fast. By fast, I mean I run KDE 2.1 with xmms, aMSN, Gaim, Mozilla 1.5, OpenOffice and 3 or 4 Kate windows for debugging my code. Seriously, with a 1GHz CPU and 128MB RAM, I would expect to run at most 4 apps smoothly in XP Pro, but at least 10 apps in Linux without problems.

Someone pointed this out already, but I would like to mention it again. After I installed Debian, it booted fast enough; it doesn't start anything I don't really need. But SuSE 9 and Mandrake 9.2 load tons of background services I don't need, and they boot very, very slowly compared to my Win2k box. This is ridiculous. I really appreciate the performance of Linux, and I wish I could contribute to improving it. Wait for me, guys, though it may take more than 50 years... haha

The desktops do a lot
by Bryan Feeney on Thu 10th Jun 2004 13:33 UTC

Featurewise, the Linux desktop (particularly under KDE) does more than the Windows desktop. I'm thinking of io-slaves, advanced theming, better security, and greater scope for user-scripting (DCOP/DBus). This takes more clock-cycles and more memory.

Since 2.2, KDE has been using fewer and fewer resources. This is from the developers, who know a lot more than how to run "top". Qt4 will be speedier still. I don't know about Gnome, so I won't waste time on uninformed conjecture.

X isn't actually as big and bad as many people like to think, particularly with regard to memory, and once the clean-up gathers momentum, it's going to be even less of an issue. However a lot of toolkits don't use X in an optimal way (there was an article on OSNews about this a while ago).

I remember using KDE 0.99 on SuSE 5.2 in college back in '98, when the only decent browser for Linux was Netscape 4.7 (or possibly 4.65 back then), and it took up about 30MB. So don't get started on Firefox's usage, particularly given that it's a dramatically better browser.

The fact is people wanted more features from the Linux desktop, and the devs delivered. It quite probably takes up more resources than strictly necessary, but not significantly so, in my opinion. Windows XP may not be as much of a resource hog, but frankly the bits and pieces we hear about the Windows OS family don't give me a lot of faith in the development process that led to that performance.

Barring the lack of configuration tools, with a bit of tweaking (e.g. creating a KDE desktop shortcut to "system:/" called "My Computer"), Linux offers an extraordinarily nice desktop experience which betters XP's in several respects. Things such as multiple desktops, easy mime-type editing, tabbed browsing, spellchecking in web-forms, popup blocking, secure worry free email, document previews in file-managers, printing to PDF/fax/email, easy inline compression and encryption and the IO-Slave (or VFS) architecture are all things that add value to the user.

As for the need to cut down, the devs already know this and are working on it. In the last year, KDE has reduced its resource usage, a ton of work has been done on desktop responsiveness in the kernel, and Trolltech has announced performance enhancements for Qt4, not to mention lots more. If this article were a comment, it would be marked as flamebait. Instead we get a sub-Slashdot run of whinging and trolling.

Linux is getting fat?
by two cents on Thu 10th Jun 2004 13:38 UTC

Wait a second, since when is one distribution considered to be all of Linux? Isn't Fedora the distro being reviewed here? Why in the world would someone make an all encompassing statement like that?

Why don't you take a look at Gentoo or Debian or Slackware. I personally use Gentoo and I know that it runs circles around Fedora in terms of speed.

So please, lets remember that Linux is still nothing but a kernel. If Fedora folks decide to fill it with bloated software, then lets acknowledge that instead of saying that Linux is getting fat!

Its only going to get worse
by troll on Thu 10th Jun 2004 13:42 UTC

Looking at the upcoming Debian Sarge on the mailing lists, the following requirements are needed

Processor : 2Ghz
RAM : 512MB
Hard disk. 5 Gb min install, 25 Gb full install
Comes on either 13 CDs (all mandatory), plus 30 "Extra" cds or 7 DVDs. They even plan to sell 250Gb hard drives with it pre-installed to lighten the load.

I also cringe at the upcoming GNOME 3 that is due in 2005. This is their answer to Longhorn. It will need at least 768MB (1.25GB to run comfortably and up to 3GB to run at full potential). It also requires a mandatory 64-bit processor at 2500MHz or more. That's currently only available on highly overclocked Opterons or the new G5!

As for Debian and GNOME: it is going to be huge! I hope someone knocks some sense into those idiots.

FC2 and Gentoo
by Steve on Thu 10th Jun 2004 13:43 UTC

I'm here to tell you my brother's FC2 install on a 1800+ with 512MB of RAM, a cheap mobo and a slow hard drive is faster than my Gentoo install on a 2200+ with a nice hard drive and an expensive mobo... I believe this is because I can't prelink Gentoo correctly... but his is still faster.

I can't even read the comments...
by Joe on Thu 10th Jun 2004 13:43 UTC

It's all so ignorant, between blaming "Linux" or "X"... It has nothing to do with either. Run "top" and see what is using the memory. KDE and/or GNOME are the "problem". Press a capital M in top to sort by memory usage and see for yourself.
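For anyone who wants to check this for themselves outside of top, here is a rough sketch; it assumes a GNU/Linux `ps` (the flags differ on the BSDs):

```shell
# List the ten processes with the largest resident set size (RSS, in KB),
# plus a header line. GNU procps syntax.
ps -eo rss,comm --sort=-rss | head -n 11
```

On a typical 2004-era desktop the top of that list is the desktop environment and its applications, not the kernel.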

Qt and KDE are expecting 15 to 25 percent smaller memory footprints in version 4, so if you are too stingy to upgrade and you want to use KDE, just wait. Otherwise, use WindowMaker, Fluxbox, IceWM, Blackbox, etc., because GNOME/KDE are not shrinking anytime soon.

Besides, this "Special Contributor" is just plain wrong.

Repeat after me
by Mr. Banned on Thu 10th Jun 2004 13:45 UTC

1. Memory is cheap. It might be slightly more expensive than at this time last year, but go back five years to 1999 and consider the difference. I honestly don't understand why anyone runs any OS on less than 512MB of RAM. 512 is the sweet spot where your PC starts happily pushing and pulling bits out of memory, instead of having to spin up the HD to hit virtual memory all the time. A gig or higher is cooler yet for us power users (my x86 box is at 1GB, and as soon as I have the $$ I'll be at 3GB; my Mac will be up to 2.5GB ASAP as well).

2. Most modern programmers are lazy, and have never learned, nor had to learn due to the nature of modern hardware, how to wring the last few cycles out of a CPU.

Back in the days of the Amiga (no, not this swan song called OS4, but the real Amiga), and in the early x86 days, it was every programmer's duty to get as much as possible out of the hardware. This meant they had to know the hardware as well as the software. That isn't true anymore (unfortunately).

Today's programmers learn to program using established IDEs, and they do so using modern, "make it easy" languages. Most programmers couldn't handle assembly if they needed to, nor do they know the ins and outs of the CPU and associated hardware.

Thus code is only optimized as far as their compiler knows how. They don't go "outside the box", so to speak.

To be fair, the advancement of PC technology has been so brisk these last 10 years that it'd be hard for a lot of people to keep up with the changes. The commoditization of today's programmers is largely a result of taking all these advancements and leaving it up to a select few to understand them. The rest simply use the development tools that those select few code for the world.

The long and short of this is that today's code expects to be run on a modern PC, loaded with a fair amount of RAM. I've always argued that 128MB is too little for anything over Win98, and people have always been amazed at the increase in speed, as well as the overall increase in stability, that 512MB or more brings to the table. Anything less is simply crippling the performance of your PC.

3. Linux is not an island. All of the above also applies to Linux. Sure, Linux enjoys a higher percentage of "geeks" who know the hardware well, but the majority of your applications are being put together by those same "cookie cutter" programmers. The fact that they've chosen Linux as their platform of choice is good, but along with the genericism of the programmer comes bloat and higher PC requirements.

I agree that there are many instances where a program or OS is simply too bloated, even considering the above, but complaining about needing 512MB to run a modern OS in today's world is just being unrealistic.

Welcome to the 21st century, friend! Buy a half gig of memory for your friend's 600MHz box for about $50-60, and you'll be amazed at the difference, instead of complaining about it!

What can happen now . . .
by Art on Thu 10th Jun 2004 13:46 UTC

It's a shame that so many of you think this article is a troll; some very good points are raised. The fact of the matter is that modern Linux desktop environments are slow as hell. I love GNOME, but I wouldn't even consider running it on anything less than a P4 with 256 megs of RAM.

However, there is hope. The beauty of Linux is choice. Fluxbox and XFCE4 do a great job of doing a lot with few resources.

There are three things that can happen now.

Outcome 1: KDE and Gnome can keep adding more features and bloat with little to no consideration of performance. Mono or maybe Java will be integrated into these DEs to make rapid development possible with a tremendous performance hit. People who don't need all the fluff will switch to a distribution which offers an alternative desktop environment as the default. Gnome and KDE will lose popularity.

Outcome 2: KDE and Gnome will reach a "feature plateau" where it is comparable with Longhorn and then buckle down and do some serious optimizing. With Novell backing Gnome and trying to replace Windows in a corporate setting, this is looking more and more likely.

Outcome 3: People get fed up with this "Linux thing" and BSD becomes the trendy OS du jour. FreeBSD is looking very tempting right now for my server.

Although you might not be able to tell from this post, I love Linux. My 1 GHz laptop with 384 megs of RAM runs Gentoo and Gnome 2.6 great with all the bells and whistles. My server, an old 400 MHz PII with 192 megs of RAM runs Gentoo with Fluxbox adequately.

by jmm on Thu 10th Jun 2004 13:47 UTC

Then "don't do that" (load the latest and greatest distro). _Fall back_ a few versions - damn - even RH7.3 is not a disaster...can't be worse than MS security-wise.

I agree, technically, that the well-known GNU/Linux _distros_ (I consider the kernel a separate component of the whole system) are cramming more and more software into their offerings, and the leading desktop _environments_ (GNOME, KDE) 'seem' slow to me no matter what hardware I run them on. I can say, after many years of using and admin'ing PCs and networks, that _speed does count_. Users appreciate _far more_ that things work fast and are "usable" (which includes app response times) than whether the machine is i486, i586, P4, etc., or even what version (OS, Office) of this-or-that is running.

Keep in mind that "slow" and "fast" are very subjective experiences for the user; only hard benchmarking can settle it (would we all agree that apps that start up with subsecond responses are "fast"? What, exactly, is your "fast"?). I've worked in terminals for years (VAXen, HP3000, *NIX, IBM360, etc.) - to me, ANYTHING that responds in more than 3 seconds is "slow"... but that's my _subjective_ (collective) experience. I will not wait more than two seconds for an app to respond. I use the GNU/Linux distro/WM combo that gets me the _response_ times I want. I will tweak sometimes, but try to avoid it.

I've tended to stick with Slackware as it seems easy to install and is flexible. Swaret for updates works well. The "all packages" installation for Slack only recently moved onto a second CD (for GNOME and KDE). To me it's still the best "tightest, all around, runs on anything" distro. There are other good ones though: Vector, College, Arch, DamnSmall, etc. Then, there's always the BSD 'family' of OSs...

My current main machine: Dell Latitude Xpi, 133MHz, 40MB RAM, Slackware 9.1, Fluxbox _and_ wireless networking (!) via a Linksys (orinoco) PCMCIA card. Xterms load in 1 second (as many as I can fire off), mutt loads in 1 second, gvim starts up in two seconds. I've done NO tweaking to this unit - it was 'install and go'. I web browse with Lynx (starts in a second). Mozilla and XMMS actually WILL run on here once started, but don't respond to my liking overall... but again, that's me. I can't get DOS or W98 to respond like this on this laptop, or the apps aren't there, so for me GNU/Linux is truly a "step up" and is using this machine to its fullest.

My take is that we _stop worrying_ about _converting_ desktop users from Windows to Linux - do you think for a minute that Linus cares or is worried about it ? - go after the other _billion(s)_ (of) people who don't have a machine at all. You don't switch a friend or relative to free software OS's to prove what IT 'chops' you have, you do it because (and ONLY when !) it's easier, faster, less expensive, etc. (i.e. the "right" reasons). If GNU/Linux is not a _step up_ from MS software then there's no reason to move to it - isn't that, ultimately, what you're getting at ?

RE: Anon
by Tom Nook on Thu 10th Jun 2004 13:47 UTC

"Why don't you take a look at Gentoo or Debian or Slackware. I personally use Gentoo and I know that it runs circles around Fedora in terms of speed."

If you actually read the article, you'll see the writer discuss that. Gentoo, Debian or Slackware are NOT suitable for newcomers. We may be able to use and tweak them, but newcomers are going to get a bad impression from the "friendly" and ultra-bloated distros.

RE: jmm
by Tom Nook on Thu 10th Jun 2004 13:50 UTC

"Then "don't do that" (load the latest and greatest distro). _Fall back_ a few versions - damn - even RH7.3 is not a disaster...can't be worse than MS security-wise."

It is worse though. Red Hat 7.3 is unsupported (the as-yet unproven community legacy projects aside). Meanwhile, Win2k is still supported.

You can buy an OS from Microsoft that only requires 64MB: Win2k. It'll be supported for a while yet. With the friendly desktop distros, you can't do that; you install SUSE/Mandrake/Fedora and it requires more resources, and won't be supported as long.

This is a problem facing Linux ;)

by th on Thu 10th Jun 2004 13:51 UTC

I don't understand why someone expects a distribution released in 2004 to work with hardware from 1999.

Run an OS from 1999 and you will be fine. Stupid "article".

An excellent article
by Dean on Thu 10th Jun 2004 13:53 UTC

It's been a long time since I started using *nix for my entertainment, and it's rare to read a piece as good as this. Almost all of it is true.
Thanks for the warning about the obesity of Linux code. Keep the core of programs as classic as possible.

RE: Mr Banned
by Tom Nook on Thu 10th Jun 2004 13:55 UTC

"but complaining about needing 512mb to run a modern OS in todays world is just being unrealistic."

No it isn't! You think everyone in every country can just buy sheds of RAM like that? You think laptop users aren't restricted in what RAM they can add? You think businesses want to keep buying more RAM for 100,000 boxes to run this Linux thing, which we've advocated as better than Microsoft?

The author made a good point: there are millions of 32 and 64MB boxes in companies around the world. Linux should be providing them with an opportunity! But they can't run KDE/GNOME/OpenOffice/Moz because these apps are so bloated. A market for Linux lost.

"Buy a half gig of memory for your friend's 600MHz box for about $50-60, and you'll be amazed at the difference!"

Why should I have to? Why can't programmers actually THINK about performance and elegant design? Is that too much to ask? This is the point many people on this thread are making. Chucking more and more RAM at a problem doesn't make it go away.

RE: th
by Tom Nook on Thu 10th Jun 2004 13:58 UTC

"I don't understand why someone expects a distribution released in 2004 to work with hardware from 1999."

Why is that so much to ask? Videos from 2004 work on 1999 video players. Music CDs from 2004 work on 1999 CD players. Fuel from 2004 works in 1999 cars. It goes on and on and on.

Why SHOULDN'T software work on slightly older machines? Yeah, there's no need to make it cater for old 386 boxes, but telling people to upgrade every five years is terrible. And it's bad for Linux against Microsoft. Win2k works fine on 1999 machines, and it's STILL SUPPORTED. All the Linux distros made at that time are NOT supported now.

As has been said, this bloat is causing slow takeup in the corporate space. Now companies are forced to upgrade 100,000+ machines to run Linux, just like they have to with Microsoft. That's APPALLING.

re: Its only going to get worse
by Seo Sanghyeon on Thu 10th Jun 2004 13:58 UTC

troll? What an apt name for you. Debian Sarge minimal install 5 GB? Please, don't write such a blatant lie. I have GNOME, KDE, XFce and Mozilla fully installed and it is under 2 GB here.

Yes, Sarge will be 13 CDs. But only the first CD will be mandatory. Debian has always been installable from a single CD.

Partly True
by Anmol Misra on Thu 10th Jun 2004 14:04 UTC

True, Linux is a kernel and not an OS. But talking about distros, Red Hat/Fedora, SuSE and Mandrake are all getting bloated. I had a P3 866 machine with 128MB of RAM, and Red Hat 8 was slow on it and 9 was almost unusable (even after turning off all the crap). So was Mandrake. BSD was fast only without a DE. I currently use VMware and run Linux on it. But to my horror, most distros are slow on VMware as well, no matter how much memory I give them. The only option seems to be running Debian/Slackware on it. Unfortunately, I have not been able to install Debian on it. Can anyone point out how to install Debian on VMware?

gnome's weakness
by asafas on Thu 10th Jun 2004 14:05 UTC

is just the insane memory requirements. 2.6 is even worse than all before. :/

Re: BeOS
by walterbyrd on Thu 10th Jun 2004 14:05 UTC

>>BeOS wasn't cool for no good reason. BeOS can make 6 year old hardware feel fast. <<

Not my experience, not by a long shot. I still have my boxed BeOS 5.0 Pro edition. Not only did I find that it wasn't any faster than Windows, the setup was absolutely awful - primitive even for its day.

It's been a long time, but as I remember, everything was centered around floppy disks. I had to make all of these floppy disks, and I had to reboot constantly. Also, I think I needed Windows just to install BeOS.

BeOS didn't feel any faster than Windows 95 or Windows NT 4.0 to me. When I read posts raving about BeOS, I just have to scratch my head.

by Luke McCarthy on Thu 10th Jun 2004 14:06 UTC

I guess I won't be getting involved in this discussion. Is this the most comments ever on an OSNews story?

by Luke McCarthy on Thu 10th Jun 2004 14:09 UTC

Maybe you did not have accelerated drivers for your graphics card? That makes a hell of a difference.

*cough*AOL*cough* ;-)

Try NT 4.0 on a 120mhz/64mb system
by walterbyrd on Thu 10th Jun 2004 14:09 UTC

In my experience, this is very snappy, and it takes way less than 100MB of HDD space to install. Add MS Office 97 and you have a very snappy system that can do about 100% of what most office workers need to do.

Still the Linux zealots carry on about how Linux lets you leverage your old hardware.

Cheap memory
by osnewsvisitor on Thu 10th Jun 2004 14:09 UTC

Since when did memory get cheap?

Did I miss something?

@raver31 @edward
by Lumbergh on Thu 10th Jun 2004 14:10 UTC

Sorry, but this is my first ATI offering. It's fast on Linux too - none of those crappy open source drivers. I've been using Nvidia since the Riva 128 days: Riva, TNT 1 and 2, GeForce 3. The Quake series was always faster on Windows.

Finally, someone with the courage to say it!
by BM on Thu 10th Jun 2004 14:12 UTC

I've been saying it for the past two years: Linux IS actually very fat! At least as an average user might see it. Your article is objective in addressing this point. Congrats!

As someone pointed out already, it is true that Linux is only the kernel, but that does not mean a thing to normal users. Actually, it means nothing even to advanced users, for we can do nothing with ONLY the kernel!

When I started with Linux, I ran the more friendly distros, like Mandrake, Conectiva or Red Hat. Yes, they were extremely slow and bloated! I kept asking myself why in hell this OS would come with so many text editors!!!! In Mandrake I could count 9 of them just out of the box!!!

Those kinds of problems made me move to Debian, which is so much lighter, though I couldn't tell you why. I got used to hacking scripts and reading logfiles, so I've stuck with Debian since then and my desktop is 100% Debian. But I must say: even Debian is slow on low-end machines! And don't expect average people to hack stuff or want to learn this nerdy business. They just want their applications to open in about 2 seconds, not the 5 minutes I have already seen!

Reread the article
by Shawn Barrick on Thu 10th Jun 2004 14:14 UTC

It doesn't matter how fast new hardware is coming out, or what its specs are. It doesn't matter how cheap 256MB of RAM is.

I've got a stack of PIIs here at work after our recent rollout, and it's a pain to install recent distros on them. I want to show my boss the value of older hardware and Linux, and spending a week tweaking a distro or buying hardware upgrades, no matter how cheap, ain't going to do it.

The most absurd thing is the installers. I can run SuSE 9.0 on some of these (with fvwm or something), but the fancy-dancy installer doesn't like less than 128MB of RAM. That's just silly.

RE: What can happen now...
by Shane on Thu 10th Jun 2004 14:16 UTC

Outcome 3: People get fed up with this "Linux thing" and BSD becomes the trendy OS du jour. FreeBSD is looking very tempting right now for my server.

Switching to FreeBSD won't make much difference if you are after a workstation. Your choice of window manager/desktop environment will dictate how responsive the computer feels on old hardware.

I can relate to this article. I recently installed Fedora Core 2 plus GNOME and KDE on a Vaio notebook with 128MB of RAM. It wasn't pretty. I was hitting swap very often. Windows XP was more responsive on this configuration.

I have since got rid of FC2 and installed FreeBSD + windowmaker on the notebook. It feels much faster now, mostly because I am now using a lighter window manager.

Re: Try NT 4.0
by Luke McCarthy on Thu 10th Jun 2004 14:17 UTC

Well, I can relate to that myself. I had quite a good little system set up on NT 4 and Office - a 150MHz Cyrix with 32MB of RAM, IIRC. It was quite usable. But the only place you can get NT these days is "some guy" or eMule... Actually, I just did a Google search and found some places do actually sell it, but £92 seems a bit steep for a computer worth half that.

by andy on Thu 10th Jun 2004 14:20 UTC

Outcome 3: People get fed up with this "Linux thing" and BSD becomes the trendy OS du jour. FreeBSD is looking very tempting right now for my server.

What a stupid plug. You were complaining that KDE and GNOME are too bloated. They will be just as bloated if you run them on top of BSD instead of GNU/Linux. And what the heck does your server have to do with this discussion?

About time we recognised it.
by Peter on Thu 10th Jun 2004 14:21 UTC

Yes, Linux (particularly the windowing systems) is now more hungry than Windows. I'm glad it's being recognised at last.

I have an old Pentium 100 with 128MB which runs Windows 2000 and XP - slow, but usable for my kids. I've tried both SuSE 8/KDE3 and Red Hat 8/GNOME and both are unusable.

re: Tom Nook
by Mr. Banned on Thu 10th Jun 2004 14:24 UTC

"but complaining about needing 512mb to run a modern OS in todays world is just being unrealistic."

No it isn't! You think everyone in every country can just buy sheds of RAM like that?

In every country, no. But I still feel it's unrealistic to expect a modern program, on a modern computer, to provide optimal performance with less. If you have to run on a low-end PC with less RAM, then use older software that's optimized for that architecture!

I tried to explain why in my first post: today's programmers simply are not focused on optimizing code for older PCs. They are not taught it, and with new PCs costing $300-$400, there's not a lot of reason for them to hunt down an old, underpowered box just to optimize their code for it. Sorry, but those are the facts. I'm not saying this to upset you, just stating the way it is.

If you must run old hardware, then use more forgiving software. People printed books and magazines, solved problems, and generally lived their lives with software like Win98, Photoshop 5, PageMaker 5, and so on. You can't expect the newest software to run on the oldest hardware, so why not just face it and buy software that is optimized for your system? If you need to do more than this type of software allows, you need to buy a new PC or upgrade your current one. Programmers around the world aren't going to change their methodology just because some of us refuse, or aren't able, to upgrade. Complaining about it won't change that fact, so you must either adapt or upgrade.

Or learn programming yourself and show us all how it's done!
It's amazing how many people will complain but won't step up and try to solve the problems for themselves.

The author made a good point: there are millions of 32 and 64MB boxes in companies around the world. Linux should be providing them with an opportunity! But they can't run KDE/GNOME/OpenOffice/Moz because these apps are so bloated. A market for Linux lost.

We disagree on this point: there is just no way you are going to get an X display, plus a GUI as robust as KDE or GNOME, plus a modern app such as Mozilla or OO, to run in 64MB or less of memory. Forget it... it's just not do-able.

I agree that a whole market of hardware may go unsupported, but if you're running 32 or 64MB of RAM, you just have to face the facts. If you go to any major tech school and say "I need you to start teaching all of your students how to program so that their software runs on this 200MHz PII with 32MB of RAM", you're going to get laughed out of the place!

I understand your point, but the world doesn't stop advancing just because some of us can't or choose not to advance with it. That's just being unrealistic.

Look at cars: they don't stop developing new technologies and parts just because they can't be retro-fitted to my old 1978 Olds Cutlass. They expect me to upgrade to a newer car if I want modern features such as air bags, anti-lock brakes, an on-board computer and so on.

I either live with the old Cutlass and its shortcomings, repairing what breaks as it breaks, or I bite the bullet and buy a newer car.

The same goes for old PCs. There's a wealth of older, albeit unsupported, software out there which was used by millions of people around the world when it was considered modern. Learn to use it, or upgrade so that you can take advantage of the newer technologies now available.

Why should I have to? Why can't programmers actually THINK about performance and elegant design? Is that too much to ask? This is the point many people on this thread are making. Chucking more and more RAM at a problem doesn't make it go away.

You're right; Throwing more ram at it doesn't make it go away.

But doing so is the cheapest and best option you have if you refuse to upgrade.

Or see my programming item above. Learn how to program modern applications on older hardware, and perhaps you'll make a small fortune.

My guess, though, is that those who aren't willing to spend $50-$60 on a major RAM upgrade also aren't willing to pay you enough for your optimized software to make it worth your while. And that's yet another reason why programmers are not focused on older, out-of-date systems: there's not a lot of money to be made there, compared to selling to those who can afford that $50 memory upgrade or that $400 computer.

by Devilotx on Thu 10th Jun 2004 14:26 UTC

This whole debate is like a big c*ck-waving contest:

"oh yeah! I run Slack on a 286 with 4 megs of ram"


Linux has always felt a little slow to me, but I'm running SuSE 9.1 on a 450 with 256MB and it seems acceptable compared to my Windows 2003 server on a 600 with 300+ megs of RAM.

you have all missed the point
by raver31 on Thu 10th Jun 2004 14:27 UTC

Windows on pre-1999 hardware looked crap, but you expect the same hardware to run either KDE or GNOME with all the bells and whistles?

IceWM is what you should use on hardware like this, simple as that... it will look like Windows, but will run faster...

also, cut out services you do not need to run

sort yourselves out !

by incon on Thu 10th Jun 2004 14:27 UTC

NEWS FLASH: has X plus *nix apps (feature set vs feature set) ever used less RAM than the Windows, BeOS or Apple DEs?

Users are guilty
by Ivan on Thu 10th Jun 2004 14:32 UTC

Users are guilty of bloated software because they are too lazy to learn the command line and they like cosmetic features they don't need.

Example? I can use mv, cp or even mc to copy and transfer files in Linux, but "joe" users prefer konqueror or nautilus, which are bloated because they are also web browsers, preview files and images, etc. This is nice but it is not necessary. The same occurred when Internet Explorer was merged into Windows: Windows 98 and subsequent versions are much more bloated.

Stop this insanity of trying to make multi-use applications. A file manager doesn't need to also be a browser, and a browser doesn't need to be an email client. I say the same to Windows developers!

XP Pro
by smashIt on Thu 10th Jun 2004 14:33 UTC

My Win XP Pro needs 40MB of RAM after booting.
With mIRC and IE it gets near the 64MB mark.

FC2 would be unusable on the laptop I have, because with 192MB it is at its limit. And XP runs like a charm on this 600MHz/192MB PC.

by Nice on Thu 10th Jun 2004 14:36 UTC

So, it seems (almost) everyone sees some kind of problem. Out of the box, on all but the very most recent and expensive hardware, no 'noob' is going to get a happy Linux desktop experience.

What is the solution?

How do you motivate people who donate their hobby efforts for free to keep an eye on being careful with resources, while also competing with Windows and OS X apps on featuresets?

General awareness, and praise for those who write neat stuff efficiently, might help?

Someone somewhere famously described perfection as 'when there is nothing left to take out'. How do we arrange for developers to 'take out' and improve existing stuff instead of adding new stuff? Awards for those who do, as an example for others to follow?

Have to Agree
by Richi Plana on Thu 10th Jun 2004 14:36 UTC

I'm a software developer, I've been using computers as far back as the Apple ][ days, and I must agree: it seems current desktop environments are getting too bloated.

I'm a staunch supporter of GNOME (as can be seen in my previous posts), but I was dismayed to see how slowly its subsystems ran (menus, starting applications) on a Celeron 600 machine with 128MB of RAM and an i810 video subsystem. I remember Windows 95, and X with WindowMaker, running on a P2-400 with 128MB of RAM, and both were snappier than that. And from the base install of the GNOME DE on FC2, what functionality was it offering that wasn't in Win95 or WM? Not much. And yet it was molasses-slow on a faster machine.

The point is, even if those two DEs are different, for what they do they should be comparable in speed. Even in a stripped-down configuration, GNOME is still slow compared to those two previously mentioned DEs.

I haven't tried KDE, but this is not about Gnome vs. KDE. It's about Gnome vs. Gnome (or KDE vs. KDE).

As a software developer, I'm very well aware of the need to balance speed, size and simplicity (the three axes). Speed, I'm afraid, is being ignored. Perhaps it's difficult to find ways to speed things up, but one thing that shouldn't be denied is that for all that horsepower (a Celeron 600 with 128MB of RAM), and relative to what current DEs do, speed has been left by the wayside.

by brockers on Thu 10th Jun 2004 14:39 UTC

has X plus *nix apps (feature set vs feature set) ever used less RAM than the Windows, BeOS or Apple DEs?

X is not fat/slow. If you simply want to run X, you can do so on a 286 with 4MB of RAM.

Gnome (in the 1.x days) was wonderfully fast - many times faster than its Windows counterpart. KDE 3.x started faster than KDE 2.x and has actually gotten even faster as time has gone on. Gnome 2.x is an embarrassment: it's slow, getting slower, AND shedding features.


Well, since this discussion is going nowhere, here are my top tips for today!

Find your init script, in Arch this is /etc/rc.sysinit. Comment out:

* /sbin/ldconfig
For some strange reason some distro guys think you need to 'update shared library links' every boot. Removing this presents no problems and cuts a few seconds off boot time. Maybe you should run this manually after an upgrade or something ;-)

In your shutdown script, on Arch /etc/rc.shutdown:

* /bin/sleep
After SIGTERM and then SIGKILL are sent to all processes, I have to wait 3 and then 5 seconds. Really! I'm an impatient person. I don't /care/ if some lazy dangling process gets killed in the middle of shutting down. Any applications that were accessing data important to me have already been closed anyway.

Cut down your list of servers! (DAEMONS in /etc/rc.conf on Arch.) This is of special interest to commercial distro users - Mandrake, Red Hat, SuSE - whose systems normally come with quite a few useless servers enabled by default. Nobody I know needs cron on a desktop, or inetd/xinetd, or any internet server for that matter (maybe sshd?). For example, I have only lisa and kdm.
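To make the daemon-trimming concrete, here is what a cut-down services list might look like. This is an illustrative Arch-style /etc/rc.conf fragment only; the daemon names are examples, not a recommendation for any particular machine:

```shell
# Illustrative /etc/rc.conf fragment (Arch). A leading '!' disables a
# daemon at boot; here everything except logging and the login manager
# is switched off.
DAEMONS=(syslog-ng !crond !inetd !sshd !portmap kdm)
```

Other distros keep the equivalent switches in per-service init scripts or runlevel links, but the idea is the same: every daemon you don't start is RAM you keep.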

Keep a copy of your init scripts just in case an update clobbers them.

Another thing: if you know how to compile and install your own kernel, grab the latest source, configure it with only the drivers you need, compiled in rather than as modules. Then you can comment out the module-loading code too (and the 'updating module dependencies' /sbin/depmod). On Arch, remove the kernel26 and kernel24 packages so that your custom kernel doesn't get overwritten on update!

Still though, this doesn't make KDE or GNOME any faster :-(

RE: Mr Banned
by Tom Nook on Thu 10th Jun 2004 14:40 UTC

Good post Mr Banned, even though we do disagree. I can see your point though. However, I think you'd be spot on if the bloated apps really did NEED all the resources they consume. That'd be fine. I don't have a problem with resource-hungry apps when they're necessary - eg heavy-duty scientific work and games. They make use of resources for a reason.

However, gconfd taking up 12 megs (resident) is just sloppy programming. GNOME needing 128MB to run comfortably is equally bad. It's possible to create integrated, smooth and friendly desktops in a tenth of that RAM, and the features and 'productivity' GNOME provides don't justify it.

I can understand an office suite needing 64MB. I can understand a browser using 32 (with lots of tabs/windows open). I can understand the desktop using 16, or possibly 24. But all of these apps use a lot more for no real gain. If GNOME used the 128MB it needs to do some incredible stuff, that'd be worth it, but it's barely any more advanced than, say, Win2k's desktop, and yet it needs more.

That's what I'm getting at. Munching resources is fine when necessary, but it's bad when it's down to lazy coding.

X and the apps
by Lars Clausen on Thu 10th Jun 2004 14:57 UTC

It's not Linux or the distributions as such. The applications and/or the underlying libraries (X, KDE, GNOME etc.) are what eat the memory. On a machine with 1/3 GB I have the following top ten memory users:

SIZE    RSS   SHARE  COMMAND
61372   52M   21756  galeon-bin
39648   32M   15092  evolution
109M    27M    5160  X
17052   15M    9080  rhn-applet-gui
12148   11M    1372  mdmpd
12964   10M    3796  emacs
15760   10M    8624  nautilus
11488    9M    8328  gnome-panel
 8764  7776    6652  gnome-session
 7908  7124    6460  gkb-applet-2

Galeon: 52M for a browser? Where's all that memory gone?
Evolution: It's a mail client, is it caching every mail I read?
X: Here's the biggie. Why does X take over a hundred MB? I have a simple theme, 5 windows open, 1280x1024... is it buffering like a madman or leaking like a sieve? I do not know.

The rest are more reasonable, as they share most of their memory (probably GTK/Gnome libs), but the top three are serious offenders. Even so, the machine feels pretty snappy, except when I've done huge file reads and everything is swapped out.
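As a rough way to read figures like the ones above: assuming the three columns are top's SIZE, RSS and SHARE fields (in KB unless suffixed with "M"), RSS minus SHARE gives a crude estimate of each process's private, unshared memory. A sketch using the thread's own numbers:

```python
# Crude private-memory estimate from top(1)-style SIZE/RSS/SHARE
# columns. Assumes "M"-suffixed values are MB and bare values KB;
# RSS - SHARE approximates unshared memory (a rough heuristic,
# not exact accounting).

def to_kb(value: str) -> int:
    """Convert a top-style size field to kilobytes."""
    return int(value[:-1]) * 1024 if value.endswith("M") else int(value)

rows = [
    ("galeon-bin", "52M", "21756"),
    ("evolution",  "32M", "15092"),
    ("X",          "27M", "5160"),
]

for name, rss, share in rows:
    private_kb = to_kb(rss) - to_kb(share)
    print(f"{name}: ~{private_kb // 1024}MB private")
```

By this estimate Galeon holds roughly 30MB privately, which still leaves the original question of where it all goes.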

But what's up with X, Galeon and Evolution?


The point was made.
by Anthony on Thu 10th Jun 2004 14:57 UTC

"Linux is not getting fat. Fedora, or any other distro with those requirements are. Keep the word Linux in context with the kernel and we are a lot less troubled. If you choose to run KDE/GNOME2 and then add GDM, and all the bells and whistles (gdesklets for example)... expect to use some ram up."

"Fedora has some steep requirements, and suddenly 'the Linux platform is getting fat'?"

As a desktop OS, Linux needs an easy-to-use interface. GNOME/KDE are those interfaces. The point is, compared to the competition's OS interfaces, Linux's suck. You can try to twist it how you want, but that is the truth. Linux is nothing (for the desktop) without GNOME/KDE. And GNOME/KDE are way too slow compared to Explorer or OS X. Period.

RE: Fat?
by Lumbergh on Thu 10th Jun 2004 14:59 UTC

NEWS FLASH: have X plus *nix apps (feature set vs feature set) ever used less RAM than the Windows, BeOS or Apple DEs?

Nope, and they never will, thanks to the many bloated layers that comprise the GUI of almost all Unixes.

brockers is pretty much right though, KDE got the architecture right and so has been able to concentrate on optimizations, while Gnome stumbles around, seemingly always having potential, but continues to be slow and never knows in what direction it wants to go.

Of course KDE is not without its problems too. The interface needs trimming (much easier than coming up with a whole new component technology, as Gnome must), and it relies on Qt, which is a decent toolkit but, because of licensing issues, will always be a non-starter for many people.

Things on the Linux desktop could've been so much better today if certain historical events had gone differently: if Qt had been a community project, if Gnome had never been started, if we had one unified desktop for Linux. Oh well, you people got your "choice".

by Joe on Thu 10th Jun 2004 15:01 UTC

How about Outcome 4: You're a foolish boy?
KDE and Gnome will keep getting bigger, get used to it. But they will still be skinnier than Windows.

"Outcome 1: KDE and Gnome can keep adding more features and bloat with little to no consideration of performance. Mono or maybe Java will be integrated into these DEs to make rapid development possible with a tremendous performance hit. People who don't need all the fluff will switch to a distribution which offers an alternative desktop environments as the default. Gnome and KDE will lose popularity.

Outcome 2: KDE and Gnome will reach a "feature plateau" where it is comparable with Longhorn and then buckle down and do some serious optimizing. With Novell backing Gnome and trying to replace Windows in a corporate setting, this is looking more and more likely.

Outcome 3: People get fed up with this "Linux thing" and BSD becomes the trendy OS du jour. FreeBSD is looking very tempting right now for my server. "

Buy more Ram
by Hooper on Thu 10th Jun 2004 15:01 UTC

There has always been the argument over whether throwing more hardware at a problem is the correct way to handle it. Personally, as an avid Linux user with years of tech experience, I tend not to like the idea of throwing more hardware at a problem, but I find myself doing just that most of the time. This Slackware machine I type on runs X + KDE. It's super fast. But then again, I threw 2GB of quality RAM at it, added a 3+ GHz P4 and recompiled the kernel specifically for the hardware: HTT, Highmem, etc.

While I find it's true that the coding is getting a bit messy and bloated in certain areas, I also find that Linux in general will run fine on the requirements mentioned, minus X and the glamour packages. I don't believe anyone has claimed that Linux is ready to compete with Microsoft Windows as of this date, so I take it that it's all a work in progress.

My conclusion is, you'll need to learn Linux to use Linux, regardless of code bloat. Generic kernels in every distro are compiled with bloat, and that includes Slackware. A leaner, custom-compiled kernel speeds things up quite a bit. With all the hardware variety out there, Linux distributors have no choice but to compile generic kernels as they do.

Eye opening article for linux developers
by august9120 on Thu 10th Jun 2004 15:13 UTC

I just hope that this article of yours brings some sanity to open source developers and that they understand the gravity of this situation. I have been using Linux for over two years now, but every time I upgrade it becomes slower and slower, and my inclination towards Windows increases. Good article, though. Someone had to take the initiative and tell Linux not to bloat.

Some questions for the author
by cowbutt on Thu 10th Jun 2004 15:15 UTC

1) Is your X server using an accelerated driver, or the framebuffer device, or even the generic vesa driver?

2) If you are using an accelerated driver, which one? Some provide more acceleration than others.

3) Are you using anti-aliased font rendering? If so, did you check to see whether your driver supports hardware acceleration of the RENDER extension?

4) Did your friend disable unnecessary background processes, or did he just do a "full" install so he didn't miss out on any goodies?

Finally, users don't want fast machines that do nothing; they want machines that perform some useful task. For years, the calls were for "usable desktop applications", tools such as xpaint, xfig, Midnight Commander and LyX + LaTeX being judged "unsuitable". Well, now we've got the kind of fully-featured applications that were being called for, but in order to create them _in reasonable amounts of time_, and with a reasonably high level of reliability, reusable component architectures (e.g. GTK, DCOP, Qt, etc.) need to be used.

As the motto goes - "Good, fast, cheap - pick any two" (where "good" in this case means "efficient", "fast" means "available now rather than in 10 years time" and "cheap" still means low cost). The mass market appears to have decided that it likes "Cheap" and "Fast" - just like with PC hardware, in fact.

If you think there's a market for "Good" and "Fast", go right ahead and try to make some money doing it.

linux sucks ( now bigger )
by windows hater on Thu 10th Jun 2004 15:16 UTC

How come people refer to Linux as a kernel and GNU/Linux as an OS? Now people want to be like Richard Stallman?

And since when do people run the Linux kernel alone, by itself?

Memory figures
by Luke McCarthy on Thu 10th Jun 2004 15:16 UTC

Where are you getting the memory figures from? If they include disk cache, then it is a little unfair. On Linux, disk caches accumulate until memory starts running low; it is quite alright for the kernel to then deallocate big blobs of unaccessed disk cache to make room. I can imagine a browser may have a lot of its internet cache in memory, which is quite reasonable if the memory is not being used for anything else.
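The cache behaviour described here is why free(1) shows a "-/+ buffers/cache" line: page-cache memory is reclaimable and shouldn't be counted against applications. A minimal sketch of that arithmetic, using made-up KB values rather than a real /proc/meminfo:

```python
# The "-/+ buffers/cache" arithmetic from free(1): subtracting
# reclaimable buffers and page cache from "used" gives a fairer
# picture of what applications actually claim. Values below are
# illustrative, not from a real machine.

def apps_used_kb(total: int, free: int, buffers: int, cached: int) -> int:
    """Memory used excluding reclaimable buffers and page cache (KB)."""
    return total - free - buffers - cached

total, free, buffers, cached = 327_680, 10_240, 20_480, 150_000
print(f"naive used:  {total - free} KB")
print(f"apps' share: {apps_used_kb(total, free, buffers, cached)} KB")
```

On a busy Linux box the two figures routinely differ by half the RAM, which is why raw "used memory" numbers overstate bloat.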

Erm, I'm not trying to apologise for big apps and libraries. It really does disturb me that kdelibs is a few hundred megabytes. I can't imagine how it is possible to write so much code.

"Secondly, why should users have to install Slackware, Debian or Gentoo just to get adequate speed? Those distros are primarily targeted at experienced users -- the kind of people who know how to tweak for performance anyway. The distros geared towards newcomers don't pay any attention to speed, and it's giving a lot of people a very bad impression. Spend an hour or two browsing first-timer Linux forums on the Net; you'll be dismayed by the number of posts asking why it takes so long to boot, why it's slower to run, why it's always swapping. Especially when they've been told that Linux is better than Windows."

Doesn't that just contradict the original point that the article was supposed to make? Let me explain something to you...


I post this happily from my work PC, running Slackware 9.1 and Gnome 2.6, with a P3 450 and 256MB of RAM, without any problems... This is a 5 YEAR OLD PC! 5 YEARS. Are you guys living with cavemen and their 486s? Go run frickin' BeOS, which died years ago. Experience its fascinating "modern feature set" that paved the way for today's "multimedia platforms".

It's ridiculous and inane to think that a desktop in 2004 should run like Windows 95. The feature-set of Gnome greatly surpasses it.

Gnome 2.6 is no more "laggy" than Windows 2000 or XP, the platforms that you complainers wish for it to "emulate". Get a clue, and stop being cheapskates with hardware that is half a decade old. You should be forced to run some unusable thing like Blackbox or the stock X11 window manager in your Purgatory for being such fools.

If you want to troll with more idiotic OSNews articles, then be prepared to get trolled replies. I don't know how anyone takes you clowns seriously. Linux isn't getting "fat" unless you consider the ridiculous Fedora Core 2 to be "Linux".

by sean on Thu 10th Jun 2004 15:24 UTC

People who used to point out the lightweight nature of Linux didn't seem to realize that it was not an inherent trait but just the current state of evolution. There is a natural progression in development. It is healthy to occasionally forget about optimization and concentrate on functionality, even if that means lower end machines are left behind. Once the functionality gains have been made a consolidation phase can begin to reduce the overhead and solidify a new baseline. We're approaching a time where such a phase should (and will) begin. Windows is further along in evolution, but over the longer term Linux and open source will win.

Total FUD
by Abraxas on Thu 10th Jun 2004 15:30 UTC

Recently, a friend of mine expressed an interest in running Linux on his machine. Sick and tired of endless spyware and viruses, he wanted a way out -- so I gave him a copy of Mandrake 10.0 Official. A couple of days later, he got back to me with the sad news I was prepared for: it's just too slow. His box, a 600MHz system with 128MB RAM, ran Windows XP happily, but with Mandrake it was considerably slower...

Now, I'm not saying that modern desktop distros should work on a 286 with 1MB of RAM, or anything like that. I'm just being realistic -- they should still run decently on hardware that's a mere three years old, like my friend's machine.

Ok, so who does he think he's fooling? I have a similar-spec machine from 5 years ago, and it was a low-end machine then. To top it all off, I run Linux on it, and it's not really that slow at all. I use WindowMaker with Sylpheed, Firefox, ROX, NEdit, MPlayer, and some aterms. I could load XP on this machine if I wanted to, but it would be pretty slow. Bootup takes about 5-10 seconds, the init scripts add another 15-20 seconds, and then I can log in. It doesn't bother me much at all. In fact, Firefox and Eclipse are the only things that take a long time to load: Firefox takes about 4 seconds, and Eclipse around 10.

Linux is an OS, and it is fast.
by wangxiaohu on Thu 10th Jun 2004 15:34 UTC

Redhat and Mandrake are not.

I run Gentoo + kernel 2.6.5 + GNOME 2.6 + OpenOffice 1.1.1 + Firefox 0.8 + Gaim 0.77 *at the same time* on my PII 366 laptop with 192MB RAM with no problem. Sometimes I watch DivX movies in full screen on it.

I did tweak the kernel and /etc/init.d/* to be fast and I did use ReiserFS.

But I still agree with the author that open source coders need to focus more on memory usage and CPU time than on adding features.

Re: X and the apps
by Marius on Thu 10th Jun 2004 15:34 UTC

109M 27M 5160 X
X: Here's the biggie. Why does X take over a hundred MB? I have a simple theme, 5 windows open, 1280x1024... is it buffering like a madman or leaking like a sieve? I do not know.

That 109M includes your mapped graphics-card memory!

RE: Abraxas
by Tom Nook on Thu 10th Jun 2004 15:39 UTC

"Ok so who does he think he's fooling? I have a similar spec machine and it was from 5 years ago, and it was a low end machine then. I use WindowMaker with sylpheed, firefox, rox, nedit, mplayer, and some aterms."

Er, did you even read the article? Those apps are NOT the solution for newcomers. None of the major desktop distros provide them as default software, and they're not as familiar and easy as the larger counterparts.

The writer stated that he knows these apps exist; however, light apps exist on any platform. He's talking about the WHOLE PACKAGE that newcomers see -- and the apps being pushed as alternatives to Windows. Newcomers don't want WMaker, nedit and Sylpheed. They put in a Fedora/SUSE/Mandrake disk and want to use the familiar and featureful apps that are provided by default.

And these apps are getting extremely slow and bloated. That was the point. You're basically saying people should go back to Windows 3.1. Why? Why shouldn't newcomers be able to just use a modern distro without it being slower than WinXP? Why should they have to change the familiar desktop and apps into lesser-known and less-featured ones just to get it running at a decent speed?

To Cut a long story short
by Paolo on Thu 10th Jun 2004 15:41 UTC

Maybe somebody already said this, but although I have used Linux since Mandrake 7.x, I have to agree with the writer.
Last November I replaced my old PIII 800 / 512MB PC with a brand new P4 2.6 HT / 512MB RAM, same ATA-133 disks, 7200 RPM.
Well, I was disappointed. While Win2k/WinXP gained considerable speed, Linux did not. Neither RH9 nor FC1 was able to come even close. I use many apps at a time, but no swapping so far.
These distributions are just painfully SLOW. I am a user; I do not even care what a kernel might be, although I have compiled many without speed improvements. In my comparisons, with tears in my eyes, I have to admit that Windows plays better and faster. My brand new PC is sad: the Penguin is not running much faster on it than on the previous one. Maybe the developers are losing control of their creature as it becomes too complex.

- Bye,

RE: Lars
by Tom Nook on Thu 10th Jun 2004 15:44 UTC

"X: Here's the biggie. Why does X take over a hundred MB?"

Actually, it doesn't. That's the video card RAM being mapped. X itself is quite small; I've run XFree86 4.2 on a 486 before, and it's usable. It's the huge desktops and apps that are sucking up the RAM, though, as you rightly point out.
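The distinction matters because top's virtual SIZE column counts mapped hardware apertures. A tiny arithmetic sketch, assuming a hypothetical 64MB card mapped into X's address space (the 109MB and 27MB figures come from the listing earlier in the thread):

```python
# Why X's virtual size looks alarming: the video-RAM aperture is
# mapped into its address space and counted in SIZE, but it is
# not system RAM. 64MB is a hypothetical card size.

SIZE_MB = 109      # virtual size reported for X
APERTURE_MB = 64   # hypothetical mapped video memory
RSS_MB = 27        # resident set actually occupying RAM

print(f"SIZE minus aperture: {SIZE_MB - APERTURE_MB}MB")
print(f"actually resident:   {RSS_MB}MB")
```

The resident figure, not the virtual one, is the fair measure of what X costs the system.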

Why not use an old distro?
by Anonymous on Thu 10th Jun 2004 15:51 UTC

There's nothing stopping users migrating from Win98 from installing RedHat 7.3. It's monumentally faster than newer Linux distros. The only sacrifices you would make would be some internationalization that has been added to newer RedHat distros and there would be a few insecure packages that you'd have to rebuild by hand.

However, these minor security concerns are dwarfed by those encountered when upgrading to 2000/XP instead of RedHat 7.3.

RE: Tom Nook
by Abraxas on Thu 10th Jun 2004 15:51 UTC

The writer stated that he knows these apps exist; however, light apps exist on any platform. He's talking about the WHOLE PACKAGE that newcomers see -- and the apps being pushed as alternatives to Windows. Newcomers don't want WMaker, nedit and Sylpheed. They put in a Fedora/SUSE/Mandrake disk and want to use the familiar and featureful apps that are provided by default.

And these apps are getting extremely slow and bloated. That was the point. You're basically saying people should go back to Windows 3.1. Why? Why shouldn't newcomers be able to just use a modern distro without it being slower than WinXP? Why should they have to change the familiar desktop and apps into lesser-known and less-featured ones just to get it running at a decent speed?


That's just ignorance on your part. WindowMaker is more functional than the XP shell itself. It may not be as pretty, but it does a hell of a lot more. On the same note, NEdit is more featureful than the editors included with KDE or Gnome. ROX and Sylpheed do everything you need them to do. Most people don't need Evolution, especially when they are just using it for email. ROX is fast, lightweight, and incredibly easy to use. MPlayer is a standard video player, so I don't know how you can argue against that.

I read the article; it was just stupid. How can you say "The Linux Platform is Getting Fat" when you really mean "Fedora with Gnome is getting fat"? Don't name articles something completely different from their subject matter; it's just inviting a flamewar. KDE on Gentoo with a 650MHz processor and 128MB of RAM is perfectly usable -- I know from experience. Every machine I have is 700MHz or less, and they all run Linux without a hitch. Windows was a mess on those machines. Sure, they were fast out of the box with WinME/Win98, but a few months of use made them dog slow. I don't want to reinstall an operating system because it can't even manage itself for more than a few months.

u people
by Anonymous on Thu 10th Jun 2004 15:53 UTC

You always forget there are countries that buy all your first-world old hardware. Here, 128MB / 500MHz is still normal for office and home systems. Plus, people don't have much time to spend learning a new operating system.

LoL, that's the quality of Linux
by Anonymous on Thu 10th Jun 2004 15:53 UTC

Just see that G boy... typical Linux user... idiot that is..

by Anonymous on Thu 10th Jun 2004 15:55 UTC

LTSP turns old computers with only 32MB of RAM into zippy boxes. It can run FC2, Mandrake 10, SUSE 9.1, etc. How do they do it? Heh -- LTSP turns old boxes into thin clients. Check it out.

Buying new hardware
by Brian on Thu 10th Jun 2004 15:55 UTC

I love people's solution to this problem: upgrade! Buy more RAM! I thought a lot of the market Linux was targeting was computers that used to run Windows but no longer meet its minimum requirements, with companies caring about the bottom dollar; not having to upgrade their hardware is a big issue. I don't expect a third world country to "go out and buy more RAM", and I think a lot of the companies out there that are thinking of switching aren't looking to do so either.

by Luke McCarthy on Thu 10th Jun 2004 16:01 UTC

Does ReiserFS offer good performance as a Linux root partition vs. ext3? I may consider converting. I heard it was good for small files, which are plentiful in the Linux world! Might keep my video partition as FAT32 though ;-)

Fucking hell
by Luke McCarthy on Thu 10th Jun 2004 16:04 UTC

Lock this thread now. Please.

by Thom Holwerda on Thu 10th Jun 2004 16:07 UTC

I think the staff should consider using a subscription model for posting :S.

But anyway, I still find it hard to believe people keep bringing up the "Linux is not an OS" thing. Of course Linux ain't an OS, but for the newbie, it is! Accept that damn fact; not everyone is as educated in using PCs as we... well, as some of us are.


RE: RE: Tom Nook
by Anonymous on Thu 10th Jun 2004 16:11 UTC

"KDE on Gentoo with a 650Mhz processor and 128MB of RAM is perfectly usable, I know from experience."

Ain't that the truth. The box my father uses is a 533MHz Celeron eMachines (yikes!) with 192MB RAM and a Voodoo3 16MB video card. KDE 3.2.2 runs quite well, with all eye candy turned off and light themes.

Slow by implementation
by Anonymous on Thu 10th Jun 2004 16:12 UTC

I don't think GNU/Linux systems' desktop environments (i.e. Gnome/KDE) are slow due to programmer negligence; rather, they are slow by implementation.

For Gnome to render a typical application, it more than likely requires the Xlib, GDK, GTK, gnome, pango, and maybe glib libraries. A KDE application needs Xlib, Qt, and the KDE libs for a similar task.

Windows programmers typically do not deal with this many layers of abstraction.

The X server technology is more than a decade old; hopefully freedesktop, with its new X servers and authority, will cut down on these multiple layers of abstraction, with a newer, cleaner Xlib design.

There was a time when my 40MB Compaq LTE5150 laptop would run RH 5.x or 6.x. I had a 1.2GB disk, and I could just pop in the CD after booting from a floppy, fire up the install, and have a decent Linux box ready to go. Oh, I did have the usual problems with X because it didn't auto-configure my Compaq display correctly, but other than that it worked.

I recently tried to load one of the more modern RH releases (8.0). First, I couldn't easily select a workstation install, because once it selected all the default packages my disk was no longer big enough after configuring root and swap! I had 800MB of /usr space and it wasn't enough. Second, once I trimmed down the installation and installed it, 40MB of RAM was just not enough. Hell, the X server was 38MB! So I can't use Linux on my old laptop anymore unless I revert to an old distro version or run some stripped-down distro.

I think the existing distro companies are in trouble because of this. They obviously don't consider it an important aspect of their survival. I think it's an opportunity for all the lean distros to get out there and provide a solution.

Lock The Thread
by Karl Abbott on Thu 10th Jun 2004 16:24 UTC

I agree -- this thread should be locked....way too many replies and way too much crap not to lock it.

Not a troll, unfortunately
by Michał Kosmulski on Thu 10th Jun 2004 16:24 UTC

Unfortunately, this article is not a troll, as some people seem to suggest. On my machine, an Athlon XP 1700+ with 256MB RAM, KDE quite often doesn't feel responsive enough, and starting OpenOffice also takes several seconds, which is enough to make it feel "slow". And I'm running Slackware here, and have done some tweaking to improve speed.

I think it is about time someone tried to unify all the different toolkits into just one that could be used by _most_ desktop users and that all applications would eventually switch to. I think it is a necessity in order to market "Linux" to the masses. Like several people wrote, I have quite often had trouble trying to convince people to try out Linux after they saw how sluggish it seemed compared to Windows.

Also note that places which would be really good for establishing a Linux stronghold, such as schools or charities, often have and use obsolete hardware -- which won't run any modern Linux DE and stay usable. I think improving performance is a technical necessity for Linux nowadays. Even Longhorn is going to have a "legacy" mode, without all the bells and whistles, allowing it to run on reasonable hardware. The guys at Microsoft do not want to commit suicide, after all. Let's hope the FOSS movement doesn't either.

RE: What a silly rant.
by Matt on Thu 10th Jun 2004 16:24 UTC

I have an old Dell laptop (233MHz P2 with 140MB RAM) running Windows XP, and it runs just fine if you turn off the theme manager. I can run Winamp, Office, Firefox and Thunderbird all at once without a problem.

by snowflake on Thu 10th Jun 2004 16:25 UTC

>I don't believe all what you say, not one bit. I use XP on a 256 MB machine, as well as many others on an Athlon XP 1.3 GHz, and it runs great. Either your installation is hosed, or you blatantly lie.

>Typical response of the astroturfer. "You must be doing something wrong." It's not Windows, it's the user. Your other choice is just downright insulting. If you can't defend the product, attack the consumer. The system is as described, it is properly installed, and I don't blatantly lie.

I agree with the earlier poster, though I'll be more blunt: you're a Linux liar. We run Windows XP on 256MB without any issues, and have done since it came out.

by Hooper on Thu 10th Jun 2004 16:25 UTC


"I love people's solution to this problem. Upgrade! Buy more ram! I thought alot of the market linux was targeting were computers that used to run windows, that no longer meet the minimum requirements with companies caring about the bottom dollar, not having to upgrade their hardware being a big issues. I don't expect a third world country to "go out and buy more ram", and I think alot of the companies out there that are thinking of switching aren't looking to do so either."

It is an option, Brian, as is learning to use Linux. Linux is designed to use memory; some forget this. Even with 2GB of RAM in a machine, you will find that Linux manages to use a majority of it in one fashion or another, especially when loading multiple large apps. The more RAM you have, the merrier with Linux.

Some of the problems people have with this stem from their knowledge of the Windows operating system, where an overabundance of RAM was not necessarily a good thing.

I've converted this machine to a dual boot slack/winxp pro machine to test the responsiveness of WinXP and Linux with an above average amount of memory.

Seriously, I do not believe that WinXP Pro is utilizing and managing the memory as it should, and I fail to see true responsiveness gains in XP from the increased RAM. On the other hand, with Linux I can say that I've seen excellent gains in both responsiveness and speed with an above-average amount of RAM. Therefore, yes: purchasing more RAM for use with Linux (this is kernel 2.6.6) is a valuable upgrade with a real performance gain, while Windows XP Pro on the same hardware shows none. That makes it a viable option if the funds are there.

The notion that Linux was designed to run on specific lower-quality hardware, or on a machine with a small amount of RAM, is not accurate. Linux is moving; it's not stagnant. Of course Linux will use more memory as it progresses. That is inevitable, as it is with any software on the planet.

I'm in no way saying that you need 2GB of RAM to run the Linux kernel, though some distros like Fedora may like it. Most machines I have have 512MB or less and do well for the most part. But there are advantages to Linux with an increased amount of RAM, given the increasingly larger package sizes. (Using large amounts of memory requires a kernel recompile, as HTT does, etc.)

I prefer the way Linux manages memory, with its "I'll use it if you give it to me" approach, over my Windows eXperiences.

These are just my observations.

RE: ReiserFS @Luke McCarthy
by wangxiaohu on Thu 10th Jun 2004 16:25 UTC

Yes, ReiserFS is fast. It uses B+ trees to manage files. Check this out for comparison:
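As a rough intuition for why the B+ tree helps: locating one entry among n takes O(log n) comparisons instead of a linear scan of the whole directory. A toy sketch using a sorted list and bisect as a stand-in for the tree (an analogy, not ReiserFS's on-disk format):

```python
# Toy comparison: tree-style O(log n) lookup vs a linear O(n) scan
# over 10,000 file names. A sorted list + bisect stands in for the
# balanced search tree a filesystem would use.
import bisect

names = sorted(f"file{i:05d}" for i in range(10_000))

def tree_like_lookup(key: str) -> bool:
    """Binary search: ~14 comparisons for 10,000 entries."""
    i = bisect.bisect_left(names, key)
    return i < len(names) and names[i] == key

def linear_lookup(key: str) -> bool:
    """Linear scan: up to 10,000 comparisons."""
    return any(n == key for n in names)

print(tree_like_lookup("file04242"))  # True
print(linear_lookup("file04242"))     # True
```

The gap widens with directory size, which is why tree-indexed filesystems shine on the many-small-files workloads mentioned above.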

My $0.02
by BeastofBurden on Thu 10th Jun 2004 16:28 UTC

Windows 2000/XP, in my experience, needs at least 256MB to be really usable. I had to use a Win2000 machine with only 128MB of RAM, and the disk spent over half the time thrashing while I tried to get some work done. I finally complained to our workstation support, and he was able to scrounge up 256MB more RAM, so now at 384MB the machine has crossed from "throw out the window frustrating" into the realm of "useful".

I built a WinXP box (Athlon 2000+) for my wife's grandmother with 32MB onboard video and 256MB of main RAM. I wish I'd sprung for 512MB, because I find this machine spends most of its time swapping as well -- but that's when I'm using it. I tend to task switch a lot, and I spend a lot of time waiting for my new window to move from back to front, and there are often inexplicable delays where nothing seems to be happening.

My main home system and my work laptop run WinXP; both are P4 systems with 512MB of RAM, and both are mostly responsive enough that multitasking is easy. I run WinXP on the laptop because I have to. At home the WinXP system also boots Gentoo Linux using kernel 2.6.6, and for the most part it runs KDE 3.2.x for the benefit of my wife (I like XFCE4 better).

Apples to apples on the same machine, WinXP is somewhat more responsive on application startup than Linux, but task switching is more consistently responsive on Linux, especially running kernel 2.6 versus 2.4.

My wife is perfectly happy using Linux with KDE for her e-mail and word processing, though she probably wishes OpenOffice started up faster. I like some of the eye candy of KDE, but I tend to gravitate towards XFCE4 because I'm more of a CLI type.

My other home machine is a PII 300MHz with 288MB of RAM. It exclusively boots Gentoo Linux with kernel 2.6. It has KDE installed, which runs just fine, but once again I prefer XFCE4 because of its light weight.

I tend to run shell-based apps on the PII box, because my wife is usually using the other machine, but when I need something with a GUI like Opera or Mozilla, I tend to ssh into the faster machine to run those apps. The PII can run these apps just fine; I just tend to work faster than the PII can react.

I also installed Gentoo with kernel 2.6 on my wife's grandmother's old Compaq PII 233MHz with about 128MB of RAM. The system was usable, though slow to load apps (due to a _very_ slow hard drive). If my wife's grandmother did not use AOL on dial-up, I would have saved her the $1000 it cost to build her new system and simply given her this one.

Because I have invested myself in learning how to install Gentoo Linux, I have no need for the more newbie-friendly distros, even though I have installed them several times to see if any could pull me away from Gentoo; none ever has. I also keep a Knoppix 3.4 CD handy in case of emergencies. If Gentoo didn't exist, I would use Knoppix in a heartbeat.

The beauty of Linux is that it *can* be configured to run on any hardware, but you have to know what you are doing. The less people know what they are doing, the more bloatware needs to be included to cater to their skill set. Windows 2000/XP and the commercial distros currently fall into the bloatware category.

The speed improvements in KDE 3.2, and the kernel 2.6 work that seems to be going on to address system responsiveness, tell me that this is not going to be a problem for the commercial distros for long. Longhorn is another story.

My project
by Cecil on Thu 10th Jun 2004 16:31 UTC

I will be setting up my dad's old Pentium 200 machine: 3GB HD, 16MB ATI video card, and 32MB of RAM (it has 64 now, but I am deliberately downgrading it). I want to make it a project box to see how to make it perform speedily. I will probably compile everything myself in Gentoo. I will use twm, and only a very few basic apps. I will be exploring the slimmest options I can find. If anyone has ideas, please feel free to email me. The idea is to have a window manager, a 2.6 kernel, working sound (AWE32), an office suite, and a few other things.

by Mad Echidna on Thu 10th Jun 2004 16:35 UTC

It really had to be said; this is so true! Whatever happened to tiny apps? Remember all of those great DOS games like Descent II and Warcraft? I've seen games with similar graphics run in SDL at a horrible frame rate on brand new machines, while the DOS counterparts flew on 486s.

RE: My project
by Mike on Thu 10th Jun 2004 16:35 UTC

The apps will run on that system, but they will be intolerably laggy. Compilation alone will take about 1-2 weeks just to complete.

You might be better off focusing your energies elsewhere.

He's dead wrong.
by Chris on Thu 10th Jun 2004 16:36 UTC

To defend KDE:
I don't think I read a single review of KDE 3.2 that didn't say it felt faster than KDE 3.1. What do you want from them? To make it unbelievably fast and still provide a full desktop that even includes a sound mixer?!
You don't need 128MB of RAM. I ran (I've since bought more memory at the unbelievably low price of $15) a laptop with 64MB of RAM, 8 of which was shared to video. Yes, I did run Fluxbox instead of Gnome or KDE, and something like XFce would have worked well too while being more user-friendly. I could work in AbiWord while browsing the net in Firebird and talking on Gaim. Yes, it'd use 80MB of swap, but it wasn't all that slow. It felt like a machine that was a bit short on memory. I also used it as a Win98 machine, and it was certainly better off this way.
I run a PII 350 with 256MB of RAM. I must say that's plenty of RAM, as the machine never seems to be low (of course, I don't do graphics manipulation on it). Once again, I have extensive experience with the same machine on Win98, although that was one abused install, and it's much better now: it doesn't fill up the hard disk with temporary internet files.

You really can't complain about the memory use issue compared to XP, as XP is three years old. Things will slowly use more and more memory, and I for one don't see it as individual apps doing it so much as users wanting to do more at once.
Your buddy could speed things up a lot by simply using Konqueror over Mozilla. Mozilla is a memory hog, I'll agree on that one. KDE sucks up a lot of memory, but its tradeoff is that it's a complete environment that looks very nice by default. I would also hope that in the KDE setup he turned off all the fancy animations, like XP would have done for him.

I'm all about clean code, but I'm not seeing the overall problem you are. Things are going to use more and more resources; that's life. Linux can't run well on a 386 forever, now can it? And 128MB of RAM was not a proper amount to install on a machine in 2000, and it's still too little today. Maybe you should blame computer distributors for being cheap on RAM? My store does it too, but I always get customers to upgrade their memory. It's important to have more than 128MB of RAM with Windows XP because it's ungodly slow with that amount. You expect a new machine to feel quick....

You are so right
by Öystein Andersen on Thu 10th Jun 2004 16:37 UTC

I'm the admin for a school, and I have for several years worked towards Linux at the school. One of the arguments was a longer lifespan for the computers: no need for upgrading the hardware, stable, free of charge. This year I got a "go" from the management, and this spring we installed Fedora with XFce. The result? All the students say the same thing: "It's so slooooow." And it is. Over the summer we'll probably fix it with another distro, but my students can't fix it on their home computers. It's a shame.

by tseteen on Thu 10th Jun 2004 16:38 UTC

I cannot believe "His box, an 600 MHz 128MB RAM system, ran Windows XP happily, " : )

And I think FC2 should be optimized for 586, if not 686 ;)

@Öystein Andersen
by Luke McCarthy on Thu 10th Jun 2004 16:49 UTC

...I got a "go" from the management, and this spring we installed Fedora with XFce. The result? All the students say the same thing: "It's so slooooow."

Didn't you even test the distro you were using on the computers before recommending it? Stay away from Fedora, please ;-) If you know how, knock up your own distro or modify one like Arch, Slackware, Debian, etc. What spec are the computers?

but my students can't fix it on their home computers

That is a problem. Have you recommended that students install a certain distro at home? They might be better off sticking with Windows; give them Win32 ports of the apps they use. Or you could look around for less mainstream distros; there are a few around that work a lot better than the usual commercial dogfood.

Re: XP
by Dawnrider on Thu 10th Jun 2004 16:54 UTC

Not really. If you buy a machine with ten-year-old technology, use a ten-year-old OS with it, like Win95. That's what was meant to run on it.

FC2, XP and other modern OSs and distros have gained more features (like image previewing in file managers, XML file formats, WYSIWYG word processing/web design, high resolution 32bit displays and large-scale image editing, network transparent file access, accessibility tools, internationalisation support, etc.). We as people want these things ("What, my OS only speaks three languages, so my children can't understand it?", "I'm visually impaired, and there aren't any tools to help me use a computer?") and we've got them in modern systems.

These things are important and worthwhile. FC2 and others have added these features, and you can't expect hardware to support every feature that comes out, perpetually, without an upgrade.

So, in short, either upgrade the hardware, if you want the features, or be satisfied with the software as you are apparently satisfied with the hardware and stick with the OS that you've got.

128MB RAM and Windows XP...good luck
by The Great Lithium on Thu 10th Jun 2004 16:54 UTC

" 600 MHz 128MB RAM system, ran Windows XP happily." That's complete bullshit and if the author had used Windows XP with that configuration then they'd know that. I've used Pentium 4s with 256MB RAM and had them crawling with a few IE (cringe) windows open and some office software going. If you believe this article then that must be a fluke because Microsoft's newest products are so amazing efficient.

It's certainly true that KDE and Gnome have gotten quite a bit bigger in recent years but Windows XP is freaking massive itself, even when it's just booted up without anything running. I'd also like to point out that my Pentium 3 at home which runs Gentoo starts up in noticeably less time than ANY machine I've seen running Windows XP (new or otherwise).

All I can tell from this article is that the author must be a Linux hobbyist at best. I've been using Linux for 5 years as well, and somehow I can get a nice quick Linux setup with X going on my old P166 ThinkPad with 64MB RAM, while they couldn't seem to figure out a decent setup for a 600MHz PII/PIII with 128MB RAM. Maybe the problem is somehow Mandrake or Fedora... or maybe I'm just magical.

Maybe if so many people have these horrible performance issues that the author speaks of, then they should just stick with Windows. I'll happily buy their old hardware for cheap and create more of my "magically" productive Linux boxes. Oh well, that's enough of my opinions; hey, maybe I should write an editorial too...

re: topic
by sno on Thu 10th Jun 2004 16:54 UTC

Well, all I'm going to say is that I don't think you should be installing an up-to-date distro like Fedora or Fedora 2 on older systems (1+ years old). If the hardware is older stuff, then use a lighter distro, or even an earlier version of the same distro; e.g. a 300MHz machine with 64MB of RAM would run fine on Debian Woody or Red Hat 7.

my 2p

I have to agree with him
by NinjaMonkey on Thu 10th Jun 2004 16:59 UTC

I definitely agree with the article. When I first started using Linux it was great because it didn't have demanding requirements, and it definitely made my system feel faster than Windows. As time went on this became less and less the case. At this point they almost feel the same running on identical hardware. It seems to me that Apple has done the best job keeping system requirements low, as OS X runs great even on old hardware.

Right now Linux's problem is definitely the GUI. Hopefully it will become less of a hog.

RE: The Great Lithium
by Tom Nook on Thu 10th Jun 2004 17:02 UTC

"That's complete bullshit"

I suggest you read all of the comments (yeah, it might take some time now!). There are LOADS of messages from people who agree with the article. Your experience may be different, but do read the comments and you'll be surprised. Many, many people have found the increasing bloat to be intolerable, and WinXP nicer on such hardware.

Hey, it sucks that so many people are unhappy, but it's a fact we have to face if Linux is going to become more popular.

linux is not bloated, linux apps are bloated

The Linux kernel + some basic libraries + bash run quite happily on a 486 with 8MB RAM ... even with modern distros.

What might cause the slowness and bloat: are there any unnecessary daemons (apache, mysql, sshd, exim, etc.) installed by default with Mandrake? They might eat some RAM.

Also, KDE or Gnome eats a lot of memory; if you want faster response under X, switch to a lighter window manager (I personally use Blackbox; it eats about 2MB of RAM and is really fast even on a 486 with 24MB of RAM - unless I run Mozilla or a similarly bloated application, of course). On newer ('bout 500MHz or so) PCs, starting the X server and Blackbox takes about 2 seconds. Starting KDE takes at least half a minute or a minute.

So Linux applications are the problem, not Linux, and you can choose to use less memory-hungry ones (use Blackbox, Window Maker, FVWM, etc. instead of KDE or Gnome)... or if you don't want to, just buy extra memory; it's not so expensive.
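A quick way to sanity-check per-process memory figures like those is to ask ps for a process's resident set size. This is a rough sketch only: RSS counts shared library pages too, so treat it as an upper bound, and substitute the PID of your window manager (blackbox, kwin, etc.) for the shell PID used here as a stand-in.

```shell
# Print the resident set size (RSS) of a process, in MB.
# $$ (this shell) is just a placeholder PID; point it at your WM.
pid=$$
rss_kb=$(ps -o rss= -p "$pid" | tr -d ' ')
echo "PID $pid resident memory: $((rss_kb / 1024)) MB ($rss_kb KB)"
```

Running this against blackbox and then against kdeinit and friends makes the gap described above concrete.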

by Mark on Thu 10th Jun 2004 17:06 UTC

Is there going to be threading on commenting? There is no logical reason why there should be 320 comments spread over 25-some pages. How am I supposed to reply to a comment someone made 310 comments ago without it getting lost in the shuffle? I am not even bothering to click into comments 31-45; I am already tired of trying to find replies to me, let alone seeing who is actually replying to ME.

This isn't a flame, this is a rationalization of the status of the commenting here. You cannot have this many comments totally unorganized and expect anyone to benefit from debate.

Great Article
by Gabor on Thu 10th Jun 2004 17:09 UTC

This is the BEST OSNews article I've read this year. I agree with the author 100%. What the hell is happening? I remember my Pentium 166 running KDE fine back in 1996. Now I have an Athlon 3200+ system, and gnome-terminal is eating up my CPU power.

"Spend hours tweaking and hacking." Oh come on; Linux can be what you want if you spend hours tweaking and hacking. Just spend a week or two downloading & compiling the apps you want. KDE/Gnome needs all that power, but you can run a distro on a 200MHz system with 64MB of RAM.

Linux is good for old pc's
by gunnix on Thu 10th Jun 2004 17:18 UTC

Linux is fast on old PCs; I run Debian on an 8-year-old P1 166MHz. I use it for chatting, email and browsing.
I don't feel any need to use a faster PC for those things.

It just depends on using software that is designed to be fast (doh). There's Skipstone/Backarrow/Dillo to browse, Sylpheed(/Claws) for email, irssi/xchat for IRC, AbiWord, Gnumeric, SciTE, Ted, etc.
The only weak point is the browser. Although Skipstone/Backarrow are good enough for normal users, they are not widely available as packages.

The guy said in his article he wouldn't let his friend use Fluxbox. I wonder why, as it's really easy to use.
And there's always XFce, which is enormously easy for newbies, and fast in comparison to Gnome/KDE.

GNU/Linux is about choice, there's no one solution for everything. The author of the article doesn't understand that.

Could it be?
by iwaki on Thu 10th Jun 2004 17:18 UTC

That M$ has discovered the way to ruin Linux: making their programmers develop Linux applications that consume CPU/RAM/HD like the M$ apps do? I think so!

It's free software. If you're that interested in its performance, learn to program and make some contributions (and see how good your code is). And if that's not good enough, try to buy something that's better (hey, why pay $50 for more memory when you can get the super-de-duper Windows XP for $250?!)

And for the last time, when you run top, and X appears at the top of the list, you need to subtract the amount of graphics memory you have from the amount X reports to get the amount of main memory X is using. And if you don't know what that means don't even think of quoting "top" in a message.
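To make that subtraction concrete, here is a tiny sketch with illustrative numbers only (not a measurement from any particular machine): the video aperture is memory-mapped into X's address space, so top attributes the card's VRAM to the X process.

```shell
# top might show X with an RSS of 96MB on a box with a 64MB video
# card; subtract the memory-mapped VRAM to estimate how much main
# memory X is actually using. Both figures below are made up.
APPARENT_MB=96   # what top reports for X (illustrative)
VRAM_MB=64       # the card's video RAM (illustrative)
echo "Approximate real X usage: $((APPARENT_MB - VRAM_MB)) MB"
# prints: Approximate real X usage: 32 MB
```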

And if I see one more person on these forums use the word "bloat" again I'm going to heave this PC in front of me through the window.

Seriously, get a clue. Well done, Eugenia, on bringing in a nice big audience of sub-Slashdot adolescents. Sure, who cares about the quality of the site when you can get a lot of hits?

Re: linux is not bloated, linux apps are bloated
by Tripp on Thu 10th Jun 2004 17:39 UTC

"Also, KDE or Gnome eats a lot of memory, if you want faster response under X, switch to lighter windowmanager (I personally use Blackbox, it eats about 2MB of RAM and is really fast even on 486 with 24 mb of ram - unless I run mozilla or similar bloated application of course). On newer ('bout 500 mhz or so) PC's start of X server and blackbox is about 2 second. Starting KDE takes at leas half a minute or minute."

Most sensible post so far. Absolutely right. That's the beauty of open source: you can change GUIs; you are not restricted to using Gnome or KDE. You can switch to a lightweight window manager, which is something you can't do with Windows. There, you are stuck with what they give you: no mixing and matching, and only a little tweaking. People still seem to have trouble grasping that the real value of open source is in its versatility. If you know how, you can do damn near anything with it. With closed source, you only get to do what they let you.

X Windows
by Jon Huber on Thu 10th Jun 2004 17:41 UTC

Linux command line: great stuff. Linux for a server: great stuff. Linux for the everyday guy: not such great stuff. People have a hard enough time with Windows. As far as responsiveness goes, drop X and your problems are solved. There needs to be a ground-up graphics system developed to replace X, kind of like the Y Window System or Aqua on the Mac. One common toolkit and one common desktop would give the industry something common to work off of.

About time someone said this.
by Jim on Thu 10th Jun 2004 17:42 UTC

Honestly, I agree with everything the author said and think that it's about time someone actually said it. Thank you.

RE: Gnome is going to get slower?
by Heinrich on Thu 10th Jun 2004 17:44 UTC

Well, with all this discussion about rewriting Gnome in Java or C# to make it a more developer-friendly platform, it's definitely going to take a performance hit.

The problem is not Java or C# (with a JIT you lose little (~10%) over C code, and could see an actual improvement if the JITers start doing a better job of optimizing for the current processor; something C compilation cannot do for binary-distributed code -- which is what normal end-users get). The problem is really poorly architected code (initially, or not refactored often enough) and just plain sloppy code. You can easily achieve several hundred percent improvement with well-written code and carefully selected algorithms.

any coconuts here? os-9
by Anonymous on Thu 10th Jun 2004 17:52 UTC

Actually, OS-9/6809 still lives, 20+ years on. The Color Computer (and clones) still has users and an annual conference, the Nth Annual "Last" Chicago CoCoFest, where N == 11 in 2002. A group of Canadian programmers rewrote OS-9/6809 Level II for the CoCo 3 (w/ address translation circuitry) for efficiency, and to take advantage of the native mode of the Hitachi 6309. Today's serious CoCo users now typically have replaced the 68B09E in the CoCo 3 with an Hitachi 63B09E and run the rewrite, called "NitrOS9." The combination is fast. Very fast. Very very fast. Especially considering it runs on an 8-bit CPU! Observers are usually astonished, as the benefit of proper (ie, cleanly engineered) operating system design is not widely known, and certainly not widely appreciated, among users of the commercially dominant operating systems (eg, Windows and OSX versions of the MacOS).

Oh boy...
by Mephisto on Thu 10th Jun 2004 17:53 UTC

Well I am not going to wade through the 300+ comments but I did check /. which seems to basically be mirroring most of it. (Note threaded discussions are much easier to follow.)

What I want to argue is it is not the "Linux Platform" that is getting fat, it is the "Linux Desktop Platform" that is getting fat.

You can run a console-based server with a very low amount of RAM and make use of it. As an example, I have a firewall/bridge running a 2.6 kernel right now, and free reports 48MB used: 6MB as buffers, 29MB as cache, and only 12MB active. Granted, it is not doing much right now, but it is a baseline.
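Those free figures are worth decoding, because free counts buffer and page-cache pages as "used" even though the kernel hands them back to applications on demand. A rough sketch, using the numbers quoted above (this is an approximation, not the same thing as the kernel's "active" accounting):

```shell
# free(1) counts buffers and page cache as "used", but the kernel
# reclaims both on demand; subtracting them approximates what
# applications are actually holding.
USED_MB=48; BUFFERS_MB=6; CACHE_MB=29
echo "Approx. application memory: $((USED_MB - BUFFERS_MB - CACHE_MB)) MB"
# prints: Approx. application memory: 13 MB
```

That 13MB is in the same ballpark as the 12MB active figure reported above.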

As far as GUI interfaces go, Gnome and KDE are both no longer targeted towards older platforms; point made. But there are other choices available. My personal choice is Slack running FVWM2 with a heavily customized set of menus. Yes, I am aware that FVWM is a window manager, not a desktop, so they are not directly comparable. But not everyone needs a desktop (thus the heavily customized menus).

If you prefer Windows, use it. Linux is not in competition with MS, it is an alternative. Both sides have their zealots but I look at it as "use what gets the job done for you." Not "use what others tell you will get the job done." There is room in the world for both MS and Linux, and I don't think either are going away any time soon.

It's just going to get worse...
by Russ Pridemore on Thu 10th Jun 2004 18:00 UTC

I mostly agree with this "rant". Footprint seems to be ignored by many in the OSS community, but I was pleased to see Mozilla & Opera pushing for less intensive implementations of a web forms upgrade. Mozilla's Gecko was originally supposed to be very small and fast. They seemed to have lost sight of that goal for a while, but Firefox is an improvement for which I'm grateful. A modern, standards-compliant browser will most likely never be thought of as small or light...

I'm a longtime Linux/Gnome user. I used Red Hat 4.2 through 6 on an old 200MHz Pentium with 96MB of memory and was mostly happy with the performance. I now run Gentoo on an Athlon XP 2100+ machine with 1.5GB of memory, but don't see a performance increase in proportion to the added horsepower.

What really scares me, though, is the comments from the Gnome community about switching their development language to something like C# or Java. You think Gnome performance is poor now? Just wait. I understand that many developers are more skilled in these modern languages than in C, especially given that a lot of OSS contributors are college students. But the thought of running a desktop written in Java makes me shudder (even though I make my living developing in Java).

by Kingston on Thu 10th Jun 2004 18:05 UTC

I've been saying that Fedora and Mandrake are bloated for a while now. You think that just because you've posted an article about it that the fanboys will pay attention now?

Good luck.

Actual speed comparissons
by twowheels on Thu 10th Jun 2004 18:07 UTC

At work I have to use Windows XP, but have Debian Unstable installed in VMWare and have VMWare running 24/7. I have many of the same applications running in both, OpenOffice, FireFox, Opera, Thunderbird, gaim, gimp, etc... I find Windows to feel QUITE PRIMITIVE compared to KDE 3.2!! KDE has a LOT of nice touches that Windows doesn't. Bloat? Maybe, but I feel much more productive with the ability to customize many more keystrokes, gestures, etc, etc. Many of those small touches make a huge difference in my user experience. I have almost all eye candy turned off in both OSes, though I have more turned on in KDE than in Windows.

That said, on this computer with a 3.0GHz processor and 2GB RAM, WinXP as host and Debian unstable as guest (with 512MB RAM allocated), I started and ran each of the applications listed above, timing how long it took from clicking the shortcut until the application was stable and ready to use. In every instance I got nearly the EXACT SAME results in both OSes, even though Linux was at a disadvantage due to having 1GB less RAM and running in a virtual machine. In some cases Linux was 0.5 seconds faster, and in some cases Windows, even with the same app.

Considering how much MORE eye candy KDE has, as well as those numerous "nice touches", I'd say that KDE is QUITE impressive.
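For anyone wanting to repeat this kind of comparison, a crude version can be scripted. This is only a sketch: judging when an app is "stable and ready to use" really needs a human watching, so the snippet just brackets a command with wall-clock timestamps, with `sleep 2` standing in for the real application launch.

```shell
# Crude launch timing: whole seconds of wall-clock time around a
# command. 'sleep 2' is a placeholder for the real app (oowriter,
# firefox, ...), whose readiness a script cannot reliably detect.
start=$(date +%s)
sleep 2
end=$(date +%s)
echo "Elapsed: $((end - start)) s"
```

Run the same script on both OSes with the same application to get comparable numbers.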

Packaging is the Problem
by Charles Maier on Thu 10th Jun 2004 18:10 UTC

The Linux kernel and core utilities (GNU/Linux) are still small, fast and powerful, and are getting more so with time. XFree86 (and derivatives) is a monster software suite, but it has made standard desktop systems possible. Recent moves toward XML-style config files, CORBA-style object management, scalable vector graphic images, chrooting everything and application/menu hierarchies exceeding 10 levels all serve to slow everything down. All this trash is I/O-bound, making extra memory and CPU speed a waste of money.

I think we need to distinguish between Linux, the lean, mean kernel, and the fat, lazy applications that sit on top of it.

Commercial groups are interested in sales, not software design. Open source projects are the opposite. We must leave room for both groups to co-exist, but perhaps not in the same room, and certainly not on the same computers.

re: New X might help
by ME on Thu 10th Jun 2004 18:12 UTC

>Now that X is being developed again things might improve,
Yeah, until they decide to integrate the cool stuff in the server (giving transparency and the like) and voom, X uses twice as much memory. But it doesn't seem to concern anyone if you look at the mailing lists ;)

Thumbs down - if no solution is found
by Leenus on Thu 10th Jun 2004 18:18 UTC

I am not a Linux geek - I am a basic user of the platform for some fairly basic needs.

I had used Linux a couple of years back in a large scale trial of deploying used hardware in schools in India. The advantage that Linux had over Windows was that we could step up the performance of computers with older hardware. It was a very important consideration.

If it is indeed true that Linux's hardware requirements are going up to match Windows' (and there seems to be general agreement on that), Linux is losing an important raison d'être in the minds of people like me.

We want software that does not continuously place increasing demand on hardware - not only from an economic angle but also from the perspective of additional effort needed.

Listen, guys! We need an OS that is usable, dependable, cost-effective and hassle-free. If Linux cannot fulfil these needs, we will go elsewhere, even to Microsoft.

by hallgreng on Thu 10th Jun 2004 18:32 UTC

I think the problem is that people are using the latest versions of Gnome and KDE on their old machines.
A PII 400MHz with 128MB RAM will run Win98 and Win2k fine, but WinXP will be noticeably slow. It'll also run KDE < 3.2 and Gnome < 2.4 (I don't use either, so I don't know which older version would be more suitable ^_^;;)

When you install Windows, you don't automatically get XP and all its crappy bloat unless it's an XP disc. When you install a Linux distro, you get the newest (or close to it) version of all this software.
You wouldn't install WinXP on a P75, so why would you install KDE 3.2?
If you have a slow box and you keep it updated via apt or emerge, it will be slower after a year, simply because we live on the bleeding edge (compared to Windows) of software development.

Everyone wants the latest version. That's how MS gets businesses to buy WinXP even though they have Win2k throughout their office. Linux software isn't getting too fat; people are just mismatching versions with their hardware.

by David Ontiveros on Thu 10th Jun 2004 18:34 UTC

Even if there are certain hardware expectations, there is no excuse for writing inefficient code. As open source goes, we can look at code, see what parts of it are inefficient, and write to the author of the software, or modify it ourselves and publish it. Also, we should appreciate the fact that such a multitude of projects even exists.

Gnome is an embarrassment
by Anonymous on Thu 10th Jun 2004 18:34 UTC

It just keeps getting slower and slower. If I could, I'd go back in time to 1997 and throw Miguel into the path of a speeding bus.

Excellent Article
by Joe Papac on Thu 10th Jun 2004 18:38 UTC

I totally agree with the author. I use all three: Linux, Windows and OS X. Even by ditching Gnome/KDE for a light WM, you still run into the performance bottleneck with OpenOffice and Mozilla. File compatibility with MS Office is necessary, and OpenOffice does a pretty good job with that; however, it is a DOG to run. On my iBook G4 (800MHz), OpenOffice takes so long to open that I get pissed off while I'm waiting. I deleted it and use MS Office solely because it is much more responsive.

This is a Hot Article...
by David on Thu 10th Jun 2004 18:39 UTC

This pendulum swinging from one side to the other isn't getting anyone anywhere. "You have to look at this and that." "Oh, Windows takes up XXXX of resources." Everything he says is totally true - with the tools, desktops and hardware that he was trying to use.

No other general-purpose OS in existence has such high requirements. Linux is getting very fat.

Windows XP does need 256 MB if you want to get anything meaningful done and get it to run at a nice performance level.

His box, an 600 MHz 128MB RAM system, ran Windows XP happily, but with Mandrake it was considerably slower.

With Mandrake, I totally agree that he would have found it terrible with what he was using. Considering he was using GTK applications like Mozilla (with its cross-platform toolkit) and Evolution - no shock there, really. And of course we all know that OpenOffice is functionally good but an absolute dog to run. It wouldn't be so bad if it were only a dog when you had many applications already open, but it isn't: it is slow to start up in whatever environment you put it in. Hopefully OOo 2 will be better, because it is a big enough project to focus on getting that right.

I don't know why he didn't just use KDE applications to be honest, especially for web browsing and e-mail. They're all there, and that's what Konqueror and Kontact are there for.

If he wanted a fast desktop with pretty good hardware requirements he should have just used a 100% KDE desktop. I've run KDE 3.1 and above on a few P300s with 128MB, and it ran great. I'm not just talking about starting the thing up, which is how Windows requirements are calculated; I'm talking about running seven or eight applications all the time - e-mail, browsers, music, etc. I think the situation has got even better with KDE 3.2.

Once you start using a lot of GTK applications, not just one or two, your desktop really grinds to a halt. I'm not really that surprised he found Gnome and Mozilla on Linux slow at all. I used Evolution for two and a half years prior to switching to Kontact. Feature-wise it is a great application, but you could never run it and other GTK apps on a system with anything less than 256MB of RAM. Using DDR instead of old SDRAM made a huge difference to the performance of GTK applications, whereas it made a moderate, but noticeable, difference to Windows and KDE, as you might expect.

He's right and wrong. You couldn't run NT4 or 98 in a business with 64MB at all. I know; I ran that amount of RAM and worked 9-5 with both, and that was five or six plus years ago. Linux desktops should focus on working extremely well within the limits of 128-256MB, and by that I don't mean those being the minimum requirements. KDE has proved that that is possible. You can get a pretty reasonable desktop within those limits, and it is not entirely unfeasible for companies to upgrade the memory in their desktops. In terms of eye candy, there is a lot of 3D hardware, most of it on-board, that has never been used on a business desktop.

RE: Thumbs down - if no solution is found
by Mephisto on Thu 10th Jun 2004 18:41 UTC

Listen, guys! We need an OS that is usable (depending on your choice of desktop: check), dependable (check), cost-effective (check) and hassle-free (maybe). If Linux cannot fulfil these needs, we will go elsewhere, even to Microsoft. (If you want to run a current Gnome or KDE on older systems and cannot get the performance you want, I guess you will be better off in Windows; your choice. If (big if) you have experience running a large-scale Linux deployment and prefer it, I would argue that buying more memory for the systems and running your preferred Linux distro would be more cost-effective and easier on maintenance than purchasing licenses for Windows.)

by David on Thu 10th Jun 2004 18:46 UTC

You could run NT4 or 98 in a business with 64 MB at all. I know

Above should read 'couldn't'. I have enough of typing today....

Windows XP runs great on 128MB
by Anonymous on Thu 10th Jun 2004 18:48 UTC

I can totally run mspaint.exe and Solitaire at the SAME TIME with only 128MB RAM and 384MB of swap committed. As long as I don't win at Solitaire, the thing is ROCK SOLID.

by Anonymous on Thu 10th Jun 2004 18:49 UTC

If you can get it to run on 300MHz and 128MB, how much faster will it run on 3GHz and 1GB?

Legacy to blame
by nosrail on Thu 10th Jun 2004 18:50 UTC

I feel that legacy toolkits such as GTK1, FOX and Motif are to blame for the bloat. When you load up one of these applications, not only does it take up extra memory due to the different libraries, it looks ugly and inconsistent with the rest of your apps.

There are still a few applications using them, such as XMMS and GnuCash. Major distros such as Fedora, Mandrake and Debian need to dump applications using these from their official sources and put pressure on legacy applications to upgrade to GTK2.

When you are running 100% GTK2 applications in conjunction with XFCE, it is really fast and there is a lot less memory usage. The next step would be to rewrite KDE in GTK, but that would probably be too much to ask; it's simpler just not to use KDE applications.

What the heck?
by Shawn on Thu 10th Jun 2004 18:52 UTC

The author proposes we go back to the dark age of software use and development:

* Code that is as small and efficient as possible almost always means less portable code, which means software can cost more

* Code that is as small and efficient as possible often takes a lot longer to write and is usually harder to debug than something written quickly, with basic efficiency, and generically

* Current software development, and even hardware development, is centered around the idea that the programmer should not have to micromanage memory or other resources; the base operating system or hardware should take care of it (a good example of this is how C# and Java pretty much do away with a programmer being directly involved in almost all memory allocation and deallocation)

* Software can have a much smaller "footprint" if we all decide to go back to using ASCII, having no internationalization support, and pretending everyone speaks English and uses a single measurement system. Support for multiple languages, translations, localizations, etc. all increases a program's "footprint". This is a huge source of "bloat", if you look at it that way, in many applications.

* The "Linux Platform" (as the author sees it) could be a lot faster in many cases if hardware manufacturers were open about their hardware specifications. There are many advanced hardware acceleration features and bits of functionality that Linux users are forced to do without because manufacturers refuse to provide that information. This is no one's fault but the manufacturer's.

Now, to be fair, I do agree that programmers used to be able to do a lot more with a lot less. But let's put things in perspective. I think that while many programmers who started in assembly language or plain C and now use a high-level language like Java, C#, Python, Perl or C++ may have fond memories of low-level efficiency and control, very few of them miss the headaches and the large amounts of time wasted debugging or writing in a low-level language.

The author needs to step back and realize that hardware is more powerful because users always want more, and market realities demand that software comes to market faster and faster. To go back to the days of super efficient software would require many sacrifices, time being the main one, and time is money in this modern age.

The real solution to this problem lies within changing the expectations in the market, and I don't see that happening...

Follow Firefox's example
by Eric Garland on Thu 10th Jun 2004 19:00 UTC

I've noticed it seems to be much easier to get properly architected, fully functional software to run fast than it is to get fast software to be properly architected and fully functional. Build it right first and trim it down after. The desktop focus of Linux-based software is new.
Give them time.

I'm very impressed with Firefox. A lot of good work has gone into making it smaller. It would be nice to see similar work go into Gnome, X, OpenOffice, and the lots of other little Linux-based apps that make up the desktop. Now that they have things working rather well on the desktop, it might be a good idea to go back and spend a little time trimming things down.

I'm very impressed with the results in Fedora Core 2 so far. I'm running it on two machines with 1GB and 512MB of RAM, so the bloat doesn't bother me. It is a good idea to make Linux distributions work well on old hardware, but I'm OK with them making it work well first and focusing on old hardware second.

I remember
by rds on Thu 10th Jun 2004 19:16 UTC

I was at a local community college, and we were using these "ancient" machines: K6-2 ~400MHz systems. They installed Red Hat 9 and, as expected, they were slow. Sometimes people would click an icon and the app would show up as much as 5 minutes later (Mozilla, OOo). Me, I installed ROX-Desktop, keeping Metacity because OroboROX wasn't out. Instant speed boost. Even using Mozilla and OOo was possible. Not instantaneous, not by a long shot, but they loaded in "reasonable" times (~30 secs). Still not acceptable by any measure, but it was usable for what I was doing.

At the end of the day, the problem (with the major DEs, at least) is that GNOME and KDE are not bothering to worry about the real problems they have. They toss everything into a large, monolithic design that doesn't scale well. They don't bother trying to keep things small, because the developers use increasingly beefy machines (on which things still don't load fast enough to be acceptable) and can justify it under the false guise of "integration." A user expects and needs a quick-loading, responsive, easy-to-use DE. If they worked more on compartmentalizing the various parts of the DEs, and less on "integration means everything is one app," they'd have faster, more stable DEs and less to debug, and with D-Bus or what have you they would STILL have easy integration among the various parts -- only now it would be just the parts the user wants and needs.

Easy fix
by Nicholas Borrego on Thu 10th Jun 2004 19:17 UTC

Don't use Fedora...
I started out with Red Hat and Mandrake while trying out a few distros a few years back, but quickly moved on to Slackware. Now I use Gentoo and Debian, though I've been trying out a few distros, one of which was FC2, because I wanted to revisit RPM now that I know how to actually run a Linux box on my own. The only word that came to mind when I tried apt, apt4rpm and yum was TERRIBLE. I was impressed with how the system looked, with the bootsplash and all, but as soon as you actually start messing with the system you want to puke.
Linux hasn't gotten fat; it runs just fine with a number of distros. If you're the type of person trying to run Linux on a really old box, then use Debian, Arch or another stripped-down distro. If you're a newbie and all you have is an old box, then use Knoppix.

by Mr. Banned on Thu 10th Jun 2004 19:37 UTC

So I've gotten tied up doing my 9 to 5 thing, and when I get a chance to check this thread again, it's been infiltrated by some retard Linux guy with his friggin' W's and G's.

I've brought this up many times in the past (and been promptly modded down by Eugenia, which makes me wonder why she and her ego aren't modding down W/G boy), but OSNews needs a login/membership model if it wants to play with the big boys. You can see here the damage that a public posting system can do.

In the same sense, though, if you provide such a login method of providing feedback, you can't immediately mod down everything you disagree with or feel isn't pertinent to a discussion -- something that will be very hard to accept for some of our moderators, I'd imagine. I'm still seeing items modded down that don't need to be, while threads responding to those modded-down posts are left intact, leaving readers to wonder what the hell it is they're not seeing that the person replying did. Not cool or professional.

But if you go with a login-type system, be prepared to be a lot more open-minded than you have been in the past. If you mod down people who've gone out of their way to become members, for some ridiculous reason, you'll lose your readers faster than you currently are with your moderation.

On a similar note, I've often wondered just why no one's invented a method for zapping people who purposely abuse boards such as this, with like 100k volts.

Alternatively, a button which provided the rest of us with a valid home address for posters like WG boy would also be very helpful. Then, when we got the chance, those of us inclined to do so could pay said morons a visit to show off the new bats and lead pipes we're so proud of. Now that would be the kind of internet I want to be a part of!

re: Mad Echidna & Tiny Apps
by Mr. Banned on Thu 10th Jun 2004 19:40 UTC

I came across this site a couple of years ago. Enjoy:

everybody falling off topic?
by ben weaver on Thu 10th Jun 2004 19:47 UTC

The average user cannot just turn on Gentoo or Debian and be expected to use it... the average user doesn't even know how to run the automatic updates on WinXP ;) So what they are left with is either use XP or use one of the big three -- Mandrake, Red Hat, SUSE -- all of which are more bloated than XP is at first boot. And like I said before, I have my mother's computer (P2 350, 128MB RAM, 4GB HD) running XP just fine. She runs Post-it notes, Internet Explorer, MSN Messenger with Messenger Plus, and WordPerfect all at the same time, just fine. Sometimes she throws in Webster's dictionary and a game on top with no problems.

I really don't know what people are talking about with this disk thrashing, long load times and all that. A bunch of crap if you ask me. The article was very good; it obviously brought out some good discussion. Companies need to develop with older systems in mind -- as well as the latest and greatest -- to get the biggest exposure in the market. If the Linux distros that are easy for the casual user focus less on bloat at first boot (and let you turn on features you don't really need later), then maybe they can penetrate the market where Longhorn will fail: the older, less-updated market.

Quit looking at this article and thread through your own geeky eyes... not everybody can run from a command prompt. Look at what's best for Linux as a whole. Some people would rather spend $50 on more important things than RAM or other upgrades, you know?

I hear that
by Anonymous on Thu 10th Jun 2004 19:54 UTC

I've got a 266MHz laptop with 192MB RAM in it that runs Win2k just fine. Current versions of Linux were unusable on it with several distributions, with X, GNOME/KDE/even IceWM, and Mozilla Firefox as the only things running.

@ nosrail
by Anonymous on Thu 10th Jun 2004 19:54 UTC

"The next step would be to re-write KDE in GTK"

Why not rewrite GNOME in Qt? ;)

"Major distros such as Fedora, Mandrake, Debian need to dump applications using these from their official sources and put the pressure on legacy applications to upgrade to gtk2"

Wow, forced upgrades. That strategy sounds familiar.

by Gabriel Ebner on Thu 10th Jun 2004 19:54 UTC

Phew! That's the longest comments section on OSNews I've ever seen.
OSNews is turning more and more into a second Slashdot every day. Congratulations, Eugenia!

by warflyr on Thu 10th Jun 2004 19:56 UTC

Linux isn't getting fat; it's those shitty distros that are getting fat.
Fedora, for example, has what, 5 different word processors, another 10 text editors, 4 spreadsheets, 10 different media players, and a kernel with support for nearly EVERYTHING compiled in. Fact is, Linux ISN'T for novice users; if you can't tell what packages to install, or what to compile into the kernel and what to leave out, YOU SHOULDN'T BE USING IT. Fedora, Mandrake, etc. are a joke -- why even use them? I've used Slackware for years, and it has always been better than Windows. Maybe not for OpenOffice (admit it, I haven't seen anything that can beat MS Office), but when playing games it has amazing performance. I installed Gentoo a couple of months ago, and the performance is even better! Load times in UT2004 are 1/10 of what they are in Windows. Literally, it takes 5 minutes to load some maps in Windows; those same maps in Linux, 30 seconds! Why don't you think about your "silly rant" before posting something like this? It just sounds like you don't know what the hell you are doing.

Re: Legacy to blame
by David on Thu 10th Jun 2004 20:01 UTC

The next step would be to re-write KDE in GTK, but that would probably be too much to ask than to simply not to use KDE applications.

I hope that was an attempt at humour.

Honestly, saying Linux is getting fat because you have tried only the BIG Linux distros is like saying "I broke the Internet" because your dialup connection is down. Don't pass judgement on ALL Linux distros when you haven't even tried them all. Have you even tried any smaller distros? Even CD-ROM-based distros like DSL (Damn Small Linux) or SLAX? Take a minimal PC, say one with under 256MB RAM... let's go even lower and say 128MB RAM... then install VectorLinux on it, then do your testing. Pass judgements based on fact, not opinion.

---Just my 2.46 cents---

by root on Thu 10th Jun 2004 20:12 UTC

Wait till they start using Mono or Java purely for desktop development.

Agree w/ author
by hotdragon on Thu 10th Jun 2004 20:15 UTC

The goal of Linux is to beat Windows or OS X. First, of course, Linux should run well on present high-end boxes. Second, Linux should take share from Windows on the old boxes. I prefer a slim, beautiful and usable Linux, and so I still stay with Red Hat 8.0.

gnome slow?
by simon on Thu 10th Jun 2004 20:17 UTC

My feeling about GNOME is that almost every release has been faster than the previous one. GNOME 2.4 is so much faster than earlier GNOME 2 releases. And GNOME 2 is not slow on a PowerPC 180 with 192MB RAM. Drawing performance took a slight hit with antialiasing, and some applications (such as gnome-terminal) are slow, but sorry, GNOME 2 is fast. Nautilus gets faster with each release. I really don't see how people can be convinced that GNOME is slow.

FreeBSD 4.8
by Rene Pawlitzek on Thu 10th Jun 2004 20:24 UTC

I tried Red Hat 9 on my aging IBM ThinkPad 1720i (Pentium II 266MHz, 128MB RAM). It wasn't usable. FreeBSD 4.8, on the other hand, put new life into my box.

Actually, when that happens a great deal of compatibility code can be removed. In theory, just that could boost performance instantly.

Consider also that, as the VMs are tuned for GC, size and speed, the apps are tuned along with them, meaning you reduce requirements across the board instead of in one or two apps. With proper profiling support in a JIT or AOT compiler, you would again see a number of benefits from one fix, instead of the rather small advances made today by fixing issues on an app-by-app basis.

When you look at it as "ha, we'll take C and put it in an interpreted environment," sure, it sounds slower. When you describe it as "we're using a large, static library that takes care of a number of speed, performance and security issues for us," it's a much better and more honest trade-off.

It's X
by bill turkle on Thu 10th Jun 2004 20:29 UTC

X windows has always been bloated and slow, it is the culprit. Let's face it: the community has to ditch this abomination and rewrite the 'graphical interface' from scratch.

It all depends on what you use
by Dmitry M. Shatrov on Thu 10th Jun 2004 20:38 UTC

Recently I installed ASPLinux 9.2 (an FC1-based distro) on a Pentium 233 MMX box with 64MB RAM. X Windows? Xfce4 feels just _good_, and all the other light WMs (Sawfish etc.) are _very fast_. Naturally, I can't use KDE3 or GNOME2 on such a box. But look: gcc is _OK_ (I've compiled a kernel, though not in 8 seconds like they say it takes on a 32-processor SPARC beast), I can browse the web (Firefox), and I can do just about everything people did when a P233 was a high-end dream PC.

Now, after working a while with GNOME/KDE, execute 'top', press M to sort by memory, and look at, say, the X server's virtual memory demands. To me, it's more than impressive. As developers, we have to do something about it.
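For anyone who wants the same snapshot without an interactive top session, a rough equivalent (a sketch; `ps` field support varies slightly between systems) is to sort processes by resident memory:

```shell
# Top memory consumers, roughly what top shows when sorted by memory.
# RSS = resident set size in KB, VSZ = virtual size in KB.
ps -eo rss,vsz,comm --sort=-rss | head -n 10
```

Note that the X server's virtual (VSZ) figure usually includes mapped video memory and shared libraries, so the resident (RSS) column is the more honest measure of what it actually costs you.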

Fedora Core 1
by simon on Thu 10th Jun 2004 20:54 UTC

I installed Fedora Core 1 on my AMD 3000+ with 1GB RAM recently, and it was slow.

This is very sad = /

v ...
by Anonymous on Thu 10th Jun 2004 20:57 UTC
RE: Linux is getting fat
by Gary on Thu 10th Jun 2004 20:57 UTC

No, your brain is getting fat!

Why give him Mandrake 10, which was released a good two years after Windows XP, instead of something born around the same time? Mandrake 8, Red Hat 7.3 or SUSE 8 would have been a better option, with (perhaps stepping forward for convenience) Firebird instead of Mozilla. Software grows with time, and if you want a fully featured, sexy "today" desktop then Mandrake 10 or Fedora is the answer -- but it will cost you speed on a lower-spec machine. I still have SUSE 8 and Red Hat 7.3 CDs sitting here, and I would've given them to any pal asking for a recommendation for a P600 128MB machine. I use SUSE 8 on a similar spec to your friend's in the office every day -- works fine for me ;-)

It's all true
by Anonymous on Thu 10th Jun 2004 20:59 UTC

Linux has become bloated. Developers constantly upgrade their computers and make the software for their own systems, so software constantly scales up its bloat in step with Moore's Law.

Speed Concerns
by root on Thu 10th Jun 2004 21:24 UTC

While I do agree that profiling, optimizing and debugging are extremely painstaking, costly and arduous, and hardly come for free, there are distros designed primarily for speed and optimization.

Today, the most popular Linux distro notorious for speed is Gentoo Linux, not Mandrake or Fedora. So if speed and optimization is a concern, you might reconsider selecting a Linux distribution designed with that goal in mind.

On the same note, I really don't know of many open source projects that incorporate optimization and profiling into their development process. The trend today is to use languages that are convenient, safe and secure at the expense of speed and resource efficiency.

The next generation of software, unfortunately, will be extremely resource-inefficient, with development time, safety and security as the excuses (OS X and Longhorn, to mention a few). It's unfortunate, but that's reality.

The question of which operating system is least responsive is a matter of perception. Microsoft and Apple spend time and money on visual tricks to make users think the system is responsive when in reality it isn't. Throughput for throughput, Linux is a speed demon compared to OS X or XP.

Linux and its accompanying open source projects are only beginning to focus on the desktop and GUI after years of focusing on raw throughput, multithreading, scalability and robustness (attributes required of server operating systems).

I do agree that developers need to incorporate the art and science of optimization and profiling into their projects. But if the trends I'm observing are correct, the next generation of software will be incredibly inefficient, resource-wasteful and slow, even on a supercomputer.

In other words, the next generation of programmers will be Python addicts whose mantra will be "Memory is cheap!" (hence, they have every right to abuse your resources) :-) I think programs will only get slower because coders are lazy, don't care about machines with lower specifications than theirs (a typical programmer working for Red Hat, Sun, Microsoft or Apple has at least 1GB of RAM and 3GHz of processing power), and care only about convenience and development time, because according to them, "hardware is cheap."

Contrast that with an era when real coders had to fit a whole operating system into 4 kilobytes of RAM. Heck, even Vim (one of the lightest editors on Linux) can't run in 4 kilobytes of RAM. Moral of my rant: times are changing; adjust and deal with it!

dont think its that bad
by SNAKY on Thu 10th Jun 2004 21:29 UTC

I have been using Fedora Core 2 on my server for about 2 weeks now, and I must say: it has about 92MB RAM and a 300MHz CPU, but it still runs well. While reinstalling my normal PC I even used it to play games (nothing big, though), and it still went very well.
So no complaints from me.

vector linux
by linux torrent on Thu 10th Jun 2004 21:35 UTC

I just installed Vector Linux on a P233MHz with 64MB RAM, and it is actually very usable and not slow at all.

Some people tell lies here.
by Maynard on Thu 10th Jun 2004 21:47 UTC

I have yet to see a computer with 128MB of RAM run Windows XP fine -- I mean, with anything else installed. I have installed XP on a computer with that much RAM and it would take forever to load, and would not do much. I have tried it with a 300MHz proc and 320MB RAM, and not much joy. I burned many a coaster there trying to make CDs.

If "usable" means that things appear on your screen and you can click on them, then XP is usable. But if, like me and the company I work for, you consider 128MB RAM on a 633MHz PC too little to put Windows 2000 on, then it's another ball game.

Everyone knows Fedora is optimized for i686, so if you are running it on a 486, you are out of luck: it runs, or rather crawls. I am perfectly happy with it here. I hardly see any swapping unless I am compiling something. Right now, with Evolution open, typing in Epiphany, a galculator window open, an Inkscape project open, and Rhythmbox playing in the background, I am using only 243MB RAM and 8KB of swap -- basically not swapping. OK, it's a 2500+ XP processor, but memory usage is generally in line with Windows XP. And I am pretty eye-candied to boot.
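Figures like these are easy to check yourself; a minimal sketch, assuming a Linux /proc filesystem (values are in KB):

```shell
# RAM and swap totals and free amounts, straight from the kernel.
grep -E '^(MemTotal|MemFree|SwapTotal|SwapFree):' /proc/meminfo
```

`free -m` presents the same numbers in megabytes and also shows how much of "used" is really just buffers and cache.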

While, theoretically, there is some element of truth in your postulations, the fact still remains that I will need a lot more system resources to run programs written in VM-based/Interpreted languages.

Take Eclipse[1], for example. On a system with 512MB of RAM, Eclipse is reasonably responsive and seamless to use. However, try using Eclipse with anything less than 256MB of RAM, and the challenges of managed environments and applications become painfully apparent.

The fact is, VM-based/Interpreted applications are horrendously expensive to use, system resource wise. Practically, I doubt a desktop environment based on Mono or Java will be faster or more resource efficient than the environments we have available today.

I'm quite positive that if all the applications on my desktop were written in Java today, it would render my 1.4GHz Athlon with 256MB of RAM obsolete. I already experience problems running just one Java application at a time, let alone if everything I ran were written in Java, or Python, or Mono, or whatever.

[1] Eclipse is an excellent open source integrated development environment written in Java that you can install and use for free.

Goal of Linux?
by Jens on Thu 10th Jun 2004 21:47 UTC

What's the goal of Linux? Sure, modern distributions tend to demand big computers, but they are intended for such computers. Linux is about choice, so if you want to use an old computer, don't blame KDE/GNOME for that computer being slow -- old hardware was never the goal of KDE/GNOME. Try using a different window manager, and stop bugging users with the same useless discussion. Linux will never be interesting to normal users unless it is preinstalled and there are new games for it, so...
This is all basically useless.

Congrats Bob
by Kay on Thu 10th Jun 2004 21:49 UTC

Hit the nail right on the head.

Hi my name is linux and i am a hog
by rikard on Thu 10th Jun 2004 21:51 UTC

If we are actually having this discussion, we have a serious problem!
If linux (ok some distros) is actually close to rivaling a microsoft OS as performance hog, we have a serious problem!

Do the guys working on GNOME/KDE really think about performance when they code? How does the review process work?
I too feel that the bloating has gone too far. I'm not sure whether it's because of bad software architecture choices or something wrong at the source level -- maybe a combination. I have seen some horrible open source code: for example, using arrays instead of linked lists where elements get reorganized a lot is usually not a swell idea.
I think the bottom line is that we coders have been fooled into thinking that everything should be designed for portability, flexibility and reuse. This costs us performance (not always, but most of the time). If we're designing a module or application that has a well-defined purpose, why not just design it for that specific purpose?
I think many people feel that their code looks so much better with a bunch of inheritance levels, dynamic run-time memory allocation and, on top of it all, a fancy reuse pattern.

RE:It's X
by Anonymous on Thu 10th Jun 2004 21:59 UTC

"X windows has always been bloated and slow, it is the culprit. Let's face it: the community has to ditch this abomination and rewrite the 'graphical interface' from scratch."

HAHA! Spoken like somebody who has yet to run X on DirectFB. As far as KDE goes, it is there as much as it can be for now: all of the necessary parts are in place. The 2.6 kernel sped up the entire system, the KDE 3.2.x series offers many speed improvements, and the GTK libs are being cleaned up and will eventually be faster. I am very anxious to see the speed improvements as the remaining pieces develop.

As things stand, Linux will NEVER compete with Windows.
by ThePooBurner on Thu 10th Jun 2004 22:29 UTC

I am an XP user who has heard about how great Linux is from his friends for years. Makes everything better, yada yada. The thing that got to me, though, was the whole distro thing. I have looked into switching, but everywhere I go it says "you must go get KDE/GNOME/WINX/an ice cream sundae in order to have a good experience and be able to use or do anything, but our thingy doesn't come with this, so you need to go hunt down the right binary for it on some other site that we haven't linked -- but make sure it's the right binary for the compiler our distro uses, or when compiled, H4X0R5 will steal your MHz." It's a wonder you all haven't complained about it being bloated before. It's just about impossible for an average user to get started with Linux. Not only that, but if I switched, half my games and programs would not work, at which point I would ask myself "why am I using this if I can't run anything I want to?" You all want choice and stuff, and it has ended up with every Joe Friday making his own distro, making it impossible for someone to pick the one that will do everything he wants, because he has to see if there is a distro that supports it all, and if not, he has to learn to compile his own after getting some other limited one. The only way Linux will EVER compete with Windows is the day you simply put the disk in the drive, it installs, and then it is ready to go with anything you want. Plug-and-play install, functionality in all programs and games, and speed -- lots of it. "But I want to be able to customize it." So put stuff in that allows customizing, but don't make whole new distros based on new user settings and crap. If KDE and GNOME are required these days, mix them into the OS or something so that people don't have to try and figure out why they can't do anything on this "better" OS. If I want a free OS that works and does what I want, I can get XP off the P2P or from a friend.
Until the time that you turn it on and it just goes and does what it is supposed to, it will always be lacking. Crap, get the XP Pro source and make a secure OS from that that works better than the actual XP, and we will all be set.

wtg bob!
by Rho on Thu 10th Jun 2004 23:14 UTC

I've been noticing this for years, especially with KDE (mainly because that's what I used to use, not that anything else is any better). I loved KDE 1; KDE 2 wasn't bad. Now KDE 3 is out, and it's a dog sometimes. I switched to Fluxbox for a while and just recently went back to KDE, but it's beginning to get to me again, and I think I hear Fluxbox calling.
Seriously, apps are getting bigger and bigger. I've often wondered whether software writers get kickbacks from hardware manufacturers to keep requirements up so the hardware makers can keep selling new machines.

v @Rho
by Zilu on Thu 10th Jun 2004 23:20 UTC
I thought developed meant "Refined"...
by Musashi12 on Thu 10th Jun 2004 23:50 UTC

As in, "The proggy is very developed, the algorithms are blazing fast, the GUI looks good, it's been tuned, tweaked, and chopped, so it only uses half the space it used to, and it can do twice as much as it used to!"

That used to be the mantra of a whole generation of coders. I guess they're all dead now. My XP box isn't big or fast by today's standards, but it's the best I can do, and it isn't much faster to use than the first 4.7MHz XT I put together 20 years ago. It's a lot prettier, and it does have some capabilities that just weren't available 20 years ago (due to cheap hardware, not any great advancement in programming skills).

Menuet is refined, the entire install fits on a SINGLE FLOPPY DISK, not a couple of CD's or DVD's. QNX also fits into this catagory of "refinement". OpenBEOS looks like it's trying for this also.

Linux used to be for old machines, for people that couldn't afford any better, for students, for people with limited resources. Now it looks like all the Linux authors have learned their lessons from Microsoft exceedingly well. I guess the poor will just have to drop back to using FreeDOS, since their machines won't run "up-to-date, supported" versions of Linux.

*Sarcasm* I'm sure Linus would be proud. *End Sarcasm*

Some culprits
by James on Fri 11th Jun 2004 00:08 UTC

Internationalization (non-ASCII locales).
Anti-aliased fonts.
Smooth scrolling effects.
Too-fancy themes & windows.
Unneeded monitoring etc. daemons (mdmonitor, portmap)
Journaling filesystems

In no particular order ...

But it's well-known that a lot of work has gone into supporting non-ASCII non-English environments, and one effect of that has been new bugs, distraction from optimization, and slowing down stuff like grep, sed, awk, which used to be fast. I'm sure it'll improve, but this is breaking new ground. Believe it or not, newer OSes are tackling more issues, and bigger issues.

Having said that, I'll stick with my now-ancient Mandrake 8.0 on my workstation for now. Everything feels faster ... On servers without X, the newer GNU/Linux releases are excellent.

by Bryan on Fri 11th Jun 2004 00:15 UTC

I don't agree with this article at all. My brother and I have the exact same machine; I run FC2 and he uses XP. I have yet to have any kind of crash, and rarely does anything ever slow down. I can't say the same when I head over to my brother's computer: IE crashes left and right, the Start menu is terribly slow, he has to reboot constantly, etc. Sure, Fedora might not be right for that 233MHz computer from 1995 lying around, but that's where Slack comes in nicely. And if Linux won't run speedily on it, I'm positive XP won't either.

accept it - X has to go
by xlynx on Fri 11th Jun 2004 00:32 UTC

We can have the features as well as the speed, but we need an environment that makes efficiency more transparent to the developers.

We need an X replacement which includes a standard and efficient toolkit, network transparency as fast as MS remote desktop, and local efficiency that's good enough to make all the embedded graphical layers redundant.

Linux has potential not only on older PCs but in an entirely new market: handhelds. If the embedded machines run the same graphics layer/windowing environment as the desktops, we unleash whole new possibilities.

It is possible.

XP Embedded
by Russ on Fri 11th Jun 2004 00:36 UTC

Try Windows XP Embedded.

It's a customizable, module-based OS, just like Linux: pick what you want, and that's what you get.

Don't Want the IE binaries and dll support, or even the explorer shell? you don't pick it... put in what you want.

It's amazing how fast this is.

I set up a computer for my grandmother with 64MB RAM and a 300MHz Celeron... a total crap machine, but it runs Windows XP Embedded, with the Explorer shell, IE, and Office 97, very well. I peeled out a ton of other stuff, like sound support, since it has no sound card and she just needs a basic word processor/email/internet. It has support for the modem that's in there, and it has a software firewall. I built in CA eTrust InoculateIT 6 as well, in incoming-only mode, to enhance performance while providing decent AV.

It boots in about 17 seconds to the desktop, with all themes etc. disabled. Word 97 launches very fast (5 sec), and so do IE and her webmail (3 sec).

I've tweaked the registry's memory-management settings, set up the pagefile at the beginning of the disk on its own partition, and applied some other general XP tweaks.

Did I mention it uses about 350MB of hard drive space for everything? I have an old 1.2 GB caviar drive in there...

I daresay it was faster and easier, and you end up with a better result, than if I had tried to go with Linux...

Also, I can walk her through some simple clicking around and changing settings in a GUI over the phone, rather than explain how to type exact console commands that are case-sensitive, etc.

GUIz Speed future ???
by Da Wini G on Fri 11th Jun 2004 00:48 UTC

Even with more hot features in KDE & GNOME and the GUI in general, I see all these visual improvements as extras. They are eye candy; they do not add to the productivity of the environment. With all the silly features disabled, the KDE of the future should, in an ideal world, be as fast as KDE today -- or faster, through development.

I agree with all the stuff in the article. The big Linux-desktop-on-all-computers-out-there dream will burst quite easily if the Linux experience is slower and therefore less productive. Linux might be getting far in markets where the kernel's qualities and capabilities matter (my guess -- I'm a newbie), but the main desktops (GNOME, KDE) must be addressed soon. As Win98 support ends, people are going to be looking for something cheap and fast as a replacement, and I would hope for it to be a Linux distro that can please the user and might even make them feel proud to be running Linux. And of course Longhorn is coming some time in the future... so that open source software can spread to desktops -- to offices and other non-geek, non-experienced places.
Maybe a new .org will spring up dealing with Linux GUI responsiveness...
After all this moaning: open source is a great idea, and it has all come very far. I just hope it doesn't derail due to too much enthusiasm for more apps to bombard Windoze with. Quality, not quantity -- there is a big enough choice already; now on to the quality.


How about an adaptive user environment -- automatically different settings and optimizations on different systems, a GUI that adapts to the system's abilities?
But then, I'm lazy and love suggesting stuff without doing much myself.

by Seo Sanghyeon on Fri 11th Jun 2004 00:49 UTC

xlynx, before talking about the excellence of Remote Desktop, try NX. It beats RDP flat.

by A nun, he moos on Fri 11th Jun 2004 00:54 UTC

"I have looked into switching, but everywhere I go it says 'you must go get KDE/GNOME/WINX/an ice cream sundae in order to have a good experience and be able to use or do anything, but our thingy doesn't come with this, so you need to go hunt down the right binary for it on some other site that we haven't linked.'"

This is completely false. ALL modern Linux distributions come with X/KDE/GNOME and everything you need for a fully functional GUI. It gets installed automatically, too. You mustn't have looked very hard...

And to those who say that X is slow, that's not true at all. X is a great protocol, some people feel the UI is not totally as responsive as Windows because X lacks desktop double-buffering - but this feature is coming pretty fast, with the compositing package developed by the fine folks at

On my Athlon 900MHz KDE with X is quite snappy.

running FC2 with 256mb ram or less
by JLR on Fri 11th Jun 2004 01:02 UTC

I have run FC2 on machines with 256MB of RAM or less with no noticeable lag -- here's how:

1. When you do the install -- do not use the graphical installer -- do the install with the text-only switch from the installer prompt -- you get the older ncurses-based redhat installer which still works fine...and uses less ram...

2. Never do a "desktop" install -- always choose either "server" or "custom" install -- this grants more leeway as to what packages get installed...

3. Your swap partition does not need to be as large as the installer says. Linux does need swap, and by common practice the swap partition should be at least the same size as physical RAM -- but it does not have to be so. Most of my systems run FC2 just fine with between 80-150MB swap, and I have seen large swap partitions slow stuff way down -- I do not know why, but it does happen.

4. I personally have never gotten gnome to run at acceptable speeds on any pc regardless of specs -- but I have gotten kde (even kde3.2) to run fine by being careful with the bells and whistles slider setting when kde runs for the first time-- low spec system set the slider for less bells and whistles --

5. After install, some gnome junk will get installed anyway... ruthlessly go through and remove it with rpm -e --nodeps gnome* -- leave the gtk and gtk+ libraries, though; you will need those later on...

6. Manually edit your .xinitrc file to startkde -- if you do not do this -- then redhat by default tries to start some gnome stuff even if it has been removed...

I have done these steps even on a pc with 128mb ram and kde still worked fine...

7. for test purposes, install at least 1 light window manager like icewm or windowmaker or fluxbox or twm etc. etc.
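Steps 5 and 6 above can be sketched as commands (a sketch only, assuming an rpm-based FC2 system -- run as root and double-check the package list that `rpm -qa` prints before actually removing anything):

```shell
# Step 5: list installed gnome* packages, then remove them ignoring
# dependencies. Keep gtk/gtk+ -- other applications need them.
rpm -qa 'gnome*'                        # review this list first!
rpm -qa 'gnome*' | xargs -r rpm -e --nodeps

# Step 6: start KDE instead of the Red Hat default session.
echo 'exec startkde' > ~/.xinitrc
```

The `xargs -r` flag (GNU xargs) just avoids running `rpm -e` with no arguments if nothing matched.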

@Jeremy Friesner
by Matt on Fri 11th Jun 2004 01:08 UTC

Jeremy Friesner wrote "Any recommendations regarding distros to try for this?"
The folks at dynebolic will make a custom live CD distro for a very reasonable price:

and my two cents on the article: He's right that the main distros are getting fat. But there are some light distros like dynebolic which are pretty cool and light on hardware.

it's not the programmers' fault; each of us who codes for a living
knows the truth:
it's incredibly boring to wait forever until the code is compiled, so in order to gain productivity in the work chain (code, compile, debug, test) there are only 4 options:
1° code faster
2° compile faster
3° debug faster
4° have a lot of bug reports, goto 1

point 4 is easy: release often (OSS mantra)
point 3 too: forget about memory management (mono, java, python, perl+gtk...), use widely used libraries (glib, whatever), forget about designing clean GUI code (use tools like glade, eclipse or others)
point 1 is hard: we have only two hands and 24h a day, and some of us have a life
point 2 is easy: buy a lot of RAM, buy some new Athlon 3600+, a faster hard drive... and the whole chain is accelerated... so productivity has increased by a factor of ten...

why is that?
because in the open source world, there's no gain from selling programs; the gain is in support or other things, so development has to happen at the lowest cost in order to go on. It's like a vicious circle: everything is free, there's no money to spend hours optimizing stuff -- and for what reason? every 6 months speed doubles...
for the third world?
we, our governments, force them to borrow money from the World Bank in order to pay for the machines they will use... so they'll give it back to us...

for students? in Thailand the Linux computer has managed to increase the copying of illegal Windows versions a hundredfold in 6 months; in the real world only geeks care about Linux...
the only, only reason to switch to Linux is to be away from viruses and the like, and it's just a matter of time before we get screwed by that too...

by anonymous on Fri 11th Jun 2004 02:08 UTC

On all my machines (P4 1.8GHz with 512MB; P4 2.8GHz with 512MB; K7 500MHz with 320MB), Windows 98 and Windows 2000 are snappier. Further, you can run more applications on them with less memory compared to KDE 3.2/GNOME 2.6 (kernel 2.6).

I really love Linux, so even if it is not snappy, I still use it now. Yup, Linux distros are getting better, from Mandrake 7-10, Red Hat 5-8(?), kernel 2.4-2.6. But I hope we can make it snappier than Windows. If we do this, in my opinion, we will kill Windows regardless of the current state of Linux applications.

So please work on the performance areas. I believe in the open source community.

@anonymous
by A nun, he moos on Fri 11th Jun 2004 02:24 UTC

But I hope we could make it snappier than Windows.

X development is quite dynamic these days. I believe that the XFree86 implosion has given it a new momentum, and there's lots of exciting stuff on that end. On the DE front, I know that KDE keeps improving in performance (can't speak for Gnome, I personally don't use it). Anyway, expect a snappier Linux desktop experience soon.

If we do this, in my opinion, we will kill Windows regardless of current LINUX application states.

Well, I don't think that's going to change things that much, but it will certainly help. I know I'll enjoy it (though I really can't complain now).

Further, you can run more applications on them with less memory compared to KDE 3.2/GNOME 2.6 (kernel 2.6).

That's not quite true, actually. Memory usage is different for Linux and Windows. Sometimes it may seem as though more memory is taken, but that doesn't mean all that memory is unavailable -- a lot of it is cached data, kept in memory in case it's needed, and flushed if the memory is needed for something else.
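To make that concrete, here's a little sketch of the arithmetic that tools like `free` do on /proc/meminfo-style numbers (the sample figures below are invented for illustration): the "free" figure alone understates what's really available, because buffers and cache are handed back under memory pressure.

```python
# Sketch: effectively-available memory from /proc/meminfo-style fields.
# The sample numbers below are made up for illustration.
sample_meminfo = """\
MemTotal:       255624 kB
MemFree:          8432 kB
Buffers:         21560 kB
Cached:         112880 kB
"""

def parse_meminfo(text):
    """Parse 'Key: value kB' lines into a dict of ints (in kB)."""
    info = {}
    for line in text.splitlines():
        key, rest = line.split(":")
        info[key] = int(rest.split()[0])
    return info

info = parse_meminfo(sample_meminfo)

# Naive view: almost nothing free.
naive_free = info["MemFree"]

# Realistic view: buffers and cache are reclaimed on demand.
effectively_free = info["MemFree"] + info["Buffers"] + info["Cached"]

print(f"naive free:       {naive_free} kB")        # 8432 kB
print(f"effectively free: {effectively_free} kB")  # 142872 kB
```

So a box "using" nearly all of its 256MB may still have well over 100MB ready for applications.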

So please work on the performance areas. I believe in the open source community.

Keep the faith!

by ideasman on Fri 11th Jun 2004 02:26 UTC

Everybody has an opinion on this.

Mine is that this is the fault of distros trying to be too much (just like MS does). GNOME & KDE are both too bloated, so don't use them.

I use icewm, firefox, thunderbird (probably a bit mem hungry)

I bypass the desktop environment; for a file manager I use Midnight Commander.

Now- Not everybody is like me, so..

ROX+Firefox+Thunderbird+Icewm - too_many_deamons = runs_well_on_older_box

I think GTK2 needs some speedups too.

Using Linux these days gives the impression that your fast machine is already obsolete and left behind. Boot times, application execution and memory usage became extremely terrible and I started to notice this since Red Hat 6.0.

That is the article I really wanted to write myself -- something I always wanted to put out but never knew how. Thanks to Bob Marr for doing it.

Old PC
by reubot on Fri 11th Jun 2004 02:28 UTC

It's funny, but right at this very moment I'm on a Pentium 200 MMX with only 160MiB of RAM, running KDE *and* compiling something, and getting decent performance.

@Lord of Come
by A nun, he moos on Fri 11th Jun 2004 02:42 UTC

Using Linux these days gives the impression that your fast machine is already obsolete and left behind.

I don't get this with my Athlon 900; to me it feels pretty fast. Not quite as fast as the Pentium 1.8 I've got at work, but really, the difference is negligible (unless I'm compiling; that takes less time on the Pentium).

Boot times, application execution and memory usage became extremely terrible

Boot times are comparable to Windows if you measure both from power-on to the time when the OS has finished loading. MS does a nice trick of giving you a GUI quickly while it continues to load the system in the background. But is a few seconds here really worth all the hoopla? How often do you reboot? Look at my current uptimes: 46 days for the server (power outage) and 5 days for the workstation (new kernel). Maybe Windows gives you a GUI, what, 10 seconds faster? Does that really make such a difference over such long periods of time?

I'm not sure what you mean by "application execution", but clearly you don't understand Linux memory usage. Just because memory is shown as used doesn't mean it's not available -- a lot of it holds cached data that is flushed if the memory is needed.

and I started to notice this since Red Hat 6.0.

Actually, in KDE's case, performance has increased with KDE 3.2. Those guys are doing a lot of optimization.

desktops desktops desktops...
by dub on Fri 11th Jun 2004 02:47 UTC

We all know that Linux is a great OS for servers; that is its market for now. The old-day gurus came up with pretty good solutions and a great philosophy: KISS. But when we talk about Linux as a desktop system, we see another philosophy. Developers aren't setting their priorities like in the past: instead of focusing on resources, they focus on the user. Priority no. 1 must be efficiency, quality, and simplicity -- one thing does one thing, but it does it well. Then, when everything is working really great, start adding fancy non-productive features.

by Seo Sanghyeon on Fri 11th Jun 2004 03:01 UTC

ROX is great. I recommend ROX for any machine where Nautilus/Konqueror is too slow. It has most features a file manager needs...

For the Big Two DE's it's all about the RAM
by Matt on Fri 11th Jun 2004 03:02 UTC

I know it's been said, but I just want to point out a misperception: with GNU/Linux, the perception of speed in the DEs is not affected by CPU cycles on any machine above 233MHz; it's affected by the amount of RAM. I am running a Celeron 400 with 512MB and it hums along quite nicely. I even noticed a significant "speed" increase UPGRADING from Mandrake 9.2 to Mandrake 10. If you have enough RAM, the optimizations in KDE and the 2.6-series kernel are really noticeable. Yes, the RAM footprint is larger, but I wouldn't hesitate to run KDE or GNOME on an older system as long as I had at least 384MB of RAM.

by Ryan MacDonald on Fri 11th Jun 2004 03:11 UTC

This is exactly why I use FreeBSD whenever possible. It is as lean and clean as you like. FreeBSD puts everything great about UNIX into a powerful and consistent package. And the ports system is miraculous.

Even More FUD it seems!
by -=Solaris.M.K.A=- on Fri 11th Jun 2004 03:29 UTC

Wow! Just wow.

I can run Linux quite comfortably on an old laptop with a P2 266 and 96MB of RAM, running KDE 3.1 and the old 2.4 kernel. I can run OpenOffice just fine!

I can have all the themes I want and no viruses, spyware or stability issues!

But hey, Linux is huge, right? Get real! Have you guys looked at Longhorn? The thing needs gigs of RAM!!!!!

Even XP is damn slow unless you have over 500MB of RAM, and so is Win2k3!

More fud from the m$ camp!

I don't see the problem vs. XP
by wayne on Fri 11th Jun 2004 03:42 UTC

I'm running Mandrake 10.0 dual-boot with XP on four desktop PCs with the GNOME desktop, and I really don't notice any performance loss vs. XP on the Linux desktops. At work, where my Mandrake 10 desktop is the only Linux box on an 18-station Microsoft 2000 network, the Mandrake box is blazingly fast compared to XP, which is horribly bogged down by McAfee's antivirus program running in the background on all the Windows desktops.

I'm also running Mandrake 9.1 on an ancient Sony Vaio laptop with a Pentium 366 and 128MB of memory. Works better than fine. Recently, at a conference where the hotel offered free wireless, my Vaio with an Orinoco card worked flawlessly; five out of six colleagues with XP laptops couldn't connect.

OpenOffice, granted, is a bit slow compared to Office, but it really is good enough once it finally opens. Evolution is infinitely faster and more intelligently designed than Outlook. Two other programs that I use to help put food on the table -- Quanta and gFTP -- are better than Homesite and CuteFTP, their Windows counterparts.

K3b is as easy to use and reliable for burning CDs, including images, as Roxio. Xine, Totem and MPlayer all work fine for multimedia.

When it comes to doing stuff in Windows that I can't do in Linux, the list used to be horribly long, and now it's pretty much down to Quicken, TurboTax, my USB flash key and my camera's SmartMedia card. And all this stuff that does work does so without any hacking, which I am not very good at outside of PHP.

As systems grow more user-friendly, the engine under the hood gets more bloated. That's a tradeoff I'm more than happy to deal with. With AMD 2800 or Celeron 2.8GHz machines available for $400 and change, how well some distros run on hardware more ancient than my old Vaio is not a critical issue.

What's up with all the mouthbreathers?
by robert renling on Fri 11th Jun 2004 03:43 UTC

OS X barely works with 128MB RAM; try multitasking ;) (I own mostly Macs, and two PCs.)

As someone noted, Win2k and WinXP barely run with that amount of RAM.

What's all the hubbub about? A program... using... RAM?

get real.

by Seo Sanghyeon on Fri 11th Jun 2004 03:44 UTC

Well, the fact that you have failed to run Windows XP with less than 500MB of RAM doesn't contradict the fact that others do run it comfortably on 256MB. I did run XP on 256MB, so I know it's possible. Perhaps you have better Linux skills than Windows skills, or your hardware fits better with Linux. But please, don't generalize.

I assumed you didn't lie.

Havoc v. Marr
by K on Fri 11th Jun 2004 03:48 UTC

Bob Marr says: "[Havoc] may have talent in writing a lot of code quickly, but it's not good code".

Bob Marr, you will note, "is a sysadmin and tech writer, and has used Linux". Oy.

I think we all know exactly how good Bob Marr is at writing code.

re: Havoc v. Marr
by robert renling on Fri 11th Jun 2004 03:51 UTC

Well, the great thing about open source is that it scales very well; bullshit just seems to walk.

I'm personally waiting for Marr's highly esteemed report on the matter; hopefully he can provide us with some patches for the codebase as well.

Havoc Pennington on this article
by Seo Sanghyeon on Fri 11th Jun 2004 03:51 UTC

Read this:

I'm sure we'll get a patch from this OS News guy who knows exactly where the bottlenecks are and how to provide equivalent features in 1/10 the memory, due to his extensive profiling of GNOME, detailed architecture review, and overall coding skills. --Havoc Pennington

The Junk-food syndrome: Linux articles are getting fat
by Wally on Fri 11th Jun 2004 03:53 UTC

Linux would-be "journalists" make sensationalist headlines just to get click-throughs. These articles have less and less meat to them. Often they have none at all. They cry about the sorry state of affairs, but don't say what to do about it. Often they're written by people who know nothing more about computers than where the power switch is; rarely are they written by people with any programming experience at all.

Linux articles need to get better! Because they suck! You need to write better! It's bad, bad, bad, and we can't let Microsoft win! And ... yeah! Fix it!

So......what to do?
by Chris on Fri 11th Jun 2004 04:16 UTC

Personally, I agree with the author's view. But what makes the Linux platform slow? The kernel? XFree86? KDE/GNOME? Apps like OpenOffice? Or some combination of them?

Then what can be done to improve performance?

I disagree
by korbinus on Fri 11th Jun 2004 04:34 UTC

While I agree about the growing size of the two big DEs, I totally disagree about the performance: GNOME 2.4 and KDE 3.2 run much better today on my Celeron 566 than their ancestors did three years ago, which were unusable on my machine. And my computer hasn't changed in three years.

I agree and disagree
by Darren Kirby on Fri 11th Jun 2004 04:35 UTC

The author makes some good points about the bloat of the latest RH and Mandy offerings; in fact that's why I switched to Gentoo a year ago. But let's look at what's really slowing the machine down: it seems to me it's the exorbitant number of services RH feels compelled to enable by default. Strip out all the cruft and you're left with a fairly decent box. X is not the problem, and neither is KDE (can't speak for GNOME 'cos I don't use it...)

As for the point about replacing the OS on all the 95/98 boxes, presumably a company would hire an in-house or outsourced Linux professional to take care of the installation and tweaking for them.

The bigger issue is why so many Linux users feel the need to 'sell' Linux to others. Sure, some big players such as IBM try to sell Linux as the greatest thing since sliced bread, but that is not (nor ever was) the goal of Linus Torvalds and a large majority of OSS developers. They just wanted to have their 'own' system to work and play with, rather than having the software they use dictated by a monopoly.

Is it just me that's content to run my linux box and feel silently superior without pushing it on every windows user?

I totally agree
by Stew on Fri 11th Jun 2004 04:51 UTC

I've been using Linux for the past year or two and I have to agree that there aren't that many benefits to using it in place of Windows. It's a fun little hobby OS, but I think I'm gonna switch back. I'll still keep it around, but I need to get some work done now. Maybe I'll play around with a Macintosh in the future.

ho hum
by Someone on Fri 11th Jun 2004 04:55 UTC

The author is spot on and I'm afraid that if you disagree, you just may be a tad retarded.

Fedora shouldn't be looked at as a viable alternative for Linux newcomers, it's a test bed for RHEL. Red Hat admits this and is concentrating on Enterprise level sales.

A typical enterprise-sized install base is made up of relatively homogeneous desktops. The image/load, whatever, would be installed by the IT/IS department. Trimming a distro and then cloning it to each PC would be transparent to the user. Older PCs can be used as terminals to application servers. But if you've worked in an environment like this, users always want the latest PC and really are more of a pain than the OS is.

People trying to squeeze life from an old PC will always be an issue. There are several distros designed to run on older hardware. RAM is simply too cheap to argue costs. Any PC that can't be upgraded to 256 or 384MB is too old for a desktop running all the latest and greatest apps. Sure, I had an older PC and loaded Debian on it and it ran fine, but the feature list was limited.

Windows XP does start quickly, but it takes forever to shut down, so you still have to wait. Slackware 9.1 goes from POST to login in about 30 seconds. And Slackware is one of the easiest distros to install. It's a bit involved to configure, but that's a short learning curve.

XP's sweet spot is 512MB -- enough to handle any Linux distro. X is almost always the most resource-intensive application in Linux. It would be nice to opt out of the client-server relationship of X and run a presentation manager meant for a personal PC that doesn't need to push a desktop to remote clients. Even in networked environments, CDE, Window Maker or plain X are the interfaces for remote desktop production. It doesn't matter so much in remote admin, since that is a small fraction of remote uses.

These are good topics to discuss. Resource demand has been an issue that has existed for quite a while... all the way back to the first computers.

Distros should learn from the best - Debian and Slackware. Obviously there are clones of these distros that are easy to set up and run.

I'm using pretty recent hardware; e.g., this machine has dual AMD XP 2000+ processors, 1GB of RAM, a Radeon 9000 and two SCSI-160 drives, running Debian Sarge (testing). So I don't run out of resources. I find Debian runs better than 2000 or XP on this machine.

Sometimes, though, you have to look at what's using RAM. Simply because all the RAM is being used doesn't mean it's not available to the system.

g9 out

Win2k vs Linux
by James Clark on Fri 11th Jun 2004 05:15 UTC

A few years ago I tried installing RedHat 6.2 on my P166 with 64MB of RAM and RH was very sluggish. Windows 2000 ran quite fast on this machine and I could easily have 30+ Internet Explorer windows open at the same time during a typical web browsing session ;) . When Mozilla first came out with tabbed browsing I was very excited about it but the startup was extremely slow compared to IE5. And it seemed to use more resources when running as well.

OpenOffice was also dreadfully slow compared to MS Office 97 which is what I had at the time.

On low end machines it seems that Windows handles swapping better than Linux.

I just hope that when Longhorn comes out it'll be too slow for the majority of machines and that it makes Linux distros look a lot more tempting for older computers.

Right on, I've faced similar embarassment sharing Linux
by Tom on Fri 11th Jun 2004 05:19 UTC

Obviously, if anyone was paying attention, Mr Marr was comparing apples to apples -- competing product to competing product -- not MS-DOS to the Linux kernel, or the WinXP desktop to the Linux kernel, or the Linux desktop to MS-DOS. He was referring to everyday use with everyday software.

Admittedly, my boss's wife still will only use an old 386 with MS-DOS and Corel WordPerfect 3.0 or something like that, because that is what she learned, it works, and WordPerfect has all the general features of the new stuff. Her machine has never had trouble loading network services or wasted time with them, because they don't exist.

But that is not realistic for everyone else. Of course you can boot to the Linux kernel and type a letter in no time, but so can my boss with the 386 and MS-DOS (how's that for boot time and resource requirements?).

The point is that Mr. Marr is right on and has a valid point. Linux with similar apps should match or beat XP with similar apps -- and not only that, but be less prone to crash running similar apps. Bloated code generally means slower performance and more crashes.

by xedx on Fri 11th Jun 2004 05:49 UTC

wtf happened here, 400+ posts in what, 1 day? lol

Windows and other Os'
by Robert on Fri 11th Jun 2004 07:04 UTC

The article I read was interesting, and I read a lot of good comments. I tried Linux years ago and found it hard to use: drivers were a pain in the butt, uninstalling programs was hard. It just wasn't like Windows. I hear Mac OS is easy, but I don't have tons of money to shell out for Mac hardware. Way back when there were Mac clones I was going to build one, but didn't have the chance.

I used BeOS and I liked it; it was easy and a good OS. I see OpenBeOS is around, and maybe they'll make something of it. If they're smart they'll do what I thought BeOS should have done: not just market it as a multimedia OS, but as an OS everybody can use for anything. I think if marketed right it could have given Linux a run for its money and may even have passed it. It was definitely easier to use.

An OS should be easy for everyone, and you shouldn't have to tweak anything. Drivers should be easy to install and uninstall, everything should be easy to run and configure, and that's what Windows is for me. The problem I find with Windows is that after you install and uninstall programs many times it gets slower and starts to have problems. It's better with XP than it was with 9x, but it still has its problems.

In the article the author mentioned other OSes which I had never even heard of, and I looked at their web pages and some look nice. To compete with Windows, an OS has to be easy to use and have a lot of apps and games. An OS competing with Windows needs to give some of the major software makers a reason to write software for it, and so far that hasn't happened. Microsoft keeps chugging along as everyone hopes another OS gives it a run for its money, but it's not happening.

Apple could port Mac OS X to x86 machines, or let people make Mac clones and be mostly a software developer -- making hardware for people who want to spend the huge amounts of money, and keep making other innovative things like the iPod. IBM had its shot with OS/2 but lost it. The other option is that something big has to happen to make people want to switch, like someone making an OS that's totally voice-controlled, or some other input device that's better than the keyboard. Unless one of these things happens we're stuck with Windows, because everything else is either outdated or just too hard to configure for the average person.

There is a little distro I'm going to try that I hear is good, and that's ELX; I hear it's going in the right direction.

Windows may have its problems but its the best we have for now. It has the ease of use and the apps and games we all want.

Try FreeBSD
by Ewout Boks on Fri 11th Jun 2004 08:58 UTC

Not trying to start a flame war, but I run FreeBSD 4.9 on my 4-year-old machine and it feels a lot faster than any Linux distribution I have seen recently. Not really suitable for beginners, but once you run it, it is really responsive, fast, and gives a solid impression. I always have a feeling Linux starts to crack once you put pressure on it.

Old computers are old, and the fact that the majority of Linux distributions still run reasonably well on old (not older, but OLD) machines is an homage to the Linux philosophy. BUT it is time to give the developers some freedom. If you can make a great distro that requires 192MB of RAM, why not? There will still be other distros for people who don't have shiny new computers. Considering the average computer being sold today has a minimum of 256MB of RAM, why worry when you are developing for a new crowd? Even average Linux users know how to tweak a system for some speed, and most newbies to Linux will have new computers and thus won't have to worry about tweaking. It's like forcing yourself to use a 15-year-old cellular phone because it does everything a phone should do and anyone can figure out how to use it.

Tell it like it is
by Anonymous on Fri 11th Jun 2004 10:09 UTC

I am a (relative) newbie who absolutely hates M$. I kept reading about how fast and efficient Linux is, but that was not my experience. I have feared that Linux is hopelessly addicted to Moore's Law. I desperately want to see Linux succeed and beat the pants off of M$. But in addition to good functionality, reliability and security, it would be nice to see some good old-fashioned lean-and-mean code.

Still a server OS
by Anonymous on Fri 11th Jun 2004 10:14 UTC

I agree with the author. I recently installed SuSE 9.1 looking forward to the benefits of the 2.6 kernel, but ended up reverting to 9.0. My machine isn't that old -- it's a 2.5GHz P4 with 512MB RAM and a GeForce 4 card -- and it was just hellishly slow to boot. The two main apps that would make or break a Windows competitor, OpenOffice and Mozilla, are memory hogs, take forever to load, and are so frustratingly slow in comparison to their Windows counterparts as to be almost unusable.

I'm no M$ fan; I also run FreeBSD on the same machine, and it seems to run the same software much, much faster, so I know it's an issue with Linux (the kernel -- the BSD kernel I find much simpler to configure and much faster). Neither am I a newbie: I have been running Linux since Red Hat 5.2, and I have also recently tried Mandrake 10, Red Hat 9 and Debian Sarge, and have so far been so disappointed with the bloat and slowdown of each successive release that I have ended up using BSD and Windows XP as my main OSes instead of Linux.

It's a sad state of affairs, but unless the main Linux vendors get their act together and address the speed and bloat issue, would-be Linux converts will just ditch it and stick to Windows. Linux out-of-the-box is just no competition for Windows out-of-the-box on the desktop. I still have to recompile my kernel and tweak init scripts to get decent performance out of it; new users don't want to do that, and neither should they have to. (Oh, and Gentoo is not an option for new users.)

Havoc and Mayhem...
by foljs on Fri 11th Jun 2004 10:24 UTC

When extracting, GNOME-Terminal uses around 70% of the CPU just to draw the text, leaving only 30% for the extraction itself. That's pitifully poor. Metacity is hellishly slow over networked X, and, curiously, these two offending apps were both written by the same guy (Havoc Pennington).

Hmmm! I once wrote here that Havoc is the worst thing to hit the Linux desktop in some time (in my opinion, of course), but I was referring to his opinions on GNOME strategy et al.

Interesting to see that this also holds true in the case of code quality.

to everybody
by drfelip on Fri 11th Jun 2004 10:56 UTC

Just wondering if, after 412 comments, many of them complaining about Linux performance, somebody is thinking of doing something. I mean, a "Citizens for a Speedy OS" association or similar ;)

What I mean is, things don't change if nobody pushes hard enough for a change. If somebody is developing alternatives, encourage them, spread the word, contribute. Join other people who think as you do.

Re: Indeed vim
by Anonymous on Fri 11th Jun 2004 10:57 UTC

Give VIM a try, it's likely to suit your needs.

I did -- I used vim for like 2 years and I must say I hate it. Vim is very cool (fast, the best syntax highlighting I've seen, customization, etc.) when it comes to working on 1 file at a time. But when you have to work on multiple files it's really a bitch.

For a while I used Fluxbox + tabs and several terminals with vim inside, but it's just not as fast (in terms of working) for me as one application with tabs, where I can switch quickly between documents.

I have downloaded some scripts for directory browsing, semi-IDE setups for vim, etc. I even tried Cream, but they all suck big time.

The thing I like best about EditPlus is one shortcut: F12 (or whatever you set) switches between the last two used documents -- I can't live without this feature. This is the feature I miss most in all editors for Linux.

My advice: if you have the hardware and use winblowz, use EditPlus; under Linux, stick to Kate (no tabs, which sucks, but it's damn fast).
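For what it's worth, vim does have an equivalent of that EditPlus shortcut: Ctrl-^ jumps to the alternate buffer, i.e. the last file you were editing. A one-line mapping (a sketch for your ~/.vimrc) puts it on F12:

```vim
" Switch between the two most recently used buffers, EditPlus-style.
nnoremap <F12> <C-^>
```

Press F12 repeatedly and vim toggles between the current and previous document.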

by Thom Holwerda on Fri 11th Jun 2004 11:16 UTC

There is one answer to all the problems Linux has on the desktop (it's doing well on other markets):



I can't believe
by Maynard on Fri 11th Jun 2004 11:30 UTC

That people are making such a fuss about Fedora recommending 256MB (same as Windows XP) and stipulating a minimum of 192MB (64 more than XP) -- three years later, too.

In the 3 years since Windows XP was released, I got my first computer and then my second, for half the price of the first yet 3 times better; then Fedora was released, and people want it to have lower hardware requirements than XP had years ago. Please.

by foljs on Fri 11th Jun 2004 11:56 UTC

There is one answer to all the problems Linux has on the desktop (it's doing well on other markets):



Yes. Period. A bad case of PMS.

How on earth are "standards" going to help with the issue under discussion, which is SLOW, BLOATED programs that use too much memory, such as Mozilla, OpenOffice, the DEs, etc.?


Linux Gaining Weight...
by Michael on Fri 11th Jun 2004 11:57 UTC

I have a Toshiba Satellite 2710 laptop with a P3 450MHz, 64MB RAM and a 30GB HDD. I am running Windows XP Pro on it just fine! I only experience a performance drop-off after I have the following running at the same time: ZoneAlarm, MSN Messenger, Firefox (with around 30 tabs open, on dial-up), 3-4 MSN chat sessions, and 2 instances of Notepad.

I tried to install Red Hat Linux 7.3, Red Hat 9, Fedora Core 1, Mandrake 8 and up, and Libranet -- even after a kernel recompile (with appropriate changes, etc.), minimal services, XFce instead of GNOME/KDE, and HDD tuning. I had a Linux expert do the optimizations for me, too.

Fedora would hang every time during the install, and all the other distros would run PAINFULLY slowly. I could be waiting for ages just to open a directory in a window. I really want to use Linux, but this is my only machine (I travel a lot). In a CLI setup, Linux is fast on this machine, but anything else -- forget it.

It has a 128MB (64 onboard + 64 expandable) chipset in it, but the RAMBUS is dead, so I can't upgrade or use the extra 64MB of RAM I got a few years ago. So I am stuck with Windows XP Pro.

So much for the "Designed for Windows 98SE / NT Workstation" sticker that I peeled off the case the other day.

I agree with author!
by Paulius on Fri 11th Jun 2004 12:24 UTC

I fully agree with the author, and I know myself that Linux is getting bigger and slower.

I remember installing Debian on my computer this summer... The system was unstable and KDE had many bugs.

... I am happy to report I was able to convert a machine at work, an IBM 300GL (a PII-300, with 128MB of RAM initially). I did upgrade it to 256MB; the hard disk is a 4.2GB, and it's merrily running:
Xandros 2.0 Business Edition,
Lotus Notes 5.07 (CrossOver Office), Firefox, StarOffice and typically approx. 3-4 consoles,
Kopete, KNotes, ...
It's not _fast_, but it's not particularly sluggish either. Only Notes (a Windows-based application) redraws a little slowly; aside from that it's fine.
Hopefully Xandros will stay on the straight and narrow and not follow the bloat, while still offering a very comprehensive desktop.

So the choice is still yours.



GNOME Terminal *is* slow
by Frank Furter on Fri 11th Jun 2004 12:50 UTC

Havoc claims in his blog that accusations of his slow code are false:

Specifically, that he didn't even write the guilty code in GNOME Terminal. But the problem is, the latest versions show only *his* name in the About box, *and* in the Credits list. Who else should someone blame? Recent versions of GNOME Terminal are *dirt* slow, and it astonishes me that anyone could use it for more than 10 minutes without noticing.

v lynex
by Anonymous on Fri 11th Jun 2004 13:20 UTC
GNOME Terminal
by Seo Sanghyeon on Fri 11th Jun 2004 13:35 UTC

Frank Furter wrote:
>Recent versions of GNOME Terminal are *dirt* slow and it astonishes me that anyone who's used it for more than 10 minutes hasn't noticed it.

No. If you have the correct driver setup, you won't notice much slowness. It is a little slow, but nowhere near dirt slow.

kde development
by nekkar on Fri 11th Jun 2004 13:43 UTC

I'm a KDE developer and I pay close attention to the speed of my applications. Moreover, I can assure you that KDE's speed and memory requirements (after a period in which they increased a lot) are getting better and better.

P.S. I have a three-year-old computer (PIII 866MHz)


RE: I agree with author!
by Anonymous on Fri 11th Jun 2004 13:49 UTC

"I fully agree with the author and I do know myself that Linux is getting big and slower.

I remember I once installed Debian on my computer this summer... The computer was unstable and contained many bugs in KDE."

Here's a performance & stability tuning tip for Debian: you can install a stand-alone window manager (like Fluxbox or WindowMaker) and run your KDE programs from it. After installing the "menu" package you can right-click the desktop and a menu pops up containing launchers for all your installed KDE applications. The "menu" program automatically updates this right-click menu every time you install or remove applications. (I don't know if the RPM distros have a similar "menu" program -- they SHOULD.)

The small window managers are much more stable and bug-free than the big desktop environments KDE and Gnome. They also load faster and use less RAM. With Debian's "menu" program you can enjoy most of the benefits of the full-blown desktop environments with the low resource use of the small window managers.

If the bugs you've experienced in Debian are in the KDE applications you want to use (and not in the KDE environment), they are usually fixed if you wait a couple of days and then upgrade the buggy application. (Debian has Synaptic as an easy-to-use GUI for handling package management -- installing, removing or upgrading packages.) Note: the bugs get fixed quicker if you remember to file a bug report.
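A rough sketch of the setup described above, assuming a Debian system with apt (the `fluxbox` and `menu` package names and the `update-menus` command are Debian's; the `~/.xsession` path is the conventional one):

```shell
# Install a lightweight window manager plus Debian's automatic menu system.
apt-get install fluxbox menu

# Regenerate the window-manager menus; Debian normally runs this for you
# from package hooks whenever applications are installed or removed.
update-menus

# Make Fluxbox your session, so xdm/startx launches it instead of a
# full desktop environment.
echo "exec fluxbox" > ~/.xsession
```

KDE applications can then be started from the generated right-click menu without loading the rest of the KDE desktop.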

glad to hear it's not just me
by pjm on Fri 11th Jun 2004 13:54 UTC

When I first started playing around with Linux, it was Red Hat (5?) on a 486/100 with maybe 32MB of RAM. It was dog slow, but it ran. Oddly enough, it ran better serving programs over the network to an Exceed session via xdm. It wasn't something I would have wanted over Win95 (my desktop at the time), but it looked promising.

Fast forward to 2004, and watch Mandrake 10 loading on an 866 with 256MB of RAM. Very similar results: slow, not as slow as the old Red Hat on ye olde 486, but still much slower at just about everything than WinXP Pro installed on a different partition of the same machine. Is this progress?

And in the meantime, Linux distros keep changing so much and so often that it's hard to use your old knowledge. I remember a cool utility that came with most Linuxes, linuxconf. Now where the hell is it? Half the configuration tools in the Mandrake release core-dump.

I think hobby OSes like Sky and Syllable have a lot more hope of being usable than any of the major Linux distros. They're all bloat, and each version adds two new bugs for every one it squashes.

by Seo Sanghyeon on Fri 11th Jun 2004 14:17 UTC

>Don't know if the RPM distros have similar "menu" program -- they SHOULD have.

No, the "menu" system is something only a distribution as centralized as Debian can achieve. But MenuMaker, an automatic heuristic-driven menu generator, does 90% of the manual job done by the Debian maintainers.

Check it out if you want to use a lighter WM; it's a must.

But Windows XP is 3 years old
by Ben Allen on Fri 11th Jun 2004 14:24 UTC

"seeing Mandrake 10 loading on an 866 with 256 mb of RAM."

Wait till Longhorn is out and try running that on the same PC... Exactly the same effect as running the latest Linux (6 months old or so) on an old computer.

I'm not saying that Linux is faster than Windows everywhere; it is certainly beaten in some aspects (some applications' loading times, and ease of use, although if you came from Linux to Windows for the first time, Windows would be completely weird: "So what am I supposed to do if the wizard fails?"), just as Windows is beaten in certain aspects (security, WM features, etc.).

Expanding on the above, you cannot say that going to a terminal is "cheating" on usability. I can't remember how many times I have had to go to the Command Prompt to sort things out, especially for ping, netstat, etc.

think forward
by Goth on Fri 11th Jun 2004 14:31 UTC

The thing is that Linux is getting "there" earlier than everybody else. Can you imagine the system requirements of OSes in 10 years?

Some of you are pretty funny
by Brian on Fri 11th Jun 2004 15:14 UTC

The article basically says Linux apps are suffering feature creep and code bloat, and your response is that when you run Linux apps under Windows they aren't any better or less bloated. No duh; they aren't faster or less bloated under Windows. Linux was smaller and faster because it didn't try to be pretty; it got the job done with minimal warm fuzziness. Now that it has gotten on this whole got-to-be-pretty kick, it is growing to Microsoft proportions. Microsoft's code didn't get as bloated as it is from plain bad programming; it got there by adding so many prettiness features that there was no way around the size.

@But Windows XP is 3 years old
by Anonymous on Fri 11th Jun 2004 15:26 UTC

You can actually run the Longhorn 4074 build on a P3 with 256MB of RAM. Disable the WinFS service and it's as fast as XP.

That's just because it's basically XP with a sidebar, though! ;)

by Aleksander Stukov on Fri 11th Jun 2004 15:30 UTC

I don't believe that for a moment. I run XP Pro on an Opteron with 512MB. If I'm running more than one program, it can take as much as a minute just to switch between program windows. In my experience, XP needs at least 1GB of RAM to run comfortably with multiple programs.

Really? I can't believe this, dude! How can you say XP is slow on a monster like that? My Athlon XP 1900 runs great with 768MB of RAM, even when I start up both Oracle and MySQL servers. I also run GNOME on SuSE 9.0, and well, it's pretty laggy, but it stays equally laggy after starting plenty of programs: OOo, multiple instances of Firefox, XMMS, YaST2...

One thing is true: I upgraded my Athlon XP 1900 from 256 to 768MB of RAM, and let me say one thing, XP with 256MB of RAM works like sh*t. XP is a VAMPIRE, it really is a VAMPIRE; just try to open Media Player 9, ha, relax. It's true that my HDs are 5400rpm, but I just cannot understand how Micro$oft claims XP is supported with 64MB of RAM; not even a six-month-old Win98 install works fine with that.

SuSE 9 runs fast on my machine; I have nothing bad to say about it.

A. Stukov

good article
by rikard on Fri 11th Jun 2004 15:44 UTC

I totally agree with the author. It has gone too far. A Linux desktop should first of all be stable, efficient (performance-wise) and of course usable. Eye candy and fancy features should not be the first priority when choosing design and coding methods; in other words, we should not make the same mistake Microsoft has. I don't remember exactly, because I haven't used Mandrake or Red Hat/FC for a couple of years, but back then I believe I saw something that tried to automate updating by periodically searching for updates, slowing the system down. I remember thinking that this stank of Microsoft ideas. Let's agree on some Linux values. Let's focus on what we say we're good at, namely stability and performance. Some people say it's not the desktops that are at fault, and that X is the culprit. OK, so there you have it: why not gather forces and fix X?

I think I understand....
by Scottw on Fri 11th Jun 2004 16:17 UTC

I chose to add Linux systems at work because, rather than buy new machines with Win2K and pay for MS licences, I could use Linux to turn my retired PCs into efficient servers. As a Windows admin with no Linux experience, I needed the GUI to figure out what I was doing, so all our Linux boxes run with the GUI. Well, we just decided to drop further Linux installs because of reports that Fedora ran too slowly with the GUI on machines like our last batch of older ~700MHz Celeron systems with 128MB.

Unfortunately, Linux needs to run on old systems, or I suspect most small businesses will not have a case for using Linux at all.

Brother, you're right!
by Edsson Moraes on Fri 11th Jun 2004 16:40 UTC

My name is Edson and I'm a systems architect and developer who works for the government in Brazil. I really have to tell you that this is one of the best articles I've read recently. There has been a lot of effort here to migrate to Linux, because of the high license fees that Microsoft software demands. I myself tried to migrate totally from Win98 to Slackware 9.0, because I don't want to buy a new computer. But GNOME 2.2 was so sluggish on my box (K6 400, 160MB RAM, 15GB HD) that I gave up. I really would like to get free of Windows, but I can't afford to use these modern distros. I still remember when I was at college and Linux ran very fast on my computer, which made me fall in love with the system, besides its other benefits. Third-world countries are a big market for Linux, but these new distros won't make inroads into those markets with such heavy hardware requirements.

Concerning 3D acceleration
by Finalzone on Fri 11th Jun 2004 16:52 UTC

Remove the 3D driver from both Windows XP and your Linux distro, then do the comparison again. Some posters make an unfair comparison when a game like Quake 3 on Windows XP with an optimized driver runs faster than the Linux version without one.

by Nicholas James on Fri 11th Jun 2004 16:58 UTC

>How on earth are "standards" going to better the issue under discussion, which is SLOW, BLOATED programs that use much memory, such as Mozilla, Open Office, DEs, etc?

Still, standards would be a good thing.

by Ben on Fri 11th Jun 2004 17:24 UTC

I agree somewhat with the premise of this article, that the mainstream Linux distributions are getting fat to an undesirable degree; leaner is better. However, the arguments posted here amount to systems requiring a fair amount of RAM (256MB), rather than systems needing upgrading on the whole. That's hardly unreasonable considering the latest "version" of Linux came out three or more years after XP and IMO is vastly superior. I reckon that in the next couple of years the main Linux desktop systems and applications will become leaner, so we'd have an OS that can compete with Longhorn on functionality and usability while requiring only the resources of XP. That doesn't seem dire to me.

For my d*ck-waving contribution, I'd like to say that Linux 2.6/KDE 3.2.2 with all the apps runs circles around XP on this machine (P4 2.8GHz, 256MB).

The biggest detraction from this article was the name-calling of Havoc. I don't know anything about him, but it must border on libellous (and is downright unprofessional) to throw a name in like that, especially when it appears the remarks were false.

Linux Desktop Benefits = Temporary Phenomenon
by Tom on Fri 11th Jun 2004 17:35 UTC

Let's face it: the benefits the Linux desktop provides over Windows XP etc. can only be temporary. If, as many supporters seem to suggest, the Linux desktop achieves a market share comparable to Windows, how long will it be before users demand the very same functionality that bloats XP and introduces inherent security vulnerabilities? The reason spyware and viruses don't affect Linux is not that they can't, but that hackers don't see the value in infecting less than 1% of the market!

by Finalzone on Fri 11th Jun 2004 17:38 UTC

As for the "bloated" Linux distros like Fedora, Mandrake or SuSE: you can remove the software you don't need. It's a shame people fail to notice that many Linux distros offer more applications for installation than Windows XP does. In that sense, I agree with the author that Linux (here, the distros) is getting fatter. However, on the same PC (512MB of RAM), I was impressed that installing Linux (Fedora Core 2 as the model) with extra packages was faster than installing Windows XP with extra packages (without MS Office).

To do a better comparison, install Linux as a server, without OpenOffice and the extra packages, and compare it with Windows XP. That way both run with a similar configuration.

RE: @ tom 17:38
by Finalzone on Fri 11th Jun 2004 17:46 UTC

Apache is more popular than IIS, yet sees very few attacks, so your argument that hackers attack the most popular software is erroneous. The real reason for the large amount of spyware and viruses on Windows is that flaws take longer to fix in closed source than in open source. Because of this, hackers don't get enough time to fully exploit a flaw in Linux, unlike in Windows.

by Kingston on Fri 11th Jun 2004 17:47 UTC

I've been saying that Fedora and Mandrake are bloated for a while now. You think that just because you've posted an article about it, the fanboys will pay attention now?

Best of luck to you.

by Finalzone on Fri 11th Jun 2004 17:57 UTC

You can customize your own distro, can't you? Wasn't Linux designed with customization in mind?

Wasn't Linux designed with customization in mind?

No. Linux was made with Linus' 386 in mind. You are forgetting history in order to make a point. That never goes over well.

It comes down to compromises.
by Matthew M. Copeland on Fri 11th Jun 2004 18:07 UTC

If you look at the history of Linux distribution development over the last few years, you will notice that many distribution creators have been playing catch-up. They are trying to catch up with the Microsoft environment in terms of having the basic set of applications, and this has forced a lot of compromises.

The first that comes to mind is OpenOffice. When it first became really available to Linux users, it was incredibly slow on older hardware. It hasn't really gotten much better; most systems have just gotten faster, so it is less noticeable. During that same time, GNOME and KDE both had full development going on for their own office suites, and both of those suites were usable on old hardware. That is where the compromise came in: it was decided that OpenOffice would be used. This slowed down or killed development on many other projects that would have had the same functionality but been orders of magnitude faster on older hardware and very fast on current hardware.

As I sit here writing this in Mozilla on a P2-266 with 128MB of RAM running FC1, I can tell you personally that I never open any OpenOffice applications on this computer; it's just too painful. One of my first tasks after installing was to kill gdm autostarting. I like GNOME and X, but they're a killer on this old machine under FC1. Then again, Red Hat is no longer concerned with the needs of normal desktop home users: the Fedora Core series is not meant for that any more, it's meant for new business PCs, and otherwise only in the buyable form of Red Hat Enterprise whatever. As a user of Red Hat Linux since version 2.0 (I think it was), I'm giving serious consideration to moving fully over to a different distribution that is more in line with what I'm looking for, for both home and professional use.

by Seo Sanghyeon on Fri 11th Jun 2004 18:29 UTC

OO.o 1.1 is noticeably faster than 1.0, so "it hasn't really gotten any better" isn't accurate. Also, Gnumeric is very well done, with all the functions of MS Excel and a lightning-fast recalculation engine.

Well, on the other hand
by jeff whitehouse on Fri 11th Jun 2004 19:16 UTC

On the other hand, QNX and BeOS run blazingly fast even on very old machines. It's great to *say* everyone should have 256 megs of RAM, but no, everyone shouldn't; why should Joe Average have to buy a new computer every 5 years? I can run BeOS 5 fine with 32 megs of RAM and a P166. Windows XP runs sluggishly with 64 megs of RAM and a Celeron 500, but it's not as bad as you people make it out to be; it's actually quite usable. I had 128 megs in this 1.4GHz box until quite recently, and *gasp* I didn't upgrade for OO 1.1, or Firefox, or even Windows; I upgraded to play Morrowind. Now seriously, any half-clean install of Windows will run fine with much less than you people claim.

My Experience
by tKC on Fri 11th Jun 2004 22:24 UTC

I currently have a 1.2GHz Athlon with 512MB of RAM. I'll admit I did try XP Corp, but it ran HORRIBLY. It crashed so randomly and frequently (the blink-and-reboot type of crash, not a BSOD) that I was beginning to think I had hardware problems. I figured I'd slap Gentoo on there and see how things worked. I initially ran KDE and it ran very comfortably (more so than XP did on my hardware; btw, 4MB video card), though not as responsively as I like my desktop, so I went with an old favorite: Fluxbox.

Current state: Firefox, Evolution, Azureus and a few other memory hogs, and it runs them all at once VERY VERY quickly. I did a few things to optimize performance, but I really can't express just how wonderfully responsive and stable my machine is.

I say, who needs 3GHz and 1GB of RAM just yet? I consider myself a huge tech fanatic, and I don't need high-priced hardware to remain one.

From the Windows XP web site at

233-MHz minimum required
64 MB minimum supported
1.5 gigabytes (GB) of available hard disk space
Super VGA (800 × 600)
CD-ROM or DVD drive
Keyboard and Mouse

Like I've said twice before, I set my mother's computer up with 128MB of RAM and a 350MHz processor. It runs great with Messenger, IE, Webster's dictionary, and WordPerfect all running at the same time. I turned off the crap and don't use an antivirus; who needs one with a good firewall and all the updates anyway? Go to one of the antivirus web sites once a week and check your system online. It's just as easy to reformat and start over if you get a virus you can't get rid of...

Anyway, this article was about the noob distros being fat (and, as people have pointed out, making it hard to get everything you want working), not ones like Debian and Slack, so those people can quit posting... just my 2 cents.

RE: who are you people (to the anti-windows crowd)?
by Anonymous on Sat 12th Jun 2004 02:19 UTC

If WinXP runs on those specs, it will be dog slow. Also, you need to buy a decent antivirus program as soon as possible. Shame on you for playing such a dirty trick on your mother!

is there a thread i can start about that?
by ben weaver on Sat 12th Jun 2004 02:50 UTC

Antivirus software really isn't needed running 24/7. Just like any Linux distro, Windows XP is pretty secure with a good firewall, updates, and some common-sense web browsing.

By the way, I wish I could visually prove all this to you people. Maybe you can go buy a system for 10 bucks like I did and try it for yourself. Trust me, XP works just fine on older systems.

Try to make XClients!
by Giovanni Nunes on Sat 12th Jun 2004 03:43 UTC

If the problem is a lot of old hardware, it's better to turn it all into simple X11 clients and run the applications and desktop environment on a "powerful" application server! By the way, I used a simple Celeron 700MHz host to serve X to ten old Pentium MMX 166MHz machines. ;)
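A minimal sketch of the thin-client idea described above. The hostnames are made up for illustration, and the application server's display manager must have XDMCP enabled in its configuration:

```shell
# On each old Pentium box: run only a bare X server and request a
# graphical login from the application server over XDMCP.
X -query appserver.example.lan

# Or, to run just a single remote application on the local display,
# use SSH X11 forwarding instead:
ssh -X user@appserver.example.lan ooffice
```

With XDMCP, the old machine does nothing but draw pixels and forward input; all the heavy desktop software runs on the server.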

You are forgetting history in order to make a point. That never goes over well.

Thanks for the correction about the history. However, my point about Linux being customizable still stands today.

Graphical apps are too bloated
by Anonymous on Sat 12th Jun 2004 10:52 UTC

I have used Linux as my main desktop for 5 years. Sometimes I need to boot Windows to use some specialist software. It is a pleasure to use Windows (until it crashes) because its apps are so responsive. I never thought I would say this (because my other computer is an Amiga 1200), but Windows apps are fast and light. Damn it, MS Word under Wine under Linux is faster than OpenOffice, showing that the problem isn't the kernel, or even X11, but the multiple layers of bloat that sit on top of them. Every major Linux application seems to require its own GUI toolkit and its own component object model.

I cannot recommend Linux to a friend who runs Windows 98 on a 200MHz PC, because it would be impossibly slow - even if he upgraded to the maximum amount of EDO RAM that his computer can use. Windows 98 is fast enough on this machine with 64Mb, but unreliable.

Required Reading for Linux Developers
by Anonymous on Sat 12th Jun 2004 10:55 UTC

This thread should be required reading for software developers, distro vendors, and those who pretend that we can stop the 3-year hardware upgrade cycle by switching from Windows to Linux.

Linux speed/Mem usage
by Aaron on Sat 12th Jun 2004 12:21 UTC


Most distros come with lots of fluff turned on by default.

I have just one PC...

P3 600MHz
256MB ram @ 100MHz
30 GB Western Digital HD, 2048KB cache @ 5400RPM
Nvidia GeForce4 MX420 64MB

My first encounter with Linux was Corel Linux SE, and it was very slow: about 120 seconds to open Kate. My first reaction was that I had too many things loading at boot, but turning them off did not help much. I went back to DOS 7.1 (I had deleted Win98 after the constant crashing, even with a default setup and no additional software installed).

The second was Red Hat 7.2. It was great; there were some long 60-second pauses, but not often. Back then I could not install Nvidia's old RPM drivers, and Wine was not good.

The distro I use now is SuSE 9.1. I am on the same hardware, but there are no long pauses; it seems to work as fast as Windows XP. There is no crashing. Wine and Nvidia have improved vastly, and games under Wine do not shudder like they did in Windows.

Notes to make your Linux distro faster:

1> Turn off services that you do not need.

2> Use a good FS like ReiserFS, JFS or XFS.

3> Kernel 2.2.x is (*IMHO*) a pain; try to use a newer one if possible. Give compiling your kernel a try:
a. make oldconfig
b. make rpm
c. Double-click the resulting package and follow your distro's instructions.

4> If you have 256MB or more of RAM, use KDE/GNOME/whatever.
If you have 128MB or less, use a lighter WM.
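As a toy illustration of point 4, a login script along these lines could pick a session automatically (the 192MB threshold is an arbitrary cut-off borrowed from Fedora's stated minimum, and the echoed session names are placeholders):

```shell
#!/bin/sh
# Read total RAM in kilobytes from /proc/meminfo.
mem_kb=$(awk '/^MemTotal:/ {print $2}' /proc/meminfo)

# 192MB = 196608 KB; below that, prefer a lightweight window manager.
if [ "$mem_kb" -ge 196608 ]; then
    echo "Enough RAM: starting KDE/GNOME"
else
    echo "Low RAM: starting a lighter WM (e.g. Fluxbox)"
fi
```

In a real setup you would replace the echo lines with `exec startkde` or `exec fluxbox`.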

Just my
2p + VAT

Kernel optimization 2.6.x
by Aaron on Sat 12th Jun 2004 16:25 UTC

Another thing you can do, if you have GCC 3.3.3 or newer, is compile the kernel with the optimize-for-size option. This makes the kernel image smaller, so there are fewer cache misses.
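In 2.6.x kernels this is the CONFIG_CC_OPTIMIZE_FOR_SIZE option, which makes the build use gcc's -Os instead of -O2. A sketch of enabling it by hand in the kernel source tree (the usual route is ticking it in `make menuconfig` instead):

```shell
# Append the option to the existing kernel configuration,
# then let oldconfig reconcile it before rebuilding.
echo "CONFIG_CC_OPTIMIZE_FOR_SIZE=y" >> .config
make oldconfig
make
```

The resulting image is somewhat smaller at a possible cost in raw speed, so it is worth benchmarking on your own workload.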

by Maxei on Sat 12th Jun 2004 16:41 UTC

This author has done good, courageous work in telling the truth, but there are some bastards out there (as always) who fight that truth: Linux really IS GETTING SLOWER AND SLOWER. Take note, GNOME team; GNOME has always been known for bloat. People have spoken and the alarm is sounding, so don't be deaf or stupid. What can we users do? Talk to the programmers. This article is very welcome.

by Aaron on Sat 12th Jun 2004 16:50 UTC

I am not a programmer yet, but I think the teams should audit their code and remove code that is no longer in use. That should save a lot of memory.

bravo to THE GIMP community
by Maxei on Sat 12th Jun 2004 16:57 UTC

Photoshop is an incredible memory hog. On Windows 98 with 380 or so MB of RAM and a PII, it is so unbearably sluggish that it is impossible to use. Yet look at the incredible GIMP: under similar conditions, it beats Photoshop in speed. Let's hope this champ stays lean and fast, one of the most useful tools for Linux.

Re:bravo to THE GIMP community
by Aaron on Sat 12th Jun 2004 17:00 UTC

The GIMP is a great program, but it has a lot fewer options than Photoshop (although the extra options do sometimes seem useless).

by Aaron on Sat 12th Jun 2004 17:06 UTC

I think they should make the next version of the X server leaner; I have heard many arguments break out over X. When I ran GNOME on Red Hat 9, it was often using 149MB of RAM when I was just at the desktop with no applications loaded. It is getting better now, though: it only uses 64MB.

Get a better computer
by deathKermit on Sat 12th Jun 2004 17:13 UTC

If it's too slow, upgrade your hardware. A bit more RAM, a faster hard drive (going from 5400rpm to 7200 is definitely noticeable) or, heavens forbid, give your CPU a mild overclock. I have my AMD CPU running ~400MHz above stock, on the stock cooler, and it is rock-solid stable.

OH lets see
by painter on Sat 12th Jun 2004 18:35 UTC

OK, let's see. In the spirit of competition, since many 3-6 year old computers are still out there, who will step up to the plate? Microsoft, Red Hat, SuSE, Mandrake, FreeBSD? To some of us, an old computer is like an old classic car: if it works fine, we keep it. New computers will always sell, so make the product as you see fit, but what a market opportunity if you don't forget the "'65 GTO" of computers that ran fast on low-octane gas!

Linux's future
by Anonymous on Sat 12th Jun 2004 19:18 UTC

Linux's future looks positive for the long term: more and more game developers are producing great games for both Windows and Linux, the security benefits of Linux are getting more people to convert, and the price means no more breaking the bank every time a new version of the OS comes out.

However, it's what's happening right now that doesn't look so great. As for the comparison with Windows, I'll tell you my little story. I have an old P1-233 with an Intel chipset that has that awful limitation of only caching the first 64MB of RAM, and Windows seems to fill RAM from the top down, so with 128MB the computer runs slowly until 64MB of the RAM fills up. For a little experiment I installed Windows XP Pro with 192MB of RAM. The computer ran, but tasks didn't run very well: MP3 playback would be choppy (this also happened with 98). When I restricted the system to 64MB of RAM, performance went up in some ways, and XP still ran if you could cope with long start-up times for programs. I was surprised by the speed nonetheless. For a computer used just to surf the net, chat on instant messaging, and do some word processing and email, it is fine.

It goes to show that when Microsoft releases an OS, it is targeted at systems that are three years old (XP runs smoothly on a P2 with 128MB of RAM), and Linux distros used to be even better than this. Now it seems that the distros that are easy to use, like Windows is, require more system resources than they really should. It's one thing to say RAM is cheap, but when DDR RAM becomes obsolete and you only have 1GB of it while the operating system is asking for 4GB, you will end up throwing away a perfectly usable computer just because the OS is not optimised.

If Linux wants to grow as an operating system, it needs to get over this hump and start considering the people who do not have the money to keep upgrading their computers (it's bad enough that games keep requiring newer graphics cards that cost you $200 a year); we do not need our operating system making it harder by demanding constant CPU and RAM upgrades as well.

As a gamer I am forced to use Windows, but ever since Windows merged its stable NT line with its main OS line, I find myself spending all my time in Windows and relatively none in Linux, since what I need is offered in both OSes, yet Windows starts quicker, has better hardware compatibility (although Linux is a lot better now, driver support is still not as good as Windows', though it's getting there), and the main apps I use load and run quicker. I do not need all the junk that comes with almost every Linux distro; then again, I don't need all the junk that comes with Windows either, but at least it doesn't affect performance as much.

To sum up, Linux is growing yet not getting any better. It's great that free software can do so much, but unless Linux distros stop trying to make everything easier to use and great-looking, with all that unneeded transparency, growth will slow and Linux will again be used only by people who need the server capabilities that set it apart from Windows. Ease of use does not come from how good it looks, and I'm sure most people would prefer something that looks bland and runs quickly over something that looks awesome and lags while you listen to your HDD crunch away.

(Linux distros will get past this, and Windows will again take the prize for slowest OS, but how long is it going to take?)

RE: santhosh, and for all who ask "Where can I find old RAM?"
by daveiro on Sun 13th Jun 2004 17:55 UTC

You can find all types of old RAM here:

by helf on Sun 13th Jun 2004 19:15 UTC

Funny seeing all this talk about Windows and Linux running slow on such-and-such a hardware config. I'm running BeOS R5 on a 166MHz Pentium MMX with 64MB of RAM. It screams. ;)

My faster PC (1GHz Celeron, 320MB RAM) runs BeOS like a dream. I haven't had Windows or Linux loaded on it in ages; I just couldn't stand not having REALLY responsive GUIs.

Hey kids, I'm Mr. Portability!
by Wrawrat on Sun 13th Jun 2004 23:08 UTC

I can't believe that only ONE person mentioned portability as an issue.

Yes, Windows XP and OS X are faster. But are they portable? Not at all. They probably use hundreds of tweaks hardcoded in ASM to speed things up. We had proof of that in the past, when WMP was faster on P3s because Microsoft hardcoded some routines for that platform. Forget CPU-flag detection: if it was a P3, they used them; if it wasn't, better luck next time, even if the AthlonXP did support the SSE instruction set.

Oh yes, of course, there was a time when NT4 ran on four different platforms, but that's an old story. Today, Windows XP runs on two platforms besides x86: x86-64 and IA64. x86-64 is basically an extension of x86; I doubt they spent thousands of man-years porting it. As for IA64... look at the number of things that don't work out of the box:
Yes, even WMP is not supported at the moment (note that this list might be outdated; I can't really confirm anything on it, as I don't own an Itanium and don't know anybody who knows anybody who has one either).

GNOME/KDE run on many platforms and many operating systems. There are probably some tweaks for each platform, but I doubt there are as many as in XP/OS X; they simply can't do them, as it would be way too hard to develop and debug. Combine this with GCC, which isn't the most efficient compiler on Earth, and you get what you get. Note that I'm not really bashing the GCC team: they are doing an incredible job. Still, many commercial compilers are faster because they use patented techniques and the like.

Yes, most developers are not spending enough time optimising and profiling. Yes, it's sad that DEs and applications are taking more and more RAM. Yes, users have the right to complain about the quality of the product... yet I don't think they have the moral right to attack the developers directly unless they can do something better. And many users are probably still wondering whether the light in their fridge stays on when they close it. Do you think those users would code something better?

I understand that most people don't want to get directly involved with Linux to see it improved... but honestly, if you're not happy with the current state of Linux and are not willing to improve it, stop bugging us and go with something else. Contrary to popular belief, Linux is not there to destroy Microsoft and Apple. It seems that many people just want something for free. Give them an inch and they want a mile.

Like some regular posters here said, I wish there was a registration system. The signal-to-noise ratio is getting horrible and the moderation is craptacular.

This is very sad...
by Evgeniy Lotosh on Mon 14th Jun 2004 04:13 UTC

There is one more problem: localization. If you want Linux boxes to be bought outside the US and England, you need to localize them. Typically this is the business of small companies that work with the most widespread Linux versions, such as Red Hat. They will NOT work with other versions until those become widely used, which means that only the overblown Linux distros get localized. Keep in mind that PC boxes outside the US and Europe are usually significantly weaker than those in the US.

I'm not a Linux user, but I carefully watch progress in this area. Unfortunately, Windows XP is still much better suited to common office tasks than Linux boxes. IMHO, of course.

P.S. Windows XP + Office XP + IE6 run smoothly on a PentiumIII-800 with 128 MB RAM. 256 MB is recommended but not required.

Linux or Windows?
by we2by on Mon 14th Jun 2004 04:48 UTC

I have a machine with a P3 700 MHz and 512 MB SDRAM. I know how to use both Windows and Linux. Now, what do I put on it? A Windows OS, where I can get infected with viruses and trojans within a few hours? Or Linux, which is more stable and makes me feel safe when shopping online? And no, I have no money for buying one of those ****** virus scanners!
Currently I have FC2 installed with GNOME 2.6.0. GNOME is working pretty nicely, much better than KDE.
By the way, FC2 is free! I won't spend my money on a Windows license.

thank you

blah blah blah
by randy on Mon 14th Jun 2004 06:03 UTC

Within the comments are people saying:

My XP with 128 MB of RAM runs well, or My XP with 512 MB of RAM swaps constantly, or My Linux system "feels" slower than my XP running 128 MB of RAM, BLAH BLAH BLAH, etc.

THESE ARE ALL OPINIONS, PERIOD. Without real analysis and comparisons with different hardware setups, then it is an opinion and not based on facts.

If the author had compared 2 or 3 computers with different hardware setups and then benchmarked XP vs RH9 (or whatever OS) running similar apps or with benchmarking software, then and only then would this article be worth the bandwidth it took up.

He might simply have written:

I FEEL like Linux is slower than XP, but I do not have 1 bit of research supporting my opinion.

or he could have written:

end of story.

That about sums it up.

Price & performance
by lezo on Mon 14th Jun 2004 07:24 UTC

Don't forget one small difference between MDK, FC2, and Slackware on one side and Windows on the other: you have to buy Windows, while the others are available for free. Remember that fact before complaining about MDK performance or Slackware's user-unfriendliness.

by kosta on Mon 14th Jun 2004 08:16 UTC

Yes, this is true! Windows XP's RAM consumption can be trimmed to 80 MB, even with an antivirus and a firewall. What can you do with the new X releases?

by kosta on Mon 14th Jun 2004 08:20 UTC

And what, you think I've paid for my XP? :-D
All the new distros (Mandrake, SUSE) are very "BEAUTIFUL" and free, but that is all...

Other option
by I on Mon 14th Jun 2004 13:57 UTC

The article mentions options other than Linux but says that they are all immature. In reality, there are mature options too. Up to two months ago I was running eComStation 1.0 on a Pentium 90 with 64 MB! Now I run it on a faster machine (Athlon 1600), but it is so much more usable than Linux. I installed Fedora on the same machine to try things out; after seeing it was still orders of magnitude slower than eComStation, I removed Linux altogether. When it doesn't run as fast as eComStation, I feel I'm wasting my time.

For those who don't know eComStation, check out:

It's usable for all my needs, which are pretty basic. Sure, some unusual hardware doesn't have drivers yet, and you don't have the same apps as in Windows (Word, Excel, etc.), but you DO have options. In any case, 95% of my use is Mozilla, which I think is pretty common these days.

- I

I'm ashamed...
by Charles on Mon 14th Jun 2004 16:14 UTC

I'm a Linux guy (been pure Linux for 10 years now), but I'm ashamed by some of the comments here. So many comments ridicule the author, saying memory is cheap, trying to define the problem away, etc. If you don't face up to the problem, you have no right to throw stones at Microsoft.

Linux really is fat. This has been bothering me recently. I love Linux in some ways, but absolutely hate it in others.

Look at my supposedly "minimal" getty that is shipped with my distribution:
root 852 0.0 0.0 1484 356 tty6 S 07:27 0:00 /sbin/mingetty tty6
1.4M of virtual memory, 356K in core. And this is MINIMAL? A version with similar features but linked against dietlibc would consume just 20K. If we truly are skilled technical people, why are we not ashamed of this? And that's just the tip of the iceberg. Why are we shipping binary packages that include documentation in /usr/share/doc that talks about how to build them? Not only does that needlessly bloat the disk footprint, but it looks unprofessional.
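Anyone can reproduce this kind of measurement. A quick sketch using `ps` (exact column names vary slightly between procps versions):

```shell
# Show virtual size (VSZ) and resident set size (RSS), both in KiB,
# for the current shell. Substitute any PID you want to inspect,
# e.g. the PID of a mingetty process from "ps aux | grep getty".
ps -o pid,vsz,rss,comm -p $$
```

VSZ is the number the `ps` line above quotes as 1484 (about 1.4M mapped); RSS is the 356K actually resident, and both matter when judging bloat.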

I blame the distributions. We shouldn't just be packaging and repackaging the same ol' shit. Software has a life cycle. At some point, it gets too old and brittle and bloated -- REWRITE IT. I'm currently working on a "diet-coreutils". GNU's coreutils takes 7 meg on disk, currently. I'm shooting for similar features in 400K.

Why are we shipping distributions with assertions turned on? Why are we building with -O2 rather than -Os? (Small code == better cache usage == faster.) Why are we packaging and shipping garbage how-to-build files that the end user won't care about? (Disk bloat == more in the RPM database == slower installs.) Why are even the most trivial programs linked against the monstrous glibc (with over 6,000 symbols)? There is low-hanging fruit that distro packagers could easily grab here. Why is no one bothering?

I'm currently repackaging my otherwise-favorite distro to fix these sorts of things. I hope to embarrass it into improving...

Come on, people. We can do better than this.

OpenOffice and Mozilla vs. LyX and Firefox
by Andrew Ray on Mon 14th Jun 2004 19:32 UTC

I'm running on a 550 MHz, 384 MB RAM machine, and for the most part, things run with decent smoothness. I don't do KDE or GNOME, though. I've found that Firefox seems to run a bit smoother than Mozilla for web browsing. Also, since OpenOffice is really slow (admittedly, I'm using the pre-compiled binary), LyX does a really nice job for typing up reports and the like, and doesn't hog nearly as many resources as OO. I also run Gentoo, though. That said, Gentoo does give a step-by-step installation guide and isn't really so hard to get up and running if you just follow the directions (at least on x86 desktops; my Mac 6500 seemed a bit more annoying).

Whats the worry
by Amit on Tue 15th Jun 2004 15:18 UTC

Lately there has been a lot of concern that Linux is getting fat. On one hand, there are people complaining that new distros run slowly on "older" systems. On the other hand, there are people who feel that it is not Linux at fault, but rather users not configuring the system properly.

One thing that is very clear from the whole debate is that Linux has come of age. It can now deliver to users who want all the "cool" apps running on their system and don't give a damn how much RAM it takes, as long as it runs fast. It also caters to people who have "older" machines and want to make good use of them. All that is needed is a smart installer that can detect the configuration and advise the user what to install, and what not to, for good performance on a particular system. That way everybody can use their machines to the best effect.

After all, it is all about choice. Isn't it?

RE: I don't believe you...
by Anonymous on Wed 16th Jun 2004 07:12 UTC

I am running XP Pro on an Athlon 750 MHz with 384 MB of RAM. I have 30 IE windows currently open, Opera 7.21 running with 54 tabs open, calculator.exe running, Paint Shop Pro 4 running, a caller ID program running, a newsgroup program running, notepad.exe running, 3 Windows Explorer windows open, one file search window open... and you know what? I am not experiencing any slowdowns or lag in performance due to swapping; my system is still fast and responsive (I don't think Linux with a GUI can do that, at present). At least I'm not feeling it when I'm working in my apps or switching from one app to another. Even when I do switch to another app, and XP does have to access the swap to bring the program into memory, it is very minimal and not unbearable, unlike my experience with Linux. I bought a computer with Lindows pre-installed, an Athlon 1400+ CPU and 128 MB of RAM, and that machine ran slowly when I had AOL and 3 Netscape sessions running. The wait for disk swapping was unbearable. For example, when I switched from one Netscape window to another, I had to wait over 5 seconds (felt more like 30 seconds) for the disk swap, because when the swapping happened, it would freeze the whole computer. I couldn't stand the slowness and loaded XP Pro onto that machine, and now it runs fast, and I don't have to deal with the long waits for disk swapping I did in Lindows. Even with 128 MB, XP Pro ran fast on that machine. Everything I've reported here is true, and not sarcastic. I'm rooting for Linux, but there are still a lot of issues that need to be worked out, speed problems included.

As for the comments by someone else below, they must be living in another universe, because that's not what I'm experiencing with XP Pro.

I don't believe that for a moment. I run XP Pro on an Opteron with 512 MB. If I'm running more than one program, it can take as much as a minute just to flip windows between programs. From my experience, XP needs at least 1 GB of RAM to run comfortably with multiple programs.

I'm not talking monster programs, either. I'm talking about Firefox, Total Commander, and maybe something like Azureus. The disk thrashing with 512 MB is HORRENDOUS in XP Pro. By comparison, FC2 on the same machine is many times faster and more responsive.

The article is just FUD, and so are some of the responses. Let's hear a little truth for a change instead of blind astroturfing.

by ronaldo on Wed 16th Jun 2004 21:52 UTC

I have a Celeron 400 and 128 MB RAM.
I'm running WinXP Pro with eMule, Opera, Shareaza, 5x mIRC, Winamp, TV tuner software, iVisit, an FTP server (ArgoSoft), and a VNC viewer.
And what?
WinXP Pro runs great. Everything works nicely. I've had it running for weeks; I never stop it. But it's true that I optimized it and closed unneeded Windows services. (It only stops by itself when there's no electricity.)

And the people who tell you that you need 256 MB to run WinXP normally are Linux people who don't know anything about Windows XP.

by ronaldo on Wed 16th Jun 2004 21:56 UTC

I'm running the Kerio firewall and NOD32 antivirus.
And to the people who hate Microsoft/Windows: please stop this. Hate is not nice.
Love will win over hate one day!

Damn Small Linux
by Damn Small user on Thu 17th Jun 2004 16:51 UTC

Try Damn Small ( ); it is specifically made to be light and fast, using Fluxbox, Sylpheed, Ted, Dillo, etc.

by p0indext0r on Fri 18th Jun 2004 19:13 UTC

After reading just about all of these posts, I've come to realize that just about everyone is misusing the word Linux... but then again, I know it's being used in a general sense.

I feel the core of these problems lies within the X code... Linux without a GUI runs on just about ANYTHING. A terminal running Linux with no GUI doesn't take up much system RAM or CPU.

Fix the X server and optimize the GUI (GNOME, KDE, all the apps), and then look at the results.

I know we can just run Fluxbox or Xfce4 or some other minimalist WM, but even Xfce4 takes up 100+ MB while in use and not idle. But we also don't want to cut back on features just to get speed, as that defeats the whole purpose of comparing it to Windows.

Once again: fix the X server and optimize the GUI (GNOME, KDE, all the apps), and then look at the results.

Well, not on mandrake
by Chris on Sat 19th Jun 2004 03:37 UTC

I've been using Mandrake for a while now. Before my current machine (AMD Athlon XP 2500+, 256 MB RAM) I had an old Dell (Pentium 2, 400 MHz, 128 MB RAM). I was using Mandrake 9.0 on that old machine and it ran like a charm, much faster than XP, which I've never thought of as fast (Win 98 is WAY faster than XP, albeit much buggier). My XP boot takes 4-5 minutes while Mandrake takes 1-2, and there are no ads or security issues, blah, blah, *really good stuff about Linux here*. You get the picture. I frankly couldn't read the whole article, since the first page was enough. If it's a big issue, I think the user should simply buy more RAM (it's $40 for a 128 MB stick, and it will greatly increase your speed). I'm sorry if I'm ranting, but I just couldn't believe that he couldn't run programs on Fedora with 128 MB. The processor is not the problem, I think, and neither is the OS... RAM!

He's right! Linux is not what is was meant to be anymore.
by Pierre Oxenryd on Sat 19th Jun 2004 16:38 UTC

I am actually running XP these days, precisely because of what we've just read. I don't get half the performance under any of the newer Linux distros I've tried, and they're beginning to pile up... I have tried many distros, and no, Linux did not convert me. All the distros I've tried just gave me headaches and trouble, even though I learned very well how to handle them.
For me, Linux has become an OS only for enthusiasts. I, too, am still dreaming of the OS that Linux once claimed to be...

Just for Kicks
by leftcoastlogik on Sun 20th Jun 2004 00:31 UTC

Last fall I picked up an ancient IBM P90 with a 32 MB SIMM and a 500 MB HDD for $5 at a garage sale. It still had the OEM Windows 95 OS loaded on it!

For the next month I tried to install one GNU/Linux distro after another, including Vector, Mandrake, Slackware, Gentoo, and RH. All were set up to install with a GUI but otherwise only the bare essentials. No success.

Just for kicks, I tried installing Windows 98SE and guess what? It installed on the very first try. Eventually sold the old box on eBay for $35.00.

Moral of the story: side by side, Linux is not leaner and lighter than Windows. But it is more versatile and adaptable, in my opinion, and ultimately superior for those reasons.