Consider these memory requirements for Fedora Core 2, as specified by Red Hat: minimum for a graphical install, 192MB; recommended, 256MB. Does that ring any alarm bells with you? 192MB minimum? I’ve been running Linux for five years (and am a huge supporter), and have plenty of experience with Windows, Mac OS X and others. And those numbers are truly shocking. No other general-purpose OS in existence has such high requirements. Linux is getting very fat.
I appreciate that there are other distros; however, this is symptomatic of what’s happening to Linux in general. The other mainstream desktop distros are equally demanding (even if not quite as much as Fedora; Arch Linux or Slackware, for example, will run GNOME on 128MB, but not very comfortably once you load two or three apps at the same time), desktops and apps are bloating beyond control, and it’s starting to put Linux in a troublesome situation. Allow me to elaborate.
A worrying tale
Recently, a friend of mine expressed an interest in running Linux on his machine. Sick and tired of endless spyware and viruses, he wanted a way out — so I gave him a copy of Mandrake 10.0 Official. A couple of days later, he got back to me with the sad news I was prepared for: it’s just too slow. His box, a 600MHz system with 128MB of RAM, ran Windows XP happily, but with Mandrake it was considerably slower. Not only did it take longer to boot up, it crawled when running several major apps (Mozilla, OpenOffice.org and Evolution on top of KDE) and suffered more desktop glitches and bugs.
Sigh. What could I do? I knew from my own experience that XP with Office and IE is snappier and lighter on memory than GNOME/KDE with OOo and Moz/Firefox, so I couldn’t deny the problem. I couldn’t tell him to switch to Fluxbox, Dillo and AbiWord, as those apps wouldn’t provide him with what he needs. And I couldn’t tell him to grudgingly install Slackware, Debian or Gentoo; they may run a bit faster, but they’re not really suitable for newcomers.
Now, I’m not saying that modern desktop distros should work on a 286 with 1MB of RAM, or anything like that. I’m just being realistic — they should still run decently on hardware that’s a mere three years old, like my friend’s machine. If he has to buy more RAM, upgrade his CPU or even buy a whole new PC just to run desktop Linux adequately, how are we any better than Microsoft?
Gone are the days when we could advocate Linux as a fast and light OS that gives old machines a new boost. BeOS on an ancient box is still faster than Linux on the latest kit. And to me, this is very sad. We need REAL reasons to suggest Linux over Windows, and they’re slowly being eroded — bit by bit. Linux used to be massively more stable than Windows, but XP was a great improvement and meanwhile we have highly bug-ridden Mandrake and Fedora releases. XP also shortened boot time considerably, whereas with Linux it’s just getting longer and longer and longer…
Computers getting faster?
At this rate, Linux could soon face major challenges from the upcoming hobby/community OSes. There’s Syllable, OpenBeOS, SkyOS, ReactOS and MenuetOS — all of which are orders of magnitude lighter and faster than modern Linux distros, and make a fast machine actually feel FAST. Sure, they’re still in the early stages of development, but they already put the emphasis on performance and elegant design. More speed means more productivity.
To some people running 3GHz boxes with 1GB of RAM, this argument may not seem like an issue at present; however, things will change. A 200MHz box used to be more than adequate for a spiffy Linux desktop, and now it’s almost unusable (unless you’re willing to dump most apps and spend hours tweaking and hacking). Back then, we Linux users were drooling over the prospect of multi-GHz chips, expecting lightning-fast app startup and super-smooth running. But no; instead, we’re still waiting as the disk thrashes, windows stutter to redraw and boot times grow.
So when people talk about 10 GHz CPUs with so much hope and optimism, I cringe. We WON’T have the lightning-fast apps. We won’t have near-instant startup. We thought this would happen when chips hit 100 MHz, and 500 MHz, and 1 GHz, and 3 GHz, and Linux is just bloating itself out to fill it. You see, computers aren’t getting any faster. CPUs, hard drives and RAM may be improving, but the machines themselves are pretty much static. Why should a 1 GHz box with Fedora be so much slower than a 7 MHz Amiga? Sure, the PC does more – a lot more – but not over 1000 times more (taking into account RAM and HD power too). It doesn’t make you 1000 times more productive.
It’s a very sad state of affairs. Linux was supposed to be the liberating OS, disruptive technology that would change the playing field for computing. It was supposed to breathe new life into PCs and give third-world countries new opportunities. It was supposed to avoid the Microsoftian upgrade treadmill; instead, it’s rushing after Moore’s Law. Such a shame.
Denying ourselves a chance
But let’s think about some of the real-world implications of Linux’s bloat. Around the world in thousands of companies are millions upon millions of Win98 and WinNT4 systems. These boxes are being prepared for retirement as Microsoft ends the lifespan for the OSes, and this should be a wonderful opportunity for Linux. Imagine if Linux vendors and advocates could go into businesses and say: “Don’t throw out those Win98 and NT4 boxes, and don’t spend vast amounts of money on Win2k/XP. Put Linux on instead and save time and money!”.
But that opportunity has been destroyed. The average Win98 and NT4 box has 32 or 64M of RAM and CPUs in the range of 300 – 500 MHz — in other words, entirely unsuitable for modern desktop Linux distros. This gigantic market, so full of potential to spread Linux adoption and curb the Microsoft monopoly, has been eliminated by the massive bloat.
This should really get people thinking: a huge market we can’t enter.
The possibility of stressing Linux’s price benefits, stability and security, all gone. Instead, businesses are now forced to buy new boxes if they are even considering Linux, and if you’re splashing out that much you may as well stick with what you know OS-wise. Companies would LOVE to maintain their current hardware investment with a secure, supported OS, but that possibility has been ruined.
Impractical solutions
Now, at this point many of you will be saying “but there are alternatives”. And yes, you’re right to say that, and yes, there are. But two difficulties remain: firstly, why should we have to hack init scripts, change WMs to something minimal, and throw out our most featureful apps? Why should newcomers have to go through this trouble just to get an OS that gives them some real performance boost over Windows?
Sure, you can just about get by with IceWM, Dillo, AbiWord, Sylpheed et al. But let’s face it, they don’t rival Windows software in the way that GNOME/KDE, Moz/Konq, OpenOffice.org and Evolution do. It’s hard to get newcomers using Linux with those limited and basic tools; new Linux converts need powerful software that matches up to Windows. Linux novices will get the idea that serious apps which rival Windows software are far too bloated to use effectively.
Secondly, why should users have to install Slackware, Debian or Gentoo just to get adequate speed? Those distros are primarily targeted at experienced users — the kind of people who know how to tweak for performance anyway. The distros geared towards newcomers don’t pay any attention to speed, and it’s giving a lot of people a very bad impression. Spend an hour or two browsing first-timer Linux forums on the Net; you’ll be dismayed by the number of posts asking why it takes so long to boot, why it’s slower to run, why it’s always swapping. Especially when they’ve been told that Linux is better than Windows.
So telling newcomers to ditch their powerful apps, move to spartan desktops, install tougher distros and hack startup scripts isn’t the cure. In fact, it proves just how bad the problem is getting.
Conclusion
So what can be done? We need to put a serious emphasis on elegant design, careful coding and making the most of RAM, not throwing in hurried features just because we can. Open source coders need to appreciate that not everyone has 3 GHz boxes with 1G RAM — and that the few who do want to get their money’s worth from their hardware investment. Typically, open source hackers, being interested in tech, have very powerful boxes; as a result, they never experience their apps running on moderate systems.
This has been particularly noticeable in GNOME development. On my box, extracting a large tar file under GNOME-Terminal is a disaster — and reaffirms the problem. When extracting, GNOME-Terminal uses around 70% of the CPU just to draw the text, leaving only 30% for the extraction itself. That’s pitifully poor. Metacity is hellishly slow over networked X, and, curiously, these two offending apps were both written by the same guy (Havoc Pennington). He may have a talent for writing a lot of code quickly, but it’s not good code. We need programmers who appreciate performance, elegant design and low overheads.
We need to understand that there are millions and millions of PCs out there which could (and should) be running Linux, but can’t because of the obscene memory requirements. We need to admit that many home users are being turned away because it offers no performance boost over XP and its apps, and in most cases is even worse.
We’re digging a big hole here — a hole from which there may be no easy escape. Linux needs as many tangible benefits over Windows as possible, and we’re losing them.
Losing performance, losing stability, losing things to advocate.
I look forward to reading your comments.
About the author
Bob Marr is a sysadmin and tech writer, and has used Linux for five years. Currently, his favorite distribution is Arch Linux.
If you would like to see your thoughts or experiences with technology published, please consider writing an article for OSNews.
Yelp is so much faster in Gnome 2.6. ShaunM must be a god!
The games I’ve played so far which have both a Windows and a Linux port are Quake3, UT, UT2k3, Heroes 3 and Tribes2.
To be honest, especially in UT2k3 and Tribes2 I get not only better framerates, but also when I’m playing over a network (T2 singleplayer is a joke anyways), usually I have a much more stable and smooth game, connection wise. With Q3 and UT, the framerates in Win and Linux have been pretty much equal.
I wouldn’t go so far as to say that Linux ports are generally better, but saying that they are worse is bollox.
I’m hoping for more native Linux ports in the future, but only Doom3 and HL2 seem to go that way :/
The beauty of Linux is that there are many ways to improve the speed. The default install of any distro will install a generic x86 kernel. If the author really wanted to help his friend, he should have helped him learn how to recompile the kernel for his specific platform. I have an old AMD K6-2 running Debian unstable with KDE. Ever since I recompiled the kernel for that specific processor, it has performed perfectly.
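In case anyone wants to try this, here is roughly what a processor-specific rebuild looked like on a 2.6-era system. The version number below is only a placeholder and the exact steps vary by distro, so treat this as a sketch rather than a recipe:

# unpack a kernel source tree (the version here is just an example)
cd /usr/src
tar xjf linux-2.6.5.tar.bz2
cd linux-2.6.5

# pick the exact processor family (e.g. K6/K6-II/K6-III instead of generic 386/586)
# and drop drivers you will never use
make menuconfig

# build and install (2.6 style; 2.4 needed the old make dep / bzImage / modules dance)
make && make modules_install && make install

# make sure the bootloader entry points at the new kernel, then reboot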
When I first started using Linux, around 1998 with Red Hat 5.2, its system requirements were way lower than those of Windows NT or 9x. It was very comfortable on my Pentium 133MHz with 48 megs of RAM, while either Windows was much less so. What the hell happened? GTK? Gnome and KDE going feature-crazy at the expense of RAM? It’s nuts…
Why don’t you send Linux to /dev/null and install FreeBSD instead? Stop crying about Linux getting fat. If you don’t like the way things are going, switch to something else and shut up instead of wasting time writing about how Linux got fat.
I used to use Mandrake 9.1 on my old Pentium III 450MHz with 64MB RAM. It FLEW on it. It even supported 3D acceleration and I had a lot of fun with it. Here are some hints (a couple of them are sketched as commands after the list):
Disable any unnecessary services.
Don’t install both KDE and GNOME. Choose one.
Make sure you enable DMA on your drive.
Don’t use Nautilus on GNOME versions before 2.6, it sucked in speed before then.
Get OpenOffice.org 1.1.1; the 1.0 version was extremely slow.
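For what it’s worth, a couple of those hints translate directly into commands. This is only a rough sketch: the service name (cups) and the device (/dev/hda) are examples, and the chkconfig/service tools are the Mandrake/Red Hat style ones.

# list services started at boot and switch off the ones you don't need
chkconfig --list
chkconfig cups off        # example: no printer attached
service cups stop

# check whether DMA is on for the first IDE drive, then enable it (as root)
hdparm -d /dev/hda
hdparm -d1 /dev/hda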
I now have an Athlon XP 2000 with 768MB RAM. It absolutely flies. It even outperforms my brother’s 3GHz PC running XP.
I still run SuSE 9.0 on my Duron 800MHz laptop with 128MB. A hell of a lot faster than Windows XP on it, plus it fits nicely in a 6GB partition that was somehow “conveniently” left empty by SONY.
P.S. Look at the requirements for the longhorn betas, you will cry!
Who says you need 256MB or more to run Linux? Unlike Windows XP or Windows 2000, you can remove parts of Linux and install whatever you want.
Recently I built up both RH 9 and FC1 with bare minimum settings on a Pentium III 850MHZ machine with 384MB RAM and on an Athlon 2000XP+ with 256MB ram. Stripped it down to the essentials, upgraded the core components, then installed only what I needed or compiled it from source code.
I do agree that Linux is getting more and more disk intensive but you can still control it.
My mother needed a machine to check email, chat with friends, and read her old Corel WordPerfect files. I bought her a “new” P2 350MHz with 128MB RAM, 4GB HD, and a 4MB Rage Pro card for $10. She already had a 15in monitor from her old computer. After reformatting the hard drive I installed XP and the latest WordPerfect, threw some extra tweaks in, and it runs great! I mean, I was surprised. Now I could never have done that with a Linux distro and expect her to know how to do anything. I wonder why everybody else is having all this “disk thrashing” they are talking about with old systems and XP.
>Don’t use Nautilus on GNOME versions before 2.6, it sucked in speed before then.
This is again a good point. pre-2.6 Nautilus did MIME sniffing instead of looking at file extensions. I am sure you could turn off MIME sniffing in pre-2.6 Nautilus though… But it was not the default.
This article is speaking from the newcomer’s point of view. When you look at Windows XP, are you thinking of Internet Explorer? Likewise with Linux: a newcomer’s point of view is the “total package”, even when it’s clearly not. To adopt GNOME/KDE, you must adopt Linux in some way, shape or form. So in actuality, Linux IS getting FAT. I love the bells and whistles of KDE and Gnome, but let’s face it, those WMs are pure hogs. And for Linux to make it in the desktop realm, there really has to be work done on X and the big two WMs. Or… have an indie come out with a brand new WM that embraces all of those features and then some, but without being such a pig. Or just break out de ole serial port and have some good console luvin.
This article was well written, well paced, relatively typo free and balanced. I am greatly pleased by the article in both form and content. I myself have felt that Linux was slower than it should be (as is Mozilla) and I fully agree with and support every thing this author said. I notice that there are over 200 comments posted already so I assume there is an amazing amount of religious argument going on. I will have to check out the comments when I have time (I hope it isn’t a waste of time – I sometimes find myself wondering why I bother).
I hope a great number of developers seriously consider this article’s points because they are valid. Not just valid: right on target.
I bet this will be the first OSNews story ever to reach 1,000 comments.
…that all those swearing by Fedora Core have 256 MB of RAM? This makes me feel out of the loop. The older Linux distros work fine for me on the desktop (that’s what FC is aiming at, apparently). There are also tons of drivers for these older distros. I see no reason for FC to be taken seriously for a corporate desktop – it placed itself out of the market by such steep requirements.
This made me so frustrated, because Linux != OS. Linux is the kernel, like kernel32.dll is to Windows.
The FC2 *Distribution* is slow, yes, I agree. Especially on older hardware. But that’s because it loads GNOME and a ton of other things in the background. So I wrote this short counter-article:
http://www.livejournal.com/users/punkwalrus/13900.html
I know many of you have upgraded your hardware to 128MB or 256MB and at least a 500MHz CPU, but wait, let’s review the points this article makes. The author’s point is that there are still tons of 200MHz boxes with 64MB sitting at home that cannot be upgraded any further, because it isn’t worth upgrading them; so what OS is the alternative, when Windows doesn’t run efficiently on those boxes either, even Win98 with modern apps? How can you be satisfied running only 4 apps with 128MB and an OS with nothing else included? When did this RAM-hog revolution start? 4 apps and 128MB? The problem exists even on Linux, and I am not saying Linux is bad; someone else in the posts said it’s the apps that are the problem, not Linux itself, and IMO he’s so right, because I was running Debian with a 2.4.22 kernel and KDE 3.0 on a 600MHz Duron with 128MB. Without KDE, running IceWM and something else, it’s lightning fast; with KDE 2.1 it is still quite fast. By fast I mean I run KDE 2.1 with xmms, amsn, gaim, Mozilla 1.5, OpenOffice and 3 or 4 Kate windows for debugging my code. Seriously, if I had a 1GHz CPU with 128MB RAM, I would expect to run at most 4 apps smoothly in XP Pro, but at least 10 apps in Linux without problems.
Someone pointed it out already, but I would like to mention it again. After I installed Debian, it was booting fast enough; it doesn’t start anything that I don’t really need. But SuSE 9 and Mandrake 9.2 load tons of background services that I don’t really need, and they boot very, very slowly compared to my Win2k box. This is ridiculous. I really appreciate the performance of Linux and I wish I could contribute to improving it. Wait for me guys, but it may take more than 50 years… haha
Featurewise, the Linux desktop (particularly under KDE) does more than the Windows desktop. I’m thinking of io-slaves, advanced theming, better security, and greater scope for user-scripting (DCOP/DBus). This takes more clock-cycles and more memory.
Since 2.2 KDE has been using less and less resources. This is from developers, who know a lot more than how to run “top”. Qt4 will be speedier still. I don’t know about Gnome, so I won’t waste time with uninformed conjecture.
X isn’t actually as big and bad as many people like to think, particularly with regard to memory, and once the X.org clean-up gathers momentum, it’s going to be even less of an issue. However a lot of toolkits don’t use X in an optimal way (there was an article on OSNews about this a while ago).
I remember using KDE 0.99 on SuSE 5.2 in college back in ’98. The only decent browser for Linux was Netscape 4.7 (or possibly it was 4.65 back then) and it took up about 30MB. So don’t get started on Firefox’s usage, particularly given that it’s a dramatically better browser.
The fact is people wanted more features from the Linux desktop, and the devs delivered. It quite probably takes up more resources than strictly necessary, but not significantly so in my opinion. Windows XP may not be as much of a resource hog, but frankly the bits and pieces we hear about the Windows OS family don’t give me a lot of faith in the development process that led to that performance.
Barring the lack of configuration tools, with a bit of tweaking (e.g. creating a KDE desktop shortcut to “system:/” called “My Computer”), Linux offers an extraordinarily nice desktop experience which betters XP’s in several respects. Things such as multiple desktops, easy mime-type editing, tabbed browsing, spellchecking in web-forms, popup blocking, secure worry free email, document previews in file-managers, printing to PDF/fax/email, easy inline compression and encryption and the IO-Slave (or VFS) architecture are all things that add value to the user.
As for the need to cut down, the devs already know this, and are working on it. In the last year, KDE has reduced its resource usage, a ton of work has been done on desktop responsiveness in the kernel, and Qt has announced performance enhancements for Qt4, not to mention lots more. If the article were a comment it would be marked as flame-bait. Instead we get a sub-Slashdot run of whinging and trolling.
Wait a second, since when is one distribution considered to be all of Linux? Isn’t Fedora the distro being reviewed here? Why in the world would someone make an all encompassing statement like that?
Why don’t you take a look at Gentoo or Debian or Slackware. I personally use Gentoo and I know that it runs circles around Fedora in terms of speed.
So please, let’s remember that Linux is still nothing but a kernel. If the Fedora folks decide to fill it with bloated software, then let’s acknowledge that instead of saying that Linux is getting fat!
Looking at the upcoming Debian Sarge on the mailing lists, the following requirements are needed:
Processor: 2GHz
RAM: 512MB
Hard disk: 5GB minimum install, 25GB full install
Comes on either 13 CDs (all mandatory), plus 30 “Extra” cds or 7 DVDs. They even plan to sell 250Gb hard drives with it pre-installed to lighten the load.
I also cringe at the upcoming GNOME 3 that will be due in 2005. This is their answer to Longhorn. It will need at least 768MB (1.25GB to run comfortably, and up to 3GB to run at full potential). It also makes 64-bit processors of at least 2500MHz mandatory. That’s only currently available on highly overclocked Opterons or the new G5!
For Debian and GNOME. It is going to be huge! I hope someone knocks some sense into those idiots.
I’m here to tell you my brother’s FC2 install on an 1800+ and 512MB RAM with some cheap mobo and a slow hard drive is faster than my Gentoo install on a 2200+ with a nice hard drive and expensive mobo… this is because I can’t prelink Gentoo correctly, I believe… but his is still faster.
It’s all so ignorant, this blaming of “Linux” or “X”… It has nothing to do with either. Run “top” and see what is using the memory. KDE and/or GNOME are the “problem”. Sort by memory usage with a capital M in top to see this.
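If you want the same view non-interactively, something like the following should do it; the sorting option is the procps one, so adjust if your ps behaves differently.

# interactive: run top, then press Shift+M to sort by resident memory
top

# one-shot: list the biggest resident-memory consumers
ps aux --sort=-rss | head -15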
QT and KDE with KDE/QT 4 are getting between 15 and 25 percent smaller memory footprints, so if you are too stingy to upgrade and you want to use KDE, then just wait. Otherwise, use Windowmaker, Fluxbox, IceWM, Blackbox, etc. etc. because Gnome/KDE are not shrinking anytime soon.
Besides, this “Special Contributor” is just plain wrong.
1. Memory is cheap. It might be slightly more than at this time last year, but go back 5 years to 1999, and consider the difference. I honestly don’t understand why anyone runs any OS on less than 512 MB of Ram. 512 is the sweet spot wherein your PC starts happily pushing and pulling bits out of memory, instead of having to spin the HD up to hit virtual memory all the time. A gig or higher is cooler yet for us power users (My x86 box is at 1GB, and as soon as I have the $$, I’ll be at 3GB, My Mac will be up to 2.5GB here ASAP as well).
2. Most modern programmers are lazy, and have never learned, nor had to learn due to the nature of modern hardware, how to wring the last few cycles out of a CPU.
Back in the days of the Amiga (no, not this swan song called OS4, but the real Amiga), and in the early x86 days, it was every programmer’s duty to get as much as possible out of the hardware. This meant that they had to know the hardware as well as the software. This isn’t true anymore (unfortunately).
Today’s programmers learn to program using established IDEs, and they do so using modern, “make it easy” type languages. Most programmers couldn’t handle assembly if they needed to, nor do they know the ins and outs of the CPU and associated hardware.
Thus code is optimized as much as their compiler knows how to do so. They don’t go “outside the box”, so to speak.
To be fair, the advancement of PC technology has been so brisk these last 10 years that it’d be hard for a lot of people to keep up with the changes. The commoditization of today’s programmers is largely a result of taking all these advancements and leaving it up to a select few to understand them. The rest will simply use the development tools that these select few code for the world.
The long and short of this is that today’s code expects to be run on a modern PC, loaded with a fair amount of RAM. I’ve always argued that 128MB is too little for anything over Win98, and people have always been amazed at the increases in speed, as well as the overall increase in stability, that 512MB or more brings to the table. Anything less is simply crippling the performance of your PC.
3. Linux is not an island. All of the above also applies to Linux. Sure, Linux enjoys a higher percentage of “geeks” who do know the hardware well, but the majority of your applications are being put together by those same “cookie cutter” programmers. The fact that they’ve chosen Linux as their platform of choice is good, but along with the genericism of the programmer comes bloat and higher PC requirements.
I agree that there are many instances where a program or OS is simply too bloated, even when considering the above, but complaining about needing 512MB to run a modern OS in today’s world is just being unrealistic.
Welcome to the 21st century, friend! Buy a half gig of memory for your friend’s 600MHz box for about $50-60, and you’ll be amazed at the difference, instead of complaining about it!
It’s a shame that so many of you think this article is a troll; some very good points are raised. The fact of the matter is that modern Linux desktop environments are slow as hell. I love Gnome, but I wouldn’t even consider running it on anything less than a P4 with 256 megs of RAM.
However, there is hope. The beauty of Linux is choice. Fluxbox and XFCE4 do a great job of doing a lot with few resources.
There are three things that can happen now.
Outcome 1: KDE and Gnome can keep adding more features and bloat with little to no consideration of performance. Mono or maybe Java will be integrated into these DEs to make rapid development possible, with a tremendous performance hit. People who don’t need all the fluff will switch to a distribution which offers an alternative desktop environment as the default. Gnome and KDE will lose popularity.
Outcome 2: KDE and Gnome will reach a “feature plateau” where it is comparable with Longhorn and then buckle down and do some serious optimizing. With Novell backing Gnome and trying to replace Windows in a corporate setting, this is looking more and more likely.
Outcome 3: People get fed up with this “Linux thing” and BSD becomes the trendy OS du jour. FreeBSD is looking very tempting right now for my server.
Although you might not be able to tell from this post, I love Linux. My 1 GHz laptop with 384 megs of RAM runs Gentoo and Gnome 2.6 great with all the bells and whistles. My server, an old 400 MHz PII with 192 megs of RAM runs Gentoo with Fluxbox adequately.
Then “don’t do that” (load the latest and greatest distro). _Fall back_ a few versions – damn – even RH7.3 is not a disaster…can’t be worse than MS security-wise.
I agree, technically, that the well-known GNU/Linux _distros_ (I consider the kernel a separate component of the whole system) are cramming more and more software into their offerings, and the leading desktop _environments_ (GNOME, KDE) ‘seem’ slow to me no matter what hardware I run them on. I can say, after many years of using and admin’ing PCs and networks, that _speed does count_. Users appreciate _far more_ that things work fast and are “usable” (which includes app response times) than whether the machine is i486, i586, P4, etc., or even what version (OS, Office) of this-or-that is running.
Keep in mind that “slow” and “fast” are very subjective experiences to the user; only hard benchmarking can settle it (would we all agree that apps that start up with subsecond responses are “fast”? What, exactly, is your “fast”?). I’ve worked in terminals for years (VAXen, HP3000, *NIX, IBM360, etc.); to me, ANYTHING that responds in more than 3 seconds is “slow”, but that’s my _subjective_ (collective) experience. I will not wait more than two seconds for an app to respond. I use the GNU/Linux distro/WM combo that gets me the _response_ times I want. I will tweak sometimes, but try to avoid it.
I’ve tended to stick with Slackware as it seems easy to install and is flexible. Swaret for updates works well. The “all packages” installation for Slack only recently moved onto a second CD (for GNOME and KDE). To me it’s still the best “tightest, all around, runs on anything” distro. There are other good ones though: Vector, College, Arch, DamnSmall, etc. Then, there’s always the BSD ‘family’ of OSs…
My current main machine: Dell Latitude Xpi, 133MHz, 40MB RAM, Slackware 9.1, Fluxbox _and_ wireless networking (!) via a Linksys (orinoco) PCMCIA card. Xterms load in 1 second (as many as I can fire off), mutt loads in 1 second, gvim starts up in two seconds. I’ve done NO tweaking to this unit; it was ‘install and go’. I web browse with Lynx (starts in a second). Mozilla and XMMS actually WILL run on here once started, but overall they don’t respond to my liking (but again, that’s me). I can’t get DOS or W98 to respond like this on this laptop, or the apps aren’t there, so for me GNU/Linux is truly a “step up” and is using this machine to its fullest.
My take is that we _stop worrying_ about _converting_ desktop users from Windows to Linux – do you think for a minute that Linus cares or is worried about it ? – go after the other _billion(s)_ (of) people who don’t have a machine at all. You don’t switch a friend or relative to free software OS’s to prove what IT ‘chops’ you have, you do it because (and ONLY when !) it’s easier, faster, less expensive, etc. (i.e. the “right” reasons). If GNU/Linux is not a _step up_ from MS software then there’s no reason to move to it – isn’t that, ultimately, what you’re getting at ?
“Why don’t you take a look at Gentoo or Debian or Slackware. I personally use Gentoo and I know that it runs circles around Fedora in terms of speed.”
If you actually read the article, you’ll see the writer discuss that. Gentoo, Debian or Slackware are NOT suitable for newcomers. We may be able to use and tweak them, but newcomers are going to get a bad impression from the “friendly” and ultra-bloated distros.
“Then “don’t do that” (load the latest and greatest distro). _Fall back_ a few versions – damn – even RH7.3 is not a disaster…can’t be worse than MS security-wise.”
It is worse though. Red Hat 7.3 is unsupported (the as-yet unproven community legacy projects aside). Meanwhile, Win2k is still supported.
You can buy an OS from Microsoft that only requires 64MB: Win2k. It’ll be supported for a while yet. With the friendly desktop distros, you can’t do that; you install SUSE/Mandrake/Fedora and it requires more resources, and won’t be supported as long.
This is a problem facing Linux
I don’t understand why someone expects a distribution released in 2004 to work with hardware from 1999.
Run an OS from 1999 and you will be fine. Stupid “article”.
It’s been a long time since I started using *nix for my entertainment, and it’s been a while since we’ve read a good piece of writing like this. And it’s almost all true.
Thanks for the warning about the obesity of Linux code. Keep the core of programs as classic as possible.
“but complaining about needing 512MB to run a modern OS in today’s world is just being unrealistic.”
No it isn’t! You think everyone in every country can just buy sheds of RAM like that? You think laptop users aren’t restricted in what RAM they can add? You think businesses want to keep buying more RAM for 100,000 boxes to run this Linux thing, which we’ve advocated as better than Microsoft?
The author made a good point: there are millions of 32 and 64MB boxes in companies around the world. Linux should be providing them with an opportunity! But they can’t run KDE/GNOME/OpenOffice/Moz because these apps are so bloated. A market for Linux lost.
“Buy a half gig of memory for your friend’s 600MHz box for about $50-60, and you’ll be amazed at the difference!”
Why should I have to? Why can’t programmers actually THINK about performance and elegant design? Is that too much to ask? This is the point many people on this thread are making. Chucking more and more RAM at a problem doesn’t make it go away.
“I don’t understand why someone expects a distribution released in 2004 to work with hardware from 1999.”
Why is that so much to ask? Videos from 2004 work on 1999 video players. Music CDs from 2004 work on 1999 CD players. Fuel from 2004 works in 1999 cars. It goes on and on and on.
Why SHOULDN’T software work on slightly older machines? Yeah, there’s no need to make it cater for old 386 boxes, but telling people to upgrade every five years is terrible. And it’s bad for Linux against Microsoft. Win2k works fine on 1999 machines, and it’s STILL SUPPORTED. All the Linux distros made at that time are NOT supported now.
As has been said, this bloat is causing slow takeup in the corporate space. Now companies are forced to upgrade 100,000+ machines to run Linux, just like they have to with Microsoft. That’s APPALLING.
Troll? What an apt name for you. Debian Sarge minimum install 5GB? Please, don’t write such a blatant lie. I have GNOME, KDE, XFce, Mozilla, OpenOffice.org fully installed and it is under 2GB here.
Yes, Sarge will be 13 CDs. But only the first CD will be mandatory. Debian has always been installable with a single CD.
True, Linux is a kernel and not an OS. But talking about distros, Red Hat/Fedora, SuSE and Mandrake are all getting bloated. I had a P3 866 machine with 128MB RAM, and Red Hat 8 was slow on it and 9 was almost unusable (after turning off all the crap). So was Mandrake. BSD was fast only without a DE. I currently use VMware and run Linux on it. But to my horror, most distros are slow to run on VMware as well, no matter how much memory I give them. The only option seems to be running Debian/Slackware on it. Unfortunately, I have not been able to install Debian on it. Can anyone point me to how to install Debian on VMware?
is just the insane memory requirements. 2.6 is even worse than all before. :/
>>BeOS wasn’t cool for no good reason. BeOS can make 6 year old hardware feel fast. <<
Not my experience, not by a long shot. I still have my boxed BeOS 5.0 Pro edition. Not only did I find that it wasn’t any faster than Windows, the setup was absolutely awful – primitive even for its day.
It’s been a long time, but as I remember, everything was centered around floppy disks. I had to make all of these floppy disks, and I had to reboot constantly. Also, I think I needed Windows just to install BeOS.
BeOS didn’t feel any faster than Windows 95 or Windows NT 4.0 to me. When I read posts raving about BeOS, I just have to scratch my head.
I guess I won’t be getting involved in this discussion. Is this the most comments ever on an OSNews story?
Maybe you did not have accelerated drivers for your graphics card? That makes a hell of a difference.
*cough*AOL*cough* 😉
In my experience, this is very snappy, and takes way less than 100mb of hdd space to install. Add ms-office-97, and you have a very snappy system, that can do about 100% of what most office workers need to do.
Still the Linux zealots carry on about how Linux can help you leverage your old hardware.
Since when did memory get cheap?
Did I miss something?
Sorry, but this is my first ATI offering. It’s fast on linux too – none of those crappy open source drivers. I’ve been using Nvidia since the Riva 128 days. Riva, TNT-1,2, Geforce3. The quake series were always faster on windows.
I’ve been saying that for the past 2 years: Linux IS ACTUALLY very fat! At least as an average user might see it. Your article is objective in addressing this point! Congrats!
As someone pointed out already, it is true that Linux is only the kernel, but that does not mean a thing when it comes to normal users. Actually, it means nothing even to advanced users, for we can do nothing with ONLY the kernel!
When I started with Linux, I ran the more friendly distros, like Mandrake, Conectiva or Red Hat. Yes, they were extremely slow and bloated! I kept asking myself why in hell this OS would come with so many text editors!!!! In Mandrake I could count 9 of them just out of the box!!!
Those kinds of problems made me move to Debian, which is so much lighter, I wouldn’t know why. I got used to hacking into scripts and reading logfiles. So I have stuck with Debian since then and my desktop is 100% Debian. But I must say the problem is strong enough that even Debian is slow on low-end machines! And don’t expect average people to hack stuff or want to learn this nerdy stuff. They just want their OO.org to open in about 2 secs, not the 5 min I have already seen!
It doesn’t matter how fast new hardware is coming out, or what its specs are. It doesn’t matter how cheap 256MB of RAM is.
I’ve got a stack of PIIs here at work after our recent rollout, and it’s a pain to install recent distros on them. I want to show my boss the value of older hardware and Linux, and spending a week tweaking out a distro or buying hardware upgrades, no matter how cheap, ain’t going to do it.
The most absurd thing is the installers. I can run SuSE 9.0 on some of these (with fvwm or something), but the fancy-dancy installer doesn’t like less than 128MB of RAM. That’s just silly.
Outcome 3: People get fed up with this “Linux thing” and BSD becomes the trendy OS du jour. FreeBSD is looking very tempting right now for my server.
Switching to FreeBSD won’t make much difference if you are after a workstation. Your choice of window manager/desktop environment will dictate how responsive the computer feels on old hardware.
I can relate to this article. I have recently installed Fedora Core 2 + gnome and kde on a vaio notebook with 128MB of RAM. It wasn’t pretty. I was hitting swap very often. Windows XP was more responsive on this configuration.
I have since got rid of FC2 and installed FreeBSD + windowmaker on the notebook. It feels much faster now, mostly because I am now using a lighter window manager.
Well, I can relate to that myself. I had a quite good little system set up on NT 4 and Office–150 MHz Cyrix with 32MB RAM IIRC. It was quite usable. But the only place you can get NT from these days is “some guy” or eMule… Actually I just did a Google and found some places actually sell it, but £92 seems a bit steep for a computer worth half that.
Outcome 3: People get fed up with this “Linux thing” and BSD becomes the trendy OS du jour. FreeBSD is looking very tempting right now for my server.
What a stupid plug. You were complaining that KDE and GNOME are too bloated. They will be just as bloated if you run them on top of BSD instead of GNU/Linux. And what the heck does your server have to do with this discussion?
Yes, Linux (particularly the windowing systems) is now more hungry than Windows. I’m glad it’s being recognised at last.
I have an old Pentium 100/128Mb which runs Windows 2000 and XP – slow, but usable for my kids. I’ve tried both SUSE 8/KDE3 and Redhat 8/GNOME and both are unusable.
“but complaining about needing 512MB to run a modern OS in today’s world is just being unrealistic.”
No it isn’t! You think everyone in every country can just buy sheds of RAM like that?
In every country, no. But I still feel it’s unrealistic to expect a modern program, on a modern computer, to provide optimal performance with less. If you have to run on a low-end PC with less RAM, then use older software that’s optimized for that architecture!
I tried to explain why in my first post: today’s programmers simply are not focused on optimizing code for older PCs. They are not taught this, and with new PCs costing $300-$400, there’s not much reason for them to locate an old, underpowered box just to optimize their code for. Sorry, but those are the facts. I’m not saying this to upset you, just stating the way that it is.
If you must run old hardware, then use more forgiving software. People printed books and magazines, solved problems, and generally lived their lives around such software as Win98, Photoshop 5, PageMaker 5, and so on. You can’t expect the newest software to run on the oldest hardware, so why not just face it and buy software that is optimized for your system? If you need to do more than this type of software will allow, you need to buy a new PC or upgrade your current one. Programmers around the world aren’t going to change their methodology just because some of us refuse, or aren’t able, to upgrade. Complaining about it won’t change that fact, so you must either adapt or change.
Or learn programming yourself and show us all how it’s done!
It’s amazing how many people will complain but won’t step up and try to solve the problems for themselves.
The author made a good point: there are millions of 32 and 64MB boxes in companies around the world. Linux should be providing them with an opportunity! But they can’t run KDE/GNOME/OpenOffice/Moz because these apps are so bloated. A market for Linux lost.
We disagree with this point: there is just no way that you are going to get an X display, in addition to a GUI as robust as KDE or Gnome, as well as a modern app such as Mozilla or OO, to run in 64MB or less of memory. Forget it… it’s just not do-able.
I agree that a whole market of hardware may go unsupported, but if you’re running 32 or 64MB of RAM, you just have to face the facts. If you go to any major tech school and say “I need you to start teaching all of your students how to program so that their software runs on this 200MHz PII with 32MB of RAM”, you’re going to get laughed out of the place!
I understand your point, but the world doesn’t stop advancing just because some of us can’t or choose not to advance with it. That’s just being unrealistic.
Look at cars: they don’t stop developing new technologies and parts just because they can’t be retro-fitted to my old 1978 Olds Cutlass. They expect me to upgrade to a newer car if I want such modern features as air bags, anti-skid brakes, an on-board computer and such.
I either live with the old Cutlass and its shortcomings, repairing what breaks as it breaks, or I bite the bullet and buy a newer car.
The same goes for old PCs. There’s a wealth of older, albeit unsupported, software out there which was used by millions of people around the world when it was considered modern. Learn to use it, or upgrade so that you can take advantage of the newer technologies that are now available.
Why should I have to? Why can’t programmers actually THINK about performance and elegant design? Is that too much to ask? This is the point many people on this thread are making. Chucking more and more RAM at a problem doesn’t make it go away.
You’re right; throwing more RAM at it doesn’t make it go away.
But doing so is the cheapest and best option you have if you refuse to upgrade.
Or see my programming item above. Learn how to program modern applications on older hardware, and perhaps you’ll make a small fortune.
My guess though is that those who aren’t willing to spend $50-$60 on a major ram upgrade also aren’t willing to pay you enough for your optimized software to make it worth your while though. And that’s yet another reason why programmers are not focused on older, out-of-date systems: There’s not a lot of money to be made by doing so, when compared to selling to those who can afford that $50 memory upgrade, or that $400 computer.
This whole debate is like a big c*ck waving contest
“oh yeah! I run Slack on a 286 with 4 megs of ram”
Geez…
Linux always felt a little slow to me, but I’m running SuSE 9.1 on a 450 with 256 and it seems acceptable compared to my Windows 2003 server on a 600 with 300+ megs of ram.
Windows on pre-1999 hardware looked crap, but you expect the same hardware to run either KDE or GNOME with all the bells and whistles?
IceWM is what you should use on hardware like this, simple as that… it will look like Windows, but will run faster…
Also, cut out services you do not need to run.
Sort yourselves out!
NEWS FLASH: has X plus *nix apps (feature set vs feature set) ever used less RAM than the Windows, BeOS or Apple DEs?
Users are guilty of bloated software because they are too lazy to learn the command line and they like cosmetic features that are not needed.
Example? I can use mv, cp or even mc to copy and transfer files in Linux, but “joe” users prefer to use Konqueror or Nautilus, which are bloated because they are also web browsers, preview files and images, etc. This is nice but it is not necessary. The same occurred when Internet Explorer was merged into Windows. Windows 98 and subsequent versions of Windows are much more bloated.
Stop with this insanity of trying to make multi-use applications. A file manager doesn’t need to also be a browser, and a browser doesn’t need to be an email client. I say the same to Windows developers!
My Win XP Pro needs 40MB of RAM after booting.
With mIRC and IE it gets near the 64MB mark.
FC2 would be unusable on the laptop I have, because with 192MB it is at its limit. And XP runs like a charm on this 600MHz/192MB PC.
So, seems (almost) everyone sees some kind of problem. Out of the box, on all but the very most recent and expensive hardware, it seems that no ‘noob’ is going to get a happy Linux Desktop experience.
What is the solution?
How to motivate people who donate their hobby efforts for free to keep an eye upon being careful with resources as well as competing with Windows and OSX apps for featuresets?
General awareness and the praise of those who write neat stuff efficiently might help?
Someone somewhere famously described perfection as ‘when there is nothing left to take out’. How to arrange for developers to ‘take out’ and improve new stuff instead of add new stuff? Awards for those who do to be an example for others to follow?
I’m a software developer and I’ve been using computers as far back as the Apple ][ days and I must agree, it seems current desktop environments are getting too bloated.
I’m a staunch supporter of Gnome (as can be seen in my previous posts), but I was so dismayed to see how slow its subsystems ran (menu, starting applications) on a Celeron 600 machine with 128MB RAM and an i810 video subsystem. I remember Windows 95 and X with Windowmaker running on a P2-400 with 128MB of RAM which was snappier than that. And from the base install of the Gnome DE on FC2, what functionality was it offering that wasn’t in Win95 or WM? Not much. And yet it was molasses slow on a faster machine.
The point is, even if those two DEs are different, for what they do, they should be comparable in speed. In a stripped-down configuration, Gnome is still slow compared to those 2 previously mentioned DEs.
I haven’t tried KDE, but this is not about Gnome vs. KDE. It’s about Gnome vs. Gnome (or KDE vs. KDE).
As a software developer, I’m very well aware of the need to balance Speed, Size and Simplicity (the three axes). Speed, I’m afraid, is getting ignored. Perhaps it’s difficult finding ways to speed things up, but one thing that shouldn’t be denied is that for all that horsepower (Celeron 600 with 128MB RAM), and relative to what current DEs do, Speed has been left by the wayside.
has X plus *nix apps (feature set vs feature set) ever used less RAM than the Windows, BeOS or Apple DEs?
X is not fat/slow. If you simply want to run X you can do so on a 286 with 4m ram.
Gnome (in the 1.x days) was wonderfully fast, many times faster than its Windows counterpart. KDE 3.x started faster than KDE 2.x and has actually gotten even faster as time has gone on. Gnome 2.x is an embarrassment. It’s slow, getting slower, AND reducing features.
brockers
Well, since this discussion is going nowhere, here are my top tips for today!
Find your init script, in Arch this is /etc/rc.sysinit. Comment out:
* /sbin/ldconfig
For some strange reason some distro guys think you need to ‘update shared library links’ every boot. Removing this presents no problems and cuts a few seconds off boot time. Maybe you should run this manually after an upgrade or something 😉
In your shutdown script, on Arch /etc/rc.shutdown:
* /bin/sleep
After SIGTERM and SIGKILL is sent to all processes, I have to wait 3 and then 5 seconds. Really! I’m an impatient person. I don’t /care/ if some lazy dangling process gets killed in the middle of shutting down. Any applications which were accessing data important to me have already been closed anyway.
From your list of servers, cut them down! (DAEMONS in /etc/rc.conf on Arch) This is of special interest to commercial distro users, such as Mandrake, Red Hat, SuSE, which normally come with quite a few useless servers enabled by default. Nobody I know needs cron on a desktop, or inetd/xinetd or any internet server for that matter (maybe sshd?). For example, I have only lisa and kdm.
Keep a copy of your init scripts just in case an update clobbers them.
Another thing: if you know how to compile and install your own kernel, grab the latest from kernel.org, configure it with only the drivers you need, compiled in rather than as modules. Now you can comment out the module loading code too (and the ‘updating module dependencies’ /sbin/depmod). On Arch, remove the kernel26 and kernel24 packages so that your custom kernel doesn’t get overwritten on update!
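To make those edits concrete, here is roughly what they look like on an Arch-style setup; the paths and DAEMONS syntax are the Arch ones mentioned above, the daemon list is only an example, and other distros keep the equivalents elsewhere.

# /etc/rc.sysinit -- comment out the per-boot library cache rebuild
# /sbin/ldconfig

# /etc/rc.shutdown -- shorten or drop the waits after SIGTERM/SIGKILL
# /bin/sleep 3
# /bin/sleep 5

# /etc/rc.conf -- trim the daemon list down to what you actually use
DAEMONS=(syslog-ng network lisa kdm)

# run ldconfig by hand after installing or upgrading libraries
/sbin/ldconfig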
Still though, this doesn’t make KDE or GNOME any faster 🙁
Good post Mr Banned, even though we do disagree. I can see your point though. However, I think you’d be spot on if the bloated apps really did NEED all the resources they consume. That’d be fine. I don’t have a problem with resource-hungry apps when they’re necessary – eg heavy-duty scientific work and games. They make use of resources for a reason.
However, gconfd taking up 12 megs (resident) is just sloppy programming. GNOME needing 128M to run comfortably is equally bad. It’s possible to create integrated, smooth and friendly desktops in a tenth of that RAM, and the features and ‘productivity’ GNOME provides don’t match it.
I can understand an office suite needing 64M. I can understand a browser using 32 (with lots of tabs/windows open). I can understand the desktop using 16 or possibly 24. But in the case of these apps, they’re all using a lot more for no real gain. If GNOME used the 128M it needs with some incredible stuff, that’d be worth it, but it’s barely any more advanced than, say, Win2k’s desktop, and yet needs more.
That’s what I’m getting at. Munching resources is fine when necessary, but it’s bad when it’s down to lazy coding.
It’s not Linux or the distributions as such. The applications and/or the underlying libraries (X, KDE, Gnome etc) are what eats memory. I have on a 1/3 GB machine the following top ten memory users:
SIZE RSS SHARE COMMAND
61372 52M 21756 galeon-bin
39648 32M 15092 evolution
109M 27M 5160 X
17052 15M 9080 rhn-applet-gui
12148 11M 1372 mdmpd
12964 10M 3796 emacs
15760 10M 8624 nautilus
11488 9M 8328 gnome-panel
8764 7776 6652 gnome-session
7908 7124 6460 gkb-applet-2
Galeon: 52M for a browser? Where’s all that memory gone?
Evolution: It’s a mail client, is it caching every mail I read?
X: Here’s the biggie. Why does X take over a hundred MB? I have a simple theme, 5 windows open, 1280×1024… is it buffering like a madman or leaking like a sieve? I do not know.
The rest are more reasonable, as they share most of their memory (probably GTK/Gnome libs), but the top three are serious offenders. Even so, the machine feels pretty snappy, except when I’ve done huge file reads and everything is swapped out.
But what’s up with X, Galeon and Evolution?
-Lars
“Linux is not getting fat. Fedora, or any other distro with those requirements are. Keep the word Linux in context with the kernel and we are a lot less troubled. If you choose to run KDE/GNOME2 and then add GDM, and all the bells and whistles (gdesklets for example)… expect to use some ram up.”
“Fedora has some steep requirements, and suddenly the “Linux platform is getting fat?”
As a desktop OS, Linux needs an easy-to-use interface. GNOME/KDE are those interfaces. The point is, compared to the competition’s OS interfaces, Linux’s suck. You can try to twist it how you want, but that is the truth. Linux is nothing (for the desktop) without GNOME/KDE. And GNOME/KDE are way too slow compared to Explorer or OS X. (Period)
NEWS FLASH: has X plus *nix apps (feature set vs feature set) ever used less RAM than the Windows, BeOS or Apple DEs?
Nope, and never will, thanks to the many bloated layers that comprise the GUI of almost all Unixes.
brockers is pretty much right though, KDE got the architecture right and so has been able to concentrate on optimizations, while Gnome stumbles around, seemingly always having potential, but continues to be slow and never knows in what direction it wants to go.
Of course KDE is not without its problems too. The interface needs trimming (much easier than coming up with a whole new component technology, as Gnome would have to), and it relies on Qt, which is a decent toolkit but, because of licensing issues, will always be a non-starter for many people.
Things on the Linux desktop could’ve been so much better today if history had gone differently: if Qt had been a community project, Gnome had never been started, and we had one unified desktop for Linux. Oh well, you people got your “choice”.
How about Outcome 4: You’re a foolish boy?
KDE and Gnome will keep getting bigger, get used to it. But they will still be skinnier than Windows.
“Outcome 1: KDE and Gnome can keep adding more features and bloat with little to no consideration of performance. Mono or maybe Java will be integrated into these DEs to make rapid development possible, with a tremendous performance hit. People who don’t need all the fluff will switch to a distribution which offers an alternative desktop environment as the default. Gnome and KDE will lose popularity.
Outcome 2: KDE and Gnome will reach a “feature plateau” where it is comparable with Longhorn and then buckle down and do some serious optimizing. With Novell backing Gnome and trying to replace Windows in a corporate setting, this is looking more and more likely.
Outcome 3: People get fed up with this “Linux thing” and BSD becomes the trendy OS du jour. FreeBSD is looking very tempting right now for my server. “
There has always been the argument of whether throwing more hardware at the issue is the correct way to handle the situation. Personally, as an avid Linux user with years of tech experience, I tend not to like the idea of throwing more hardware at a problem, but I find myself doing just that most of the time. This Slackware machine I type on runs X-KDE. It’s super fast. But then again I threw 2GB of quality RAM at it, added a 3+ GHz P4 and recompiled the kernel specifically for the hardware: HTT, Highmem, etc.
While I find it true that the coding is getting a bit messy and bloated in certain areas, I also find that Linux in general will run fine on the requirements mentioned, minus X and the glamour packages. I don’t believe that anyone has claimed Linux is ready to compete with Microsoft Windows as of this date, so I’m taking it that it is all a work in progress.
My conclusion is, you’ll need to learn Linux to use Linux, regardless of code bloat. Generic kernels in every distro are compiled with bloat. That includes Slackware. A kernel built for your hardware helps speed things up quite a bit. With all the hardware variations out there, Linux distributors have no choice but to compile generic kernels as they do.
I just hope that this article of yours brings some sanity to open source developers and they understand the gravity of this situation. I have been using Linux for over two years now, but every time I upgrade it becomes slower and slower, and then my inclination towards Windows increases. Good article though. Someone had to take the initiative and tell Linux not to bloat.
1) Is your X server using an accelerated driver, or the framebuffer device, or even the generic vesa driver?
2) If you are using an accelerated driver, which one? Some provide more acceleration than others.
3) Are you using anti-aliased font rendering? If so, did you check to see whether your driver supports hardware acceleration of the RENDER extension?
4) Did your friend disable unnecessary background processes, or did he just do a “full” install so he didn’t miss out on any goodies? (A couple of commands for checking the first three points are sketched below.)
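These are only rough checks; the log path below is the usual XFree86 one, so substitute /var/log/Xorg.0.log if you are already on X.org.

# which driver did the server actually load, and what did it warn about?
grep -i driver /var/log/XFree86.0.log
grep "(WW)" /var/log/XFree86.0.log      # warnings often mention software fallbacks

# is the RENDER extension present at all?
xdpyinfo | grep -i render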
Finally, users don’t want fast machines that do nothing, they want machines that perform some useful task. For years, the calls were for “usable desktop applications”, tools such as xpaint, xfig, midnight commander and Lyx + latex being judged as being “unsuitable”. Well, now we’ve got the kind of fully-featured applications that were being called for, but in order to create them _in reasonable amounts of time_, and with a reasonably high level of reliability, reusable component architectures (e.g. GTK, DCOP, Qt, etc) need to be used.
As the motto goes – “Good, fast, cheap – pick any two” (where “good” in this case means “efficient”, “fast” means “available now rather than in 10 years time” and “cheap” still means low cost). The mass market appears to have decided that it likes “Cheap” and “Fast” – just like with PC hardware, in fact.
If you think there’s a market for “Good” and “Fast”, go right ahead and try to make some money doing it.
How come people say Linux is a kernel and GNU/Linux is an OS? Does everyone want to be like Richard Stallman now? And since when do people run the Linux kernel alone, by itself?
Where are you getting the memory figures from? If they include disk cache, then it is a little unfair. On Linux, disk caches accumulate until memory starts running low; it is quite alright for the kernel to then deallocate big blobs of unaccessed disk cache to make room. I can imagine a browser may have a lot of its internet cache in memory, which is quite reasonable if the memory is not being used for anything else.
Erm, I’m not trying to apologise for big apps and libraries. It really does disturb me that kdelibs is a few hundred megabytes. I can’t imagine how it is possible to write so much code.
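To illustrate the point about disk cache (the numbers below are made up): on Linux the figure to watch is the "-/+ buffers/cache" line from free, not the raw "used" column, which includes cache that gets dropped as soon as an application needs the memory.

  $ free -m
               total    used    free  shared  buffers  cached
  Mem:           256     248       8       0       40     120
  -/+ buffers/cache:      88     168
  Swap:          512       2     510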
“Secondly, why should users have to install Slackware, Debian or Gentoo just to get adequate speed? Those distros are primarily targeted at experienced users — the kind of people who know how to tweak for performance anyway. The distros geared towards newcomers don’t pay any attention to speed, and it’s giving a lot of people a very bad impression. Spend an hour or two browsing first-timer Linux forums on the Net; you’ll be dismayed by the number of posts asking why it takes so long to boot, why it’s slower to run, why it’s always swapping. Especially when they’ve been told that Linux is better than Windows.”
Doesn’t that just contradict the original point that the article was supposed to make? Let me explain something to you…
REDHAT IS NOT LINUX. MANDRAKE IS NOT LINUX.
I post this happily from my work PC, running Slackware 9.1 and Gnome 2.6, with a P3 450 and 256 MB of RAM, without any problems… This is a 5 YEAR OLD PC! 5 YEARS. Are you guys living with cavemen and their 486s? Go run frickin’ BeOS, which died years ago. Experience its fascinating “modern feature set” that paved the way for today’s “multimedia platforms”.
It’s ridiculous and inane to think that a desktop in 2004 should run like Windows 95. The feature-set of Gnome greatly surpasses it.
Gnome 2.6 is no more “laggy” than Windows 2000 or XP, the platform that you complainers wish for it to “emulate”. Get a clue, and stop being cheapskates with hardware that is half a decade old. You should be forced to run some unusable thing like Blackbox or the X11 window manager in your Purgatory for being such fools.
If you want to troll with more idiotic OSNews articles, then be prepared to get trolled replies. I don’t know why anyone takes you clowns seriously. Linux isn’t getting “fat” unless you consider the ridiculous Fedora Core 2 to be “Linux”.
People who used to point out the lightweight nature of Linux didn’t seem to realize that it was not an inherent trait but just the current state of evolution. There is a natural progression in development. It is healthy to occasionally forget about optimization and concentrate on functionality, even if that means lower end machines are left behind. Once the functionality gains have been made a consolidation phase can begin to reduce the overhead and solidify a new baseline. We’re approaching a time where such a phase should (and will) begin. Windows is further along in evolution, but over the longer term Linux and open source will win.
Recently, a friend of mine expressed an interest in running Linux on his machine. Sick and tired of endless spyware and viruses, he wanted a way out — so I gave him a copy of Mandrake 10.0 Official. A couple of days later, he got back to me with the sad news I was prepared for: it’s just too slow. His box, an 600 MHz 128MB RAM system, ran Windows XP happily, but with Mandrake it was considerably slower….
Now, I’m not saying that modern desktop distros should work on a 286 with 1MB of RAM, or anything like that. I’m just being realistic — they should still run decently on hardware that’s a mere three years old, like my friend’s machine.
Ok so who does he think he’s fooling? I have a similar spec machine and it was from 5 years ago, and it was a low end machine then. To top it all off, I run Linux on it, and it’s not really that slow at all. I use WindowMaker with sylpheed, firefox, rox, nedit, mplayer, and some aterms. I could load XP on this machine if I wanted to but it would be pretty slow. Bootup takes about 5-10 seconds. The init scripts add about another 15-20 seconds and then I can login. It doesn’t bother me much at all. In fact firefox and eclipse are the only things that take a long time to load. It takes firefox like 4 seconds to load, and it takes eclipse around 10 seconds.
Redhat and Mandrake are not.
I run Gentoo + kernel 2.6.5 + GNOME 2.6 + OpenOffice 1.1.1 + Firefox 0.8 + Gaim 0.77 *at the same time* on my PII 366 laptop with 192MB RAM with no problem. Sometimes I watch DivX movies in full screen on it.
I did tweak the kernel and /etc/init.d/* to be fast and I did use ReiserFS.
But I still agree with the author that open source coders need to focus more on memory usage and CPU time than on adding features.
109M 27M 5160 X
X: Here’s the biggie. Why does X take over a hundred MB? I have a simple theme, 5 windows open, 1280×1024… is it buffering like a madman or leaking like a sieve? I do not know.
The 109M includes your mapped graphics-card memory!
“Ok so who does he think he’s fooling? I have a similar spec machine and it was from 5 years ago, and it was a low end machine then. I use WindowMaker with sylpheed, firefox, rox, nedit, mplayer, and some aterms.”
Er, did you even read the article? Those apps are NOT the solution for newcomers. None of the major desktop distros provide them as default software, and they’re not as familiar and easy as the larger counterparts.
The writer stated that he knows these apps exist; however, light apps exist on any platform. He’s talking about the WHOLE PACKAGE that newcomers see — and the apps being pushed as alternatives to Windows. Newcomers don’t want WMaker, nedit and Sylpheed. They put in a Fedora/SUSE/Mandrake disk and want to use the familiar and featureful apps that are provided by default.
And these apps are getting extremely slow and bloated. That was the point. You’re basically saying people should go back to Windows 3.1. Why? Why shouldn’t newcomers be able to just use a modern distro without it being slower than WinXP? Why should they have to change the familiar desktop and apps into lesser-known and less-featured ones just to get it running at a decent speed?
Maybe somebody has already said this, but although I have used Linux since Mandrake 7.x, I have to agree with the writer.
Last November I personally replaced my old PIII 800/512MB PC with a brand new P4 2.6 HT with 512MB RAM and the same ATA-133, 7200 RPM disks.
Well, I was disappointed. While Win2k/WinXP gained considerable speed, Linux did not. Neither RH9 nor FC1 came even close. I use many apps at a time but have hit no swap so far.
It’s just that these distributions are painfully SLOW. I am a user; I do not even care what a kernel might be, although I have compiled many without speed improvements. In my comparisons, with tears in my eyes, I have to admit that Windows plays better and faster. My brand new PC is sad. The penguin is not running at a much different speed compared to the previous one. Maybe the developers are losing control of their creature, which is becoming too complex.
– Bye,
Paolo
“X: Here’s the biggie. Why does X take over a hundred MB?”
Actually, it doesn’t. That’s the video card RAM being mapped. X itself is quite small; I’ve run XFree86 4.2 on a 486 before, and it’s usable. It’s the huge desktops and apps that are sucking up the RAM though, as you rightly point out.
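If you want to verify that on your own box, a rough check, assuming pmap from procps and a single running X server (use pidof XFree86 if pidof X comes up empty):

  pmap $(pidof X) | tail -1            # total mapped size, which includes the video aperture
  grep Vm /proc/$(pidof X)/status      # VmRSS is the part actually resident in system RAM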
There’s nothing stopping users migrating from Win98 from installing RedHat 7.3. It’s monumentally faster than newer Linux distros. The only sacrifices you would make would be some internationalization that has been added to newer RedHat distros and there would be a few insecure packages that you’d have to rebuild by hand.
However, these minor security concerns are dwarfed by those encountered when upgrading to 2000/XP instead of RedHat 7.3.
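For the handful of packages that would need rebuilding, the usual route is to grab the updated source RPM and rebuild it locally. A sketch with a hypothetical package name (Red Hat 7.3 ships rpm 4.0.x, where the option is rpm --rebuild; on rpm 4.1 and later it is rpmbuild --rebuild):

  rpm --rebuild some-package-1.2-3.src.rpm              # hypothetical package name
  rpm -Uvh /usr/src/redhat/RPMS/i386/some-package-*.rpm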
“The writer stated that he knows these apps exist; however, light apps exist on any platform. He’s talking about the WHOLE PACKAGE that newcomers see — and the apps being pushed as alternatives to Windows. Newcomers don’t want WMaker, nedit and Sylpheed. They put in a Fedora/SUSE/Mandrake disk and want to use the familiar and featureful apps that are provided by default.
And these apps are getting extremely slow and bloated. That was the point. You’re basically saying people should go back to Windows 3.1. Why? Why shouldn’t newcomers be able to just use a modern distro without it being slower than WinXP? Why should they have to change the familiar desktop and apps into lesser-known and less-featured ones just to get it running at a decent speed?”
WHAT?
That’s just ignorance on your part. WindowMaker is more functional than the XP shell itself. It may not be as pretty, but it does a hell of a lot more. On the same note, nedit is more featureful than the editors included with KDE or Gnome. Rox and Sylpheed do everything you need them to do. Most people don’t need Evolution, especially when they are just using it for email. Rox is fast, lightweight, and incredibly easy to use. MPlayer is a standard video player, so I don’t know how you can argue against that.
I read the article; it was just stupid. How can you say “The Linux Platform is Getting Fat” when you really mean “Fedora with Gnome is getting fat”? Don’t name articles something completely different from what the subject matter is; it’s just inviting a flamewar. KDE on Gentoo with a 650MHz processor and 128MB of RAM is perfectly usable, I know from experience. Every machine I have is 700MHz or less and they all run Linux without a hitch. Windows was a mess on those machines. Sure, they were fast out of the box with WinME/Win98, but a few months of use made them dog slow. I don’t want to reinstall an operating system because it can’t even manage itself for more than a few months.
People always forget that there are countries that buy all your first-world old hardware; here, 128 MB / 500 MHz is still normal for office and home systems. Plus, people don’t have much time to spend learning a new operating system.
Just see that G boy… typical Linux user… idiot that is..
LTSP turns old computers with only 32MB of RAM into zippy boxes. It can run FC2, Mandrake 10, SUSE 9.1, etc. How do they do it? Heh, LTSP turns old boxes into thin clients. Check it out.
I love people’s solution to this problem: Upgrade! Buy more RAM! I thought a lot of the market Linux was targeting was computers that used to run Windows but no longer meet the minimum requirements, with companies caring about the bottom dollar, not having to upgrade their hardware being a big issue. I don’t expect a third world country to “go out and buy more RAM”, and I think a lot of the companies out there that are thinking of switching aren’t looking to do so either.
Does ReiserFS offer good performance as a Linux root partition vs. Ext3? I may consider converting. I heard it was good for small files, which are plentiful in the Linux world! Might keep my video partition as FAT32 though 😉
Lock this thread now. Please.
I think the OSNews.com staff should reconsider using a subscription model to post on OSNews.com :S.
But anyways. I still find it hard to believe people keep bringing up the “Linux is not an OS” thing. Of course Linux ain’t an OS, but for the newbie, it is! Accept that damn fact, not everyone is as educated in using PC’s as we… well, as some of us are.
Unbelievable.
“KDE on Gentoo with a 650Mhz processor and 128MB of RAM is perfectly usable, I know from experience.”
Ain’t that the truth. The box my father uses is a 533MHz Celeron Emachines(Yikes!) with 192 MB RAM & Voodoo3 16MB video card. KDE 3.2.2 runs quite well, with all eye candy turned off and light themes.
I don’t think GNU/Linux desktop environments (i.e. Gnome and KDE) are slow due to programmer negligence; rather, they are slow by implementation.
To render a typical application, Gnome more than likely requires the Xlib, GDK, GTK, GNOME, Pango and maybe GLib libraries. A KDE application needs Xlib, Qt and the KDE libs for a similar task.
Windows programmers, typically do not deal with this many layers of abstraction.
The X server technology is more than a decade old; hopefully freedesktop.org, with their new X servers, will cut down on these multiple layers of abstraction with a newer, cleaner Xlib design.
There was a time when my 40MB Compaq LTE5150 laptop would run RH 5.x or 6.x. I had 1.2GB disk and I could just pop the CD into the thing after booting from a floppy and fire up the install and I would have a decent Linux box ready to go. Oh, I did have the usual problems with X because it didn’t auto config my Compaq display correctly, but other than that it worked.
I recently tried to load one of the more modern RH releases (8.0). First, I couldn’t easily select a workstation install, because once it selected all the default packages my disk was no longer big enough after configuring root and swap! I had 800MB of /usr space and it wasn’t enough. Second, once I trimmed down the installation and installed it, 40MB of RAM was just not enough. Hell, the X server was 38MB! So I can’t use Linux on my old laptop any more unless I revert to an old version distro or run some stripped down distro.
I think the existing distro companies are in trouble because of this. They obviously don’t consider this an important aspect to their survival. I think its an opportunity for all the lean distros to get out there and provide a solution.
I agree — this thread should be locked….way too many replies and way too much crap not to lock it.
Unfortunately, this article is not a troll, as some people seem to suggest. On my machine, an Athlon XP 1700+ with 256 MB RAM, KDE quite often doesn’t feel responsive enough, and starting OpenOffice takes several seconds, which is enough to make it feel “slow”. And I’m running Slackware here, and I have done some tweaking to improve speed.
I think it is about time someone tried to unify all the different toolkits into just one that most desktop users and applications could eventually switch to. I think it is a necessity in order to market “Linux” to the masses. Like several other people wrote, I have quite often had trouble convincing people to try out Linux after they saw how sluggish it seemed compared to Windows. Also note that the places which would be really good for establishing a Linux stronghold, such as schools or charities, often have and use obsolete hardware, which won’t run any modern Linux DE and be usable.
I think improving performance is a technical necessity for Linux nowadays. Even Longhorn is going to have a “legacy” mode without all the bells and whistles, allowing it to run on reasonable hardware. The guys at Microsoft do not want to commit suicide, after all. Hope the FOSS movement doesn’t either.
I have an old Dell laptop (233MHz P2 with 140MB of RAM) running Windows XP and it runs just fine if you turn off the theme manager. I can run Winamp, Office, Firefox and Thunderbird all at once without a problem.
>I don’t believe all what you say, not one bit. I use XP on a 256 MB machine as well as many others on an Athlon-XP 1.3 GHz and it runs great. Either your installation is hosed, or you blatantly lie.
>Typical response of the astroturfer. “You must be doing something wrong.” It’s not Windows, it’s the user. Your other choice is just downright insulting. If you can’t defend the product, attack the consumer. The system is as described, it is properly installed, and I don’t blatantly lie.
I agree with the earlier poster, though I’ll be more blunt: you’re a Linux liar. We run Windows XP on 256MB without any issues, and have done since it came out.
@Brian.
“I love people’s solution to this problem: Upgrade! Buy more RAM! I thought a lot of the market Linux was targeting was computers that used to run Windows but no longer meet the minimum requirements, with companies caring about the bottom dollar, not having to upgrade their hardware being a big issue. I don’t expect a third world country to ‘go out and buy more RAM’, and I think a lot of the companies out there that are thinking of switching aren’t looking to do so either.”
It is an option, Brian, as is learning to use Linux. Linux is designed to use memory; some forget this. Even with 2 GB of RAM in a machine you will find that Linux manages to use the majority of it in one fashion or another, especially when loading multiple large apps. The more RAM you have, the merrier with Linux.
Some of the problems people have with this stem from their knowledge of the Windows operating system, where an overabundance of RAM was not necessarily a good thing.
I’ve converted this machine to a dual boot slack/winxp pro machine to test the responsiveness of WinXP and Linux with an above average amount of memory.
Seriously, I do not believe that WinXP Pro is utilizing and managing the memory as it should, and I fail to see true responsiveness gains in XP from the increased RAM. On the other hand, with Linux I can say that I’ve seen excellent gains in both responsiveness and speed with an above-average amount of RAM. Therefore, yes: purchasing more RAM for use with Linux (this is kernel 2.6.6) is a valuable upgrade and shows a performance gain where Windows XP Pro does not on the same hardware, which makes it a viable option if the funds are there.
The notion that Linux was designed to run on lower-quality hardware, or on a machine with a small amount of RAM, is not accurate. Linux is moving; it’s not stagnant. Of course Linux will use more memory as it progresses. That is inevitable, as it is with any software on the planet.
I’m in no way saying that you need 2 GB of RAM to run the Linux kernel, but some distros like Fedora may like it. Most machines I have have 512MB or less and do well for the most part. But there are advantages to running Linux with more RAM, given the increasingly large package sizes. This requires a kernel recompile, as it would for HTT etc.
I prefer the way Linux manages memory and has a “I’ll use it if you give it to me” approach over my windows eXperiences.
These are just my observations.
Yes, ReiserFS is fast. It uses a B+ search tree to manage files. Check this out for a comparison:
http://www.namesys.com/benchmarks.html
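Note there is no in-place conversion from ext3, though; the usual route is to back up, create the new filesystem and copy the data back. A minimal sketch, with /dev/hda5 standing in for a hypothetical spare partition (this destroys whatever is on it):

  mkreiserfs /dev/hda5                  # create the ReiserFS filesystem
  mount -t reiserfs /dev/hda5 /mnt/new
  cp -a /home/. /mnt/new/               # copy the old data across
  # then point the relevant /etc/fstab entry at reiserfs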
Windows 2000/XP, in my experience, needs at least 256MB to be really usable. I had to use a Win2000 machine with only 128MB of RAM, and the disk spent over half the time thrashing while I tried to get some work done. I finally complained to our workstation support, and he was able to scrounge up 256MB more RAM, so now at 384MB the machine has crossed into the realm of “useful” versus “throw out the window frustrating”.
I built a WinXP box (Athlon 2000+) for my wife’s grandmother with 32MB onboard video and 256MB of main RAM. I wish I’d sprung for 512MB, because I find this machine spends most of its time swapping as well, but that’s when I’m using it. I tend to task switch a lot, and I spend a lot of time waiting for my new window to move from back to front, and there are often inexplicable delays where nothing seems to be happening.
My main home system and my work laptop run WinXP; both are P4 systems with 512MB of RAM, and both are mostly responsive enough that multitasking is easy. I run WinXP on the laptop because I have to. At home the WinXP system also boots Gentoo Linux using kernel 2.6.6, and for the most part it runs KDE 3.2.x for the benefit of my wife (I like XFCE4 better).
Apples to apples on the same machine, WinXP is somewhat more responsive on application startup than Linux, but task switching is more consistently responsive on Linux, especially running kernel 2.6 versus 2.4. My wife is perfectly happy using Linux with KDE for her e-mail and word processing, though she probably wishes OpenOffice started up faster. I like some of the eye candy of KDE, but I tend to gravitate towards XFCE4 because I’m more of a CLI type.
My other home machine is a PII 300MHz with 288MB of RAM. It exclusively boots Gentoo Linux with kernel 2.6. It has KDE installed, which runs just fine, but once again I prefer XFCE4 because of its light weight. I tend to run shell based apps on the PII box, because my wife is usually using the other machine, but when I need something with a GUI like Opera or Mozilla, I tend to ssh into the faster machine to run those apps. The PII can run these apps just fine; I just tend to work faster than the PII can react.
I also installed Gentoo with kernel 2.6 on my wife’s grandmother’s old Compaq PII 233MHz with about 128MB of RAM. The system was usable, though slow to load apps (due to a _very_ slow hard drive). If my wife’s grandmother did not use AOL on dial up, I would have saved her the $1000 it cost to build her new system and simply given her this one.
Because I have invested myself in learning how to install Gentoo Linux, I have no need for the more newbie friendly distros, even though I have installed them several times to see if any can pull me away from Gentoo; none ever have. I also keep a Knoppix 3.4 CD handy in case of emergencies. If Gentoo didn’t exist, I would use Knoppix in a heartbeat.
The beauty of Linux is that it *can* be configured to run on any hardware, but you have to know what you are doing. The less people know what they are doing, the more bloatware needs to be included to cater to their skill set. Windows 2000/XP and the commercial distros currently fall into the category of bloatware.
The speed improvements in KDE 3.2 and the kernel 2.6 work that seems to be going on to address system responsiveness tell me that this is not going to be a problem for the commercial distros for long. Longhorn is another story.
I will be setting up my dad’s old Pentium 200 machine: 3 GB hard disk, 16 MB ATI video card. It has 64 MB of RAM now, but I am deliberately downgrading it to 32 MB. I want to make it a project box to see how to make it perform speedily. I will probably compile everything myself in Gentoo. I will use twm and only a very few basic apps, and I will be exploring the slimmest options I can find. If anyone has ideas, please feel free to email me at [email protected]. The idea is to have a window manager, a 2.6 kernel, working sound (AWE32), an office suite, and a few other things.
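For a box like that, something along these lines in /etc/make.conf would be a plausible starting point (the flags are illustrative, not a recommendation):

  # /etc/make.conf for a small, slow Pentium 200 build
  CHOST="i586-pc-linux-gnu"
  CFLAGS="-Os -march=pentium -pipe"     # optimise for size rather than speed
  CXXFLAGS="${CFLAGS}"
  USE="-gnome -kde -qt -gtk alsa"       # skip the big toolkits, keep ALSA for the AWE32

Compiling everything on the machine itself will be glacial, as a reply below points out, so building packages on a faster box (or with distcc) is worth considering.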
It really had to be said; this is so true! Whatever happened to tiny apps? Remember all of those great DOS games like Descent II and Warcraft? I’ve seen games with similar graphics run in SDL with a horrible frame rate on brand new machines, while the DOS counterparts flew on 486s.
The apps will run on that system, but they will be intolerably laggy. Compilation alone will take about 1-2 weeks just to complete.
You might be better off focusing your energies elsewhere.
To defend KDE:
I don’t think I read a single review of KDE 3.2 that didn’t say it felt faster than KDE 3.1. What do you want from them? To make it unbelievably fast and still provide a full desktop that even includes a sound mixer?!
You don’t need 128MB of RAM. I used to run (I’ve since bought more memory at the unbelievably low price of $15) a laptop with 64MB of RAM, 8 of which was shared to video. Yes, I did run Fluxbox instead of Gnome or KDE, and something like XFCE would have worked well too while being more user-friendly. I could work in AbiWord while browsing the net in Firebird and talking on Gaim. Yes, it’d use 80MB of swap, but it wasn’t all that slow; it felt like a machine that was a bit short on memory. I also used it as a Win98 machine and it was certainly better off this way.
I run a PII 350 with 256MB of RAM. I must say that’s plenty of RAM as the machine never seems to be low (I of course don’t do graphics manipulation on it). Once again, I have extensive experience with the same machine on Win98, although it was one abused install, and it’s much better now: It doesn’t fill up the hard disk with temporary internet files.
You really can’t complain about the memory use issue compared to XP as XP is 3 years old. Things will slowly use more and more memory, and I for one don’t see it as individual apps doing it as much as it is users wanting to do more at once.
Your buddy could speed things up a lot by simply using Konqueror over Mozilla. Mozilla is a memory hog, I’ll agree on that one. KDE sucks up a lot of memory, but its tradeoff is that it’s a complete environment that looks very nice by default. I would also hope that in the KDE setup he turned off all the fancy animations, like XP would have done for him.
I’m all about clean code, but I’m not seeing the overall problem you are. Things are going to use more and more resources; that’s life. Linux can’t run well on a 386 forever, now can it? And 128MB of RAM was not a proper amount to install on a machine in 2000, and it’s still too little today. Maybe you should blame computer distributors for being cheap on RAM? My store does it too, but I always get customers to upgrade their memory. It’s important to have more than 128MB of RAM with Windows XP because it’s ungodly slow with that amount. You expect a new machine to feel quick…
I’m the admin for a school, and for several years I have worked towards Linux at the school. One of the arguments was a longer lifespan for the computers: no need to upgrade the hardware, stable, free of charge. This year I got a “go” from the management, and this spring we installed Fedora with XFCE. The result? All the students say the same thing: “It’s so slooooow.” And it is. Over the summer we will probably fix it with another distro, but my students can’t fix it on their home computers. It’s a shame.
I cannot believe “His box, an 600 MHz 128MB RAM system, ran Windows XP happily, ” : )
And I think FC2 should be optimized for 586, if not 686.
…I got a “go” from the management and this spring we installed Fedora with XFCE. The result? All the students say the same thing: “It’s so slooooow.”
Didn’t you even test the distro you were using on the computers before recommending it? Stay away from Fedora, please 😉 If you know how, knock up your own distro or modify one like Arch, Slackware, Debian, etc. What spec are the computers?
but my students can’t fix it on their home computers
That is a problem. Have you recommended that students install a certain distro at home? They might be better off sticking with Windows; give them Win32 ports of the apps they use. Or you could look around for less mainstream distros; there are a few around that work a lot better than the usual commercial dogfood.
Not really. If you buy a machine that is ten year old technology, use a ten year old OS with it, like Win95. That’s what was meant to run on it.
FC2, XP and other modern OSs and distros have gained more features (like image previewing in file managers, XML file formats, WYSIWYG word processing/web design, high resolution 32bit displays and large-scale image editing, network transparent file access, accessibility tools, internationalisation support, etc.). We as people want these things (“What, my OS only speaks three languages, so my children can’t understand it?”, “I’m visually impaired, and there aren’t any tools to help me use a computer?”) and we’ve got them in modern systems.
These things are important and worthwhile. FC2 and others have added these features, and you can’t expect hardware to support every feature that comes out, perpetually, without an upgrade.
So, in short, either upgrade the hardware, if you want the features, or be satisfied with the software as you are apparently satisfied with the hardware and stick with the OS that you’ve got.
“…an 600 MHz 128MB RAM system, ran Windows XP happily.” That’s complete bullshit, and if the author had used Windows XP with that configuration then they’d know it. I’ve used Pentium 4s with 256MB RAM and had them crawling with a few IE (cringe) windows open and some office software going. If you believe this article then that must be a fluke, because Microsoft’s newest products are so amazingly efficient.
It’s certainly true that KDE and Gnome have gotten quite a bit bigger in recent years but Windows XP is freaking massive itself, even when it’s just booted up without anything running. I’d also like to point out that my Pentium 3 at home which runs Gentoo starts up in noticeably less time than ANY machine I’ve seen running Windows XP (new or otherwise).
All I can tell from this article is that the author must be a Linux hobbyist at best. I’ve been using Linux for 5 years as well and somehow I can get a nice quick Linux setup with X going on my old P166 Thinkpad with 64MB RAM and they couldn’t seem to figure out a decent setup for a 600 MHz PII/PIII with 128MB RAM. Maybe the problem is somehow Mandrake or Fedora…or maybe I’m just magical.
Maybe if so many people have these horrible performance issues that the author speaks of, then they should just stick with Windows. I’ll happily buy their old hardware for cheap and create more of my “magically” productive Linux boxes. Oh well, that’s enough of my opinions; hey, maybe I should write an editorial too…