Consider these memory requirements for Fedora Core 2, as specified by Red Hat: minimum for graphical use, 192MB; recommended for graphical use, 256MB. Does that ring any alarm bells with you? 192MB minimum? I've been running Linux for five years (and am a huge supporter), and I have plenty of experience with Windows, Mac OS X and others. Those numbers are deeply shocking. No other general-purpose OS in existence has such high requirements. Linux is getting very fat.
I appreciate that there are other distros; however, this is symptomatic of what's happening to Linux in general. The other mainstream desktop distros are almost as demanding (even if not quite at Fedora's level; Arch Linux or Slackware, for example, will run GNOME in 128MB, but not very comfortably once you load two or three apps at the same time), desktops and apps are bloating beyond control, and it's starting to put Linux in a troublesome situation. Allow me to elaborate.
A worrying tale
Recently, a friend of mine expressed an interest in running Linux on his machine. Sick and tired of endless spyware and viruses, he wanted a way out, so I gave him a copy of Mandrake 10.0 Official. A couple of days later, he got back to me with the sad news I was prepared for: it's just too slow. His box, a 600 MHz system with 128MB of RAM, ran Windows XP happily, but with Mandrake it was considerably slower. Not only did it take longer to boot up, it crawled when running several major apps (Mozilla, OpenOffice.org and Evolution on top of KDE) and suffered more desktop glitches and bugs.
Sigh. What could I do? I knew from my own experience that XP with Office and IE is snappier and lighter on memory than GNOME/KDE with OOo and Moz/Firefox, so I couldn’t deny the problem. I couldn’t tell him to switch to Fluxbox, Dillo and AbiWord, as those apps wouldn’t provide him with what he needs. And I couldn’t tell him to grudgingly install Slackware, Debian or Gentoo; they may run a bit faster, but they’re not really suitable for newcomers.
Now, I’m not saying that modern desktop distros should work on a 286 with 1MB of RAM, or anything like that. I’m just being realistic — they should still run decently on hardware that’s a mere three years old, like my friend’s machine. If he has to buy more RAM, upgrade his CPU or even buy a whole new PC just to run desktop Linux adequately, how are we any better than Microsoft?
Gone are the days when we could advocate Linux as a fast and light OS that gives old machines a new boost. BeOS on an ancient box is still faster than Linux on the latest kit. And to me, this is very sad. We need REAL reasons to suggest Linux over Windows, and they’re slowly being eroded — bit by bit. Linux used to be massively more stable than Windows, but XP was a great improvement and meanwhile we have highly bug-ridden Mandrake and Fedora releases. XP also shortened boot time considerably, whereas with Linux it’s just getting longer and longer and longer…
Computers getting faster?
At this rate, Linux could soon face major challenges by the upcoming hobby/community OSes. There’s Syllable, OpenBeOS, SkyOS, ReactOS and MenuetOS — all of which are orders of magnitude lighter and faster than modern Linux distros, and make a fast machine actually feel FAST. Sure, they’re still in early stages of development, but they’re already putting emphasis on performance and elegant design. More speed means more productivity.
To some people running 3 GHz boxes with 1GB of RAM, this argument may not seem like an issue at present; however, things will change. A 200 MHz box used to be more than adequate for a spiffy Linux desktop, and now it's almost unusable (unless you're willing to dump most apps and spend hours tweaking and hacking). Back then, we Linux users were drooling over the prospect of multi-GHz chips, expecting lightning-fast app startup and super-smooth running. But no: instead, we're still waiting as the disk thrashes, windows stutter to redraw and boot times grow.
So when people talk about 10 GHz CPUs with so much hope and optimism, I cringe. We WON’T have the lightning-fast apps. We won’t have near-instant startup. We thought this would happen when chips hit 100 MHz, and 500 MHz, and 1 GHz, and 3 GHz, and Linux is just bloating itself out to fill it. You see, computers aren’t getting any faster. CPUs, hard drives and RAM may be improving, but the machines themselves are pretty much static. Why should a 1 GHz box with Fedora be so much slower than a 7 MHz Amiga? Sure, the PC does more – a lot more – but not over 1000 times more (taking into account RAM and HD power too). It doesn’t make you 1000 times more productive.
It’s a very sad state of affairs. Linux was supposed to be the liberating OS, disruptive technology that would change the playing field for computing. It was supposed to breathe new life into PCs and give third-world countries new opportunities. It was supposed to avoid the Microsoftian upgrade treadmill; instead, it’s rushing after Moore’s Law. Such a shame.
Denying ourselves a chance
But let’s think about some of the real-world implications of Linux’s bloat. Around the world in thousands of companies are millions upon millions of Win98 and WinNT4 systems. These boxes are being prepared for retirement as Microsoft ends the lifespan for the OSes, and this should be a wonderful opportunity for Linux. Imagine if Linux vendors and advocates could go into businesses and say: “Don’t throw out those Win98 and NT4 boxes, and don’t spend vast amounts of money on Win2k/XP. Put Linux on instead and save time and money!”.
But that opportunity has been destroyed. The average Win98 or NT4 box has 32 or 64MB of RAM and a CPU in the range of 300 to 500 MHz; in other words, it is entirely unsuitable for modern desktop Linux distros. This gigantic market, so full of potential to spread Linux adoption and curb the Microsoft monopoly, has been eliminated by the massive bloat.
This should really get people thinking: a huge market we can’t enter.
The possibility of stressing Linux’s price benefits, stability and security, all gone. Instead, businesses are now forced to buy new boxes if they are even considering Linux, and if you’re splashing out that much you may as well stick with what you know OS-wise. Companies would LOVE to maintain their current hardware investment with a secure, supported OS, but that possibility has been ruined.
Impractical solutions
Now, at this point many of you will be saying “but there are alternatives”. And yes, you’re right to say that, and yes, there are. But two difficulties remain: firstly, why should we have to hack init scripts, change WMs to something minimal, and throw out our most featureful apps? Why should newcomers have to go through this trouble just to get an OS that gives them some real performance boost over Windows?
Sure, you can just about get by with IceWM, Dillo, AbiWord, Sylpheed et al. But let's face it, they don't rival Windows software in the same way as GNOME/KDE, Mozilla/Konqueror, OpenOffice.org and Evolution. It's hard to get newcomers using Linux with those limited and basic tools; new Linux converts need powerful software that matches up to Windows. Otherwise, Linux novices will get the idea that serious apps which rival Windows software are far too bloated to use effectively.
Secondly, why should users have to install Slackware, Debian or Gentoo just to get adequate speed? Those distros are primarily targeted at experienced users — the kind of people who know how to tweak for performance anyway. The distros geared towards newcomers don’t pay any attention to speed, and it’s giving a lot of people a very bad impression. Spend an hour or two browsing first-timer Linux forums on the Net; you’ll be dismayed by the number of posts asking why it takes so long to boot, why it’s slower to run, why it’s always swapping. Especially when they’ve been told that Linux is better than Windows.
So telling newcomers to ditch their powerful apps, move to spartan desktops, install tougher distros and hack startup scripts isn’t the cure. In fact, it proves just how bad the problem is getting.
Conclusion
So what can be done? We need to put a serious emphasis on elegant design, careful coding and making the most of RAM, not throwing in hurried features just because we can. Open source coders need to appreciate that not everyone has 3 GHz boxes with 1G RAM — and that the few who do want to get their money’s worth from their hardware investment. Typically, open source hackers, being interested in tech, have very powerful boxes; as a result, they never experience their apps running on moderate systems.
This has been particularly noticeable in GNOME development. On my box, extracting a long tar file under GNOME-Terminal is a disaster — and reaffirms the problem. When extracting, GNOME-Terminal uses around 70% of the CPU just to draw the text, leaving only 30% for the extraction itself. That’s pitifully poor. Metacity is hellishly slow over networked X, and, curiously, these two offending apps were both written by the same guy (Havoc Pennington). He may have talent in writing a lot of code quickly, but it’s not good code. We need programmers who appreciate performance, elegant design and low overheads.
We need to understand that there are millions and millions of PCs out there which could (and should) be running Linux, but can't because of the obscene memory requirements. We need to admit that many home users are being turned away because Linux offers no performance boost over XP and its apps, and in most cases is even worse.
We’re digging a big hole here — a hole from which there may be no easy escape. Linux needs as many tangible benefits over Windows as possible, and we’re losing them.
Losing performance, losing stability, losing things to advocate.
I look forward to reading your comments.
About the author
Bob Marr is a sysadmin and tech writer, and has used Linux for five years. Currently, his favorite distribution is Arch Linux.
If you would like to see your thoughts or experiences with technology published, please consider writing an article for OSNews.
Well, all I'm going to say is that I think you shouldn't be installing an up-to-date distro like Fedora or Fedora Core 2 on older systems (a year old or more). If the hardware is older stuff, then use a lighter distro or even an earlier version of the same distro; e.g. a 300MHz box with 64MB of RAM would run fine on Debian Woody or Red Hat 7.
my 2p
I definitely agree with the article. When I first started using Linux it was great, because it didn't have demanding requirements and definitely made my system feel faster than Windows. As time went on this became less and less the case; at this point they feel almost the same running on identical hardware. It seems to me that Apple has done the best job of keeping system requirements low, as OS X runs great even on old hardware.
Right now Linux's problem is definitely the GUI. Hopefully it will become less of a hog.
“That’s complete bullshit”
I suggest you read all of the comments (yeah, it might take some time now!). There are LOADS of messages from people who agree with the article. Your experience may be different, but do read the comments and you’ll be surprised. Many, many people have found the increasing bloat to be intolerable, and WinXP nicer on such hardware.
Hey, it sucks that so many people are unhappy, but it’s a fact we have to face if Linux is going to become more popular.
The linux kernel + some basic libraries + bash runs quite happily on 486 with 8mb ram … even with modern distros.
What might cause the slowness and bloat: are there any unnecessary daemons (Apache, MySQL, sshd, Exim, etc.) installed by default with Mandrake? They might eat some RAM.
Also, KDE or GNOME eats a lot of memory; if you want faster response under X, switch to a lighter window manager (I personally use Blackbox, which eats about 2MB of RAM and is really fast even on a 486 with 24MB of RAM, unless I run Mozilla or a similarly bloated application, of course). On newer (about 500 MHz or so) PCs, starting the X server and Blackbox takes about two seconds. Starting KDE takes at least half a minute or a minute.
So Linux applications are the problem, not Linux, and you can choose to use less memory-hungry ones (use Blackbox, Window Maker, FVWM, etc., instead of KDE or GNOME). Or, if you don't want to, just buy extra memory; it's not that expensive.
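To make that comment's triage step concrete, here is a small hypothetical Python sketch (the process names and RSS figures below are invented for illustration, not real measurements): rank processes by resident memory, the way you might eyeball `ps` output to see what is actually eating RAM.

```python
def top_memory_hogs(ps_lines, n=3):
    """Parse lines of '<rss_kb> <command>' and return the n largest."""
    procs = []
    for line in ps_lines:
        rss, command = line.split(None, 1)
        procs.append((int(rss), command.strip()))
    # Sort by resident set size, largest first.
    return sorted(procs, reverse=True)[:n]

# Invented sample, roughly the shape of `ps -eo rss,comm` output:
sample = [
    "81234 mozilla-bin",
    "2048 blackbox",
    "45678 konqueror",
    "1024 sshd",
]
for rss_kb, cmd in top_memory_hogs(sample):
    print(f"{rss_kb:>7} kB  {cmd}")
```

On a real box you would feed it actual lines from something like `ps -eo rss,comm` instead of the invented sample.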
Is there going to be threading for commenting? There is no logical reason why there should be 320 comments spread over some 25 pages. How am I supposed to reply to a comment someone made 310 comments ago without it getting lost in the shuffle? I am not even bothering to click into comments 31-45; I am already tired of trying to find replies to me, let alone seeing who is actually replying to ME.
This isn't a flame; it's an honest assessment of the state of commenting here. You cannot have this many comments totally unorganized and expect anyone to benefit from the debate.
This is the BEST OSNews article I've read this year. I agree with the author 100%. What the hell is happening? I remember my Pentium 166 running KDE fine back in 1996. Now I have an Athlon 3200+ system, and GNOME Terminal is eating up my CPU power.
Spend hours tweaking and hacking? Oh come on, Linux can be what you want if you spend hours tweaking and hacking. Just spend a week or two downloading and compiling the apps you want. KDE/GNOME needs all that power, but you can run a distro on a 200 MHz system with 64MB of RAM.
Linux is fast on old PCs; I run Debian on an 8-year-old P1 166MHz. I use it for chatting, email and browsing.
I don’t feel any need to use a faster pc for those things.
It just depends on using software that is designed to be fast (doh). There's Skipstone/BackArrow/Dillo to browse, Sylpheed(-Claws) for email, irssi/X-Chat for IRC, AbiWord, Gnumeric, SciTE, Ted, etc.
The only weak point is the browser. Although Skipstone/Backarrow are good enough for normal users, they are not widely available as packages.
The guy said in his article he wouldn't let his friend use Fluxbox. I wonder why, as it's really easy to use.
And there's always Xfce, which is enormously easy for newbies, and fast in comparison to GNOME/KDE.
GNU/Linux is about choice, there’s no one solution for everything. The author of the article doesn’t understand that.
That M$ has discovered the way to ruin Linux by making their programmers develop Linux applications that consume CPU/RAM/HD like the M$ apps do? I think so!
It’s free software, if you’re that interested in its performance learn to program and make some contributions (and see how good your code is). And if it’s not good enough try to buy something that’s better (hey, why pay $50 for more memory when you can get the super-de-duper Windows XP for $250?!)
And for the last time: when you run top and X appears at the top of the list, you need to subtract the amount of graphics memory you have from the amount top reports for X to get the amount of main memory X is actually using. If you don't know what that means, don't even think of quoting top in a message.
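That rule of thumb is just subtraction; here is a hypothetical sketch of it (the function name and the figures are mine, for illustration only):

```python
# The rule of thumb as arithmetic: `top` counts memory-mapped video RAM
# against the X server process, so subtract it. Figures are illustrative.
def x_real_usage_mb(top_reported_mb, video_ram_mb):
    """Approximate X's true main-memory footprint."""
    return top_reported_mb - video_ram_mb

# e.g. top shows X at 90MB resident, but the card maps 64MB of VRAM,
# so X is really using around 26MB of main memory:
print(x_real_usage_mb(90, 64))
```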
And if I see one more person on these forums use the word “bloat” again I’m going to heave this PC in front of me through the window.
Seriously, get a clue. Well done Eugenia on bringing in a nice big audience of sub-slashdot adolescents. Sure who cares about the quality of the site when you can get a lot of hits.
“Also, KDE or GNOME eats a lot of memory; if you want faster response under X, switch to a lighter window manager (I personally use Blackbox, which eats about 2MB of RAM and is really fast even on a 486 with 24MB of RAM, unless I run Mozilla or a similarly bloated application, of course). On newer (about 500 MHz or so) PCs, starting the X server and Blackbox takes about two seconds. Starting KDE takes at least half a minute or a minute.”
Most sensible post so far. Absolutely right. That's the beauty of open source: you can change GUIs; you are not restricted to using GNOME or KDE. You can switch to a lightweight window manager, which is something you can't do with Windows. There, you are stuck with what they give you: no mixing and matching, and only a little tweaking. People still seem to have trouble grasping that the real value of open source is in its versatility. If you know how, you can do damn near anything with it. With closed source, you only get to do what they let you.
Linux on the command line: great stuff. Linux as a server: great stuff. Linux for the everyday guy: not such great stuff. People have a hard enough time with Windows. As far as responsiveness goes, drop X and your problems are solved. There needs to be a graphics system developed from the ground up to replace X, something like the Y Window System or Aqua on the Mac: one common toolkit and one common desktop, to give the industry something common to work from.
Honestly, I agree with everything the author said and think that it’s about time someone actually said it. Thank you.
Quote:
Well, with all this discussion about rewriting GNOME in Java or C# to make it a more developer-friendly platform, it's definitely going to take a performance hit.
The problem is not Java or C# (with a JIT you lose little (~10%) over C code, and could see an actual improvement if JIT compilers start doing a better job of optimizing for the current processor, something C compilation cannot do for binary-distributed code, which is what normal end users need). The problem is really poorly architected code (either initially or through not being refactored often enough) and just plain sloppy code. You can easily achieve a several-hundred-percent improvement with well-written code and a carefully selected algorithm.
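That several-hundred-percent point is easy to demonstrate in any language. Here is a small Python sketch (the example data is invented) where the only difference between the two functions is the chosen data structure, turning quadratic work into linear work:

```python
# Two correct de-duplicators in the same language; only the algorithm
# differs. The quadratic one re-scans a list for every element.
def dedup_quadratic(items):
    seen = []
    for x in items:
        if x not in seen:          # list membership: O(n) per check
            seen.append(x)
    return seen

def dedup_linear(items):
    seen, out = set(), []
    for x in items:
        if x not in seen:          # set membership: O(1) on average
            seen.add(x)
            out.append(x)
    return out

# Same language, same result; on large inputs the set version wins
# by orders of magnitude, dwarfing any C-vs-JIT overhead.
data = [3, 1, 3, 2, 1, 2]
assert dedup_quadratic(data) == dedup_linear(data) == [3, 1, 2]
```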
Actually, OS-9/6809 still lives, 20+ years on. The Color Computer (and clones) still has users and an annual conference, the Nth Annual "Last" Chicago CoCoFest, where N == 11 in 2002. A group of Canadian programmers rewrote OS-9/6809 Level II for the CoCo 3 (with its address translation circuitry) for efficiency, and to take advantage of the native mode of the Hitachi 6309. Today's serious CoCo users have typically replaced the 68B09E in the CoCo 3 with a Hitachi 63B09E and run the rewrite, called "NitrOS9." The combination is fast. Very fast. Very, very fast. Especially considering it runs on an 8-bit CPU! Observers are usually astonished, as the benefit of proper (i.e., cleanly engineered) operating system design is not widely known, and certainly not widely appreciated, among users of the commercially dominant operating systems (e.g., Windows and Mac OS X).
Well I am not going to wade through the 300+ comments but I did check /. which seems to basically be mirroring most of it. (Note threaded discussions are much easier to follow.)
What I want to argue is it is not the “Linux Platform” that is getting fat, it is the “Linux Desktop Platform” that is getting fat.
You can run a console based server with a very low amount of RAM and make use of it. As an example I have a firewall/bridge running a 2.6 kernel right now and free reports 48MB used, 6MB as buffer, 29MB as cache, and only 12MB active. Now I grant you it is not doing much right now but it is a baseline.
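For anyone puzzling over those `free` numbers: buffers and cache are reclaimable, so the memory genuinely claimed by programs is less than the "used" column suggests. A tiny illustrative sketch using the figures quoted above ("active" is a separate kernel metric, which is why the comment's 12MB differs slightly from the subtraction):

```python
# Buffers and cache are reclaimable, so subtract them from "used" to
# see what programs actually claim. Figures are the ones quoted above.
def effective_used_mb(used_mb, buffers_mb, cached_mb):
    """Memory genuinely held by programs, as veterans read `free`."""
    return used_mb - buffers_mb - cached_mb

# 48MB used, 6MB buffers, 29MB cache -> about 13MB truly in use:
print(effective_used_mb(48, 6, 29))
```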
As far as GUI interfaces Gnome and KDE are both no longer targeted towards older platforms, point made. But there are other choices that are available. My personal choice is Slack running FVWM2 with a heavily customized set of menus. Yes, I am aware that FVWM is a window manager not a desktop so they are not directly comparable. But not everyone needs a desktop (thus the heavily customized menus).
If you prefer Windows, use it. Linux is not in competition with MS, it is an alternative. Both sides have their zealots but I look at it as “use what gets the job done for you.” Not “use what others tell you will get the job done.” There is room in the world for both MS and Linux, and I don’t think either are going away any time soon.
I mostly agree with this “rant”. Footprint seems to be ignored by many in the OSS community, but I was pleased to see Mozilla & Opera pushing for less intensive implementations of a web forms upgrade. Mozilla Gecko was originally supposed to be very small and fast. They seemed to have lost sight of that goal for a while, but Firefox is an improvement for which I’m grateful. A modern, standards-compliant browser most likely will never be thought of as small or light…
I'm a longtime Linux/GNOME user. I used Red Hat 4.2 through 6 on an old 200MHz Pentium with 96MB of memory and was mostly happy with the performance. I now run Gentoo on an Athlon XP 2100+ machine with 1.5GB of memory, but don't see a performance increase in proportion to the added horsepower.
What really scares me, though, is the comments from the GNOME community about switching their development language to something like C# or Java. You think GNOME performance is poor now? Just wait. I understand that many developers are more skilled in these modern languages than in C, especially given that a lot of OSS contributors are college students. But the thought of running a desktop written in Java makes me shudder (even though I make my living developing in Java).
I’ve been saying that Fedora and Mandrake are bloated for a while now. You think that just because you’ve posted an article about it that the fanboys will pay attention now?
Good luck.
At work I have to use Windows XP, but I have Debian Unstable installed in VMware, running 24/7. I have many of the same applications running in both: OpenOffice.org, Firefox, Opera, Thunderbird, Gaim, the GIMP, etc. I find Windows to feel QUITE PRIMITIVE compared to KDE 3.2!! KDE has a LOT of nice touches that Windows doesn't. Bloat? Maybe, but I feel much more productive with the ability to customize many more keystrokes, gestures, etc. Many of those small touches make a huge difference in my user experience. I have almost all eye candy turned off in both OSes, though I have more turned on in KDE than in Windows.
That said, on this computer with a 3.0 GHz processor and 2GB of RAM, WinXP as host and Debian Unstable as guest (with 512MB of RAM allocated), I started and ran each of the applications listed above, timing how long it took from clicking the shortcut until the application was stable and ready to use. In every instance I got nearly the EXACT SAME results in both OSes, even though Linux was at a disadvantage due to having 1GB less RAM and running in a virtual machine. In some cases Linux was 0.5 seconds faster, and in some cases Windows was, even with the same app.
Considering how much MORE eye candy KDE has, as well as those numerous “nice touches”, I’d say that KDE is QUITE impressive.
The Linux kernel and core utilities (GNU/Linux) are still small, fast and powerful, and are getting more so with time. XFree86 (and derivatives) is a monster software suite, but it has made standard desktop systems possible. Recent moves toward XML-style config files, CORBA-style object management, scalable vector graphic images, chrooting everything, and application/menu hierarchies exceeding 10 levels serve to slow everything down. All this trash is I/O-bound, making extra memory and CPU speed a waste of money.
I think we need to distinguish between Linux, the lean, mean kernel, and the fat, lazy applications that sit on top of it.
Commercial groups are interested in sales, not software design. Open source projects are the opposite. We must leave room for both groups to co-exist, but perhaps not in the same room, and certainly not on the same computers.
>Now that X is being developed again things might improve,
Yeah, until they decide to integrate the cool stuff into the freedesktop.org server (giving transparency and the like)... voom, X uses twice as much memory. But it doesn't seem to concern anyone, if you look at the mailing lists.
I am not a Linux geek – I am a basic user of the platform for some fairly basic needs.
I had used Linux a couple of years back in a large scale trial of deploying used hardware in schools in India. The advantage that Linux had over Windows was that we could step up the performance of computers with older hardware. It was a very important consideration.
If it is indeed true that Linux's hardware requirements are going up to match Windows' (and there seems to be general agreement on that), Linux is losing an important raison d'etre in the minds of people like me.
We want software that does not continuously place increasing demands on hardware, not only from an economic angle but also from the perspective of the additional effort needed.
Listen, guys: we need an OS that is usable, dependable, cost-effective and hassle-free. If Linux cannot fulfil these needs, we will go elsewhere, even to Microsoft.
I think the problem is that people are using the latest versions of GNOME and KDE on their old machines.
A PII 400MHz with 128MB RAM will run Win98 and Win2k fine, but WinXP will be noticeably slow. It'll also run KDE < 3.2 and GNOME < 2.4 (I don't use either, so I don't know which older version would be more suitable ^_^;;).
When you install Windows, you don't automatically get XP and all its crappy bloat unless it's an XP disc. When you install a Linux distro, you get the newest (or close to newest) version of all this software.
You wouldn't install WinXP on a P75, so why would you install KDE 3.2?
If you have a slow box and you keep it updated via apt or emerge, it will be slower after a year, simply because we live on the bleeding edge of software development (compared to Windows).
Everyone wants the latest version. That's how MS gets businesses to buy WinXP even though they have Win2k throughout the office. Linux software isn't getting too fat; people are just mismatching software versions with their hardware.
Even if there are certain hardware expectations, there is no excuse for writing inefficient code. Since this is open source, we can look at the code, see which parts are inefficient, and write to the author of the software, or modify it ourselves and publish the result. Also, we should appreciate the fact that such a multitude of projects even exists.
It just keeps getting slower and slower. If I could, I’d go back in time to 1997 and throw Miguel into the path of a speeding bus.
I totally agree with the author. I use all three: Linux, Windows and OS X. Even by ditching GNOME/KDE for a light WM, you still run into the performance bottleneck with OpenOffice.org and Mozilla. File compatibility with MS Office is necessary, and OpenOffice.org does a pretty good job with that; however, it is a DOG to run. On my iBook G4 (800 MHz), OpenOffice.org takes so long to open that I get pissed off while I'm waiting. I deleted it and use MS Office solely because it is much more responsive.
This pendulum swinging from one side to the other isn’t getting anyone anywhere. “You have to look at this and that.” “Oh, Windows takes up XXXX of resources.” Everything he says is totally true – with the tools, desktops and hardware that he was trying to use.
No other general-purpose OS in existence has such high requirements. Linux is getting very fat.
Windows XP does need 256 MB if you want to get anything meaningful done and get it to run at a nice performance level.
His box, a 600 MHz system with 128MB of RAM, ran Windows XP happily, but with Mandrake it was considerably slower.
With Mandrake, I totally agree that he would have found it terrible with what he was using. Considering he was running GTK applications like Mozilla (a cross-platform toolkit) and Evolution, no shock there, really. And of course we all know that OpenOffice.org is functionally good but an absolute dog to run. It wouldn't be so bad if it were only a dog when you already had many applications open, but it isn't: it is slow to start up in whatever environment you put it. Hopefully OOo 2 will be better, because it is a big enough project to focus on getting that right.
I don’t know why he didn’t just use KDE applications to be honest, especially for web browsing and e-mail. They’re all there, and that’s what Konqueror and Kontact are there for.
If he wanted a fast desktop with pretty modest hardware requirements, he should have just used a 100% KDE desktop. I've run KDE 3.1 and above on a few P300s with 128MB, and it ran great. I'm not just talking about starting the thing up, which is how Windows requirements are calculated; I'm talking about running seven or eight applications all the time: e-mail, browsers, music etc. I think the situation has become even better with KDE 3.2.
Once you start using a lot of GTK-using applications, not just one or two, your desktop really grinds to a halt. I’m not really that surprised he found Gnome and Mozilla on Linux slow at all. I used Evolution for two and a half years prior to switching to Kontact. Features-wise it is a great application, but you could never run it and other GTK apps on a system with anything less than 256 MB of RAM. Using DDR instead of old SDRAM made a huge difference to the performance of GTK applications, whereas it made a moderate, but noticeable, difference to Windows and KDE as you might expect.
He's right and wrong. You could run NT4 or 98 in a business with 64MB at all. I know; I ran with that amount of RAM and worked 9 to 5 with both, and that was five or six plus years ago. Linux desktops should focus on working extremely well within the limits of 128 to 256MB, and by that I don't mean those being the minimum requirements. KDE has proved that that is possible. You can get a pretty reasonable desktop within those limits, and it is not entirely unfeasible for companies to upgrade the memory in their desktops. In terms of eye candy, there is a lot of 3D hardware, most of it onboard, that has never been used on a business desktop.
Listen, guys: we need an OS that is usable (depending on your choice of desktop: check), dependable (check), cost-effective (check) and hassle-free (maybe). If Linux cannot fulfil these needs, we will go elsewhere, even to Microsoft. (If you want to run a current GNOME or KDE on older systems and cannot get the performance you want, I guess you will be better off with Windows; your choice. If (big if) you have the experience to run a large-scale Linux deployment and prefer it, I would argue that buying more memory for the systems and running your preferred Linux distro would be more cost-effective and easier on maintenance than purchasing licenses for Windows.)
You could run NT4 or 98 in a business with 64 MB at all. I know
The above should read 'couldn't'. I've had enough of typing today…
I can totally run mspaint.exe and Solitaire at the SAME TIME with only 128MB RAM and 384MB of swap committed. As long as I don’t win at Solitaire, the thing is ROCK SOLID.
If you can get it to run on 300MHz and 128MB, how much faster will it run on 3GHz and 1GB?
I feel that legacy toolkits such as GTK 1, FOX and Motif are to blame for the bloat. When you load up one of these applications, not only does it take up extra memory due to the different libraries, it also looks ugly and inconsistent with the rest of your apps.
There are still a few applications using them, such as XMMS and GnuCash. Major distros such as Fedora, Mandrake and Debian need to dump applications using these toolkits from their official sources and put pressure on legacy applications to upgrade to GTK 2.
When you are running 100% GTK 2 applications in conjunction with Xfce, it is really fast and there is a lot less memory usage. The next step would be to rewrite KDE in GTK, but that would probably be too much to ask; it's simpler just not to use KDE applications.
The author proposes we go back to the dark ages of software use and development:
* Small and efficient as possible code almost always means less portable which means software can cost more
* Small and efficient as possible code often takes a lot longer and is usually harder to debug than something written quickly, with basic efficiency, and generically
* Current software development and even hardware development is centered around the idea that the programmer should not have to micormanage memory or other resources, the base operating system or hardware should take care of it (a good example of this is how C# and Java pretty much do away with a programmer being directly involved with almost all memory allocation and deallocation)
* Software can be a lot smaller, “footprint”-wise, if we all decide to go back to using ASCII, having no internationalization support, and pretending everyone speaks English and uses a single measurement system. Support for multiple languages, translations, localizations, etc. all increases a program’s “footprint”. Looked at that way, this is a huge source of “bloat” in many applications.
* The “Linux Platform” (as the author sees it) could be a lot faster in many cases if hardware manufacturers were open about their hardware specifications. There are many advanced hardware-acceleration features and bits of functionality that Linux users are forced to do without because their manufacturers refuse to provide that information. This is no one’s fault but the manufacturers’.
Now, to be fair, I do agree that it seemed like programmers used to be able to do a lot more with a lot less. But let’s put things in perspective: many programmers who started out in assembly language or plain C and now use a high-level language like Java, C#, Python, Perl or C++ may have fond memories of low-level efficiency and control, but very few of them miss the headaches and the large amounts of time wasted trying to debug or write in a low-level language.
The author needs to step back and realize that hardware is more powerful because users always want more, and market realities demand that software comes to market faster and faster. To go back to the days of super efficient software would require many sacrifices, time being the main one, and time is money in this modern age.
The real solution to this problem lies within changing the expectations in the market, and I don’t see that happening…
I’ve noticed it seems to be much easier to get properly architected, fully functional software to run fast than it is to get fast software to be properly architected and fully functional. Build it right first and trim it down after. The desktop focus of Linux-based software is new.
Give them time.
I’m very impressed with Firefox; a lot of good work has gone into making it smaller. It would be nice to see similar work go into GNOME, X, OpenOffice and the many little Linux-based apps that make up the desktop. If it doesn’t happen within a few years it won’t be an issue anymore, but now that things work rather well on the desktop it might be a good idea to go back and spend a little time trimming.
I’m very impressed with the results in Fedora Core 2 so far. I’m running it on two machines, with 1 GB and 512 MB of RAM, so the bloat doesn’t bother me. It is a good idea to make Linux distributions work well on old hardware; I’m OK with them making things work well first and focusing on old hardware second.
http://www.amdzone.com/modules.php?op=modload&name=Sections&file=in…
I was at a local community college, and we were using these “ancient” machines: K6-2 ~400 MHz systems. They installed Red Hat 9 and, as expected, they were slow. Sometimes people would click an icon and the app would show up as much as five minutes later (Mozilla, OOo). Me, I installed ROX-Desktop. I kept Metacity because OroboROX wasn’t out yet. Instant speed boost. Even using Mozilla and OOo was possible. Instantaneous? Not by a long shot, but they loaded in “reasonable” times (~30 seconds). Still not acceptable by any measure, but it was usable for what I was doing.
At the end of the day, the problem (with the major DEs, at least) is that GNOME and KDE are not bothering to tackle the real problems they have. They toss everything into a large, monolithic design that doesn’t scale well. They don’t bother trying to keep things small, because the developers use increasingly beefy machines (which even then don’t load fast enough to be acceptable) and can justify it under the false guise of “integration.” A user expects and needs a quick-loading, responsive and easy-to-use DE. If they worked more on compartmentalizing the various parts of the DEs, and less on “integrating means everything is one app,” they’d have more stable and faster DEs and less to debug, and, using D-Bus or what have you, they’d STILL have easy integration among the various parts; only now it would cover just the parts the user wants and needs.
Don’t use Fedora…
I started out with Red Hat and Mandrake while trying out a few distros a few years back, but quickly moved on to Slackware. Now I use Gentoo and Debian, but I’ve been trying out a few distros again, one of which was FC2, because I wanted to revisit RPM now that I know how to actually run a Linux box on my own. The only word that came to mind when I tried apt, apt4rpm and yum was TERRIBLE. I was impressed with how the system looked, with the bootsplash and all, but as soon as you actually start messing with the system you want to puke.
Linux hasn’t gotten fat; it runs just fine with a number of distros. If you’re the type of person trying to run Linux on a really old box, then use Debian, Arch or another stripped-down distro. If you’re a newbie and all you have is an old box, use Knoppix.
So I’ve gotten tied up doing my 9-to-5 thing, and when I get a chance to check this thread again, it’s been infiltrated by some Linux troll with his friggin’ W’s and G’s.
I’ve brought this up many times in the past (and been promptly modded down by Eugenia, which makes me wonder why she and her ego aren’t modding down W/G boy), but OSNews needs a login/message model if it wants to play with the big boys. You can see here the damage that a public posting system can do.
In the same sense, though, if you provide such a login method of providing feedback, you can’t immediately mod down everything you disagree with, or feel isn’t pertinent to a discussion. That will be very hard for some of our moderators to accept, I’d imagine. I’m still seeing items modded down that don’t need to be, while threads responding to those modded-down posts are left intact, leaving readers to wonder what the hell it is that they’re not seeing that the person replying did. Not cool or professional.
But if you go with a login-type system, be prepared to be a lot more open-minded than you have been in the past. If you mod down people who’ve gone out of their way to become members, for some ridiculous reason, you’ll lose your readers faster than you currently are with your moderation.
On a similar note, I’ve often wondered just why no one’s invented a method for zapping people who purposely abuse boards such as this, with like 100k volts.
Alternatively, a button which provided the rest of us with a valid home address for posters like WG boy would also be very helpful. Then, when we got the chance, those of us inclined to do so could pay said morons a visit to show off our new bats and lead pipes that we’re so proud of. Now that would be the kind of internet that I want to be a part of!
I came across this site a couple of years ago. Enjoy:
http://www.tinyapps.org/
The average user cannot just turn on Gentoo or Debian and be expected to use it; the average user doesn’t even know how to run the automatic updates on WinXP 😉 So what they are left with is either XP or one of the big three (Mandrake, Red Hat, SUSE), all of which are more bloated than XP is at first boot. And like I said before, I have my mother’s PC (P2 350, 128 MB RAM, 4 GB HD) running XP just fine. She runs Post-it notes, Internet Explorer, MSN Messenger with Messenger Plus, and WordPerfect all at the same time just fine. Sometimes she throws in Webster’s dictionary and a game at yahoo.com on top. No problems.
I really don’t know what people are talking about with this disk thrashing and long load times; a bunch of crap if you ask me. The article was very good, and it obviously brought out some good discussion. Companies need to develop with older systems in mind, as well as the latest and greatest, to get the biggest exposure in the market. If the Linux distros that are easy for the casual user focus less on bloat at first boot (and let you turn on features you don’t really need later), then maybe they can penetrate the market where Longhorn will fail: the older, less-updated market.
Quit looking at this article and thread through your own geeky eyes; not everybody can run from a command prompt. Look at what’s best for Linux as a whole. Some people would rather spend $50 on more important things than RAM or other upgrades, you know?
I’ve got a 266 MHz laptop with 192 MB RAM in it that runs Win2k just fine. Current versions of Linux with X, GNOME/KDE/even IceWM, and Mozilla Firefox as the only things running were unusable with several distributions.
“The next step would be to re-write KDE in GTK”
Why not rewrite GNOME in Qt?
“Major distros such as Fedora, Mandrake, Debian need to dump applications using these from their official sources and put the pressure on legacy applications to upgrade to gtk2”
Wow, forced upgrades. That strategy sounds familiar.
Phew! That’s the longest comments section on OSNews I’ve ever seen.
OSNews is turning more and more into a second Slashdot every day. Congratulations, Eugenia!
Linux isn’t getting fat; it’s those shitty-ass distros that are getting fat.
Fedora, for example, has what, 5 different word processors, another 10 text editors, 4 spreadsheet editors, 10 different media players, and a kernel with support for nearly EVERYTHING compiled in. Fact is, Linux ISN’T for novice users; if you can’t tell what packages to install, or what to compile into the kernel and what to leave out, YOU SHOULDN’T BE USING IT. Fedora, Mandrake, etc. are a joke; why even use them? I’ve used Slackware for years and it has always been better than Windows; maybe not for OpenOffice (admit it, I haven’t seen anything that can beat MS Office), but when playing games it has amazing performance. I installed Gentoo a couple of months ago, and the performance is even better! Load times in UT2004 are 1/10 of what they are in Windows. Literally, it takes 5 minutes to load some maps in Windows; those same maps in Linux, 30 seconds! Why don’t you think about your “silly rant” before posting something like this? It just sounds like you don’t know what the hell you are doing.
“The next step would be to re-write KDE in GTK, but that would probably be too much to ask than to simply not to use KDE applications.”
I hope that was an attempt at humour.
Honestly, saying Linux is getting fat because you have tried only the BIG Linux distros is like saying “I broke the Internet” because your dialup connection is down. Don’t pass judgement on ALL Linux distros when you haven’t even tried ALL Linux distros. Have you EVEN tried any smaller distros? Even CD-ROM-based distros like DSL (Damn Small Linux) or SLAX? Take a minimal PC, say one with under 256 MB RAM… let’s go even lower and say 128 MB RAM… then install VectorLinux on it, then do your testing. Pass judgements based on fact, not opinion.
—Just my 2.46 cents—
Wait till they start using Mono or Java purely for desktop development.
The goal of Linux is to beat Windows or OS X. First, of course, Linux should run well on present high-end boxes. Second, Linux should take share from Windows on the old boxes. I prefer a slim, beautiful and usable Linux, so I’m staying with Red Hat 8.0.
My feeling about GNOME is that almost every release has been faster than the previous one. GNOME 2.4 is so much faster than the earlier GNOME 2 releases. And GNOME 2 is not slow on a PowerPC 180 with 192 MB RAM. Drawing performance took a slight hit with antialiasing, and some applications (such as gnome-terminal) are slow, but sorry, GNOME 2 is fast. Nautilus gets faster with each release. I really don’t see how people can be convinced that GNOME is slow.
I tried Red Hat 9 on my aging IBM ThinkPad 1720i (Pentium II 266 MHz, 128 MB RAM). It wasn’t usable. FreeBSD 4.8, on the other hand, put new life into my box.
Actually, when that happens a great deal of compatibility code can be removed. In theory, just that could boost performance instantly.
Consider also that, as the VMs are tuned for GC, size and speed, the apps are tuned as well, meaning you reduce requirements across the board instead of in one or two apps. With proper profiling support in a JIT or AOT compiler, you would again see a number of benefits from one fix, instead of the rather small advances made by fixing issues on an app-by-app basis today.
When you look at it as “ha, we’ll take C and put it in an interpreted environment,” sure, it sounds slower. When you describe it as “we’re using a large, static library that takes care of a number of speed, performance and security issues for us,” it’s a much better and more honest trade-off.
X windows has always been bloated and slow, it is the culprit. Let’s face it: the community has to ditch this abomination and rewrite the ‘graphical interface’ from scratch.
Recently I installed ASPLinux 9.2 (an FC1-based distro) on a Pentium 233 MMX box with 64 MB RAM. X? Xfce4 feels just _good_, and all the other light WMs (Sawfish etc.) are _very fast_. Understood, I can’t use KDE3 or GNOME2 on such a box. But look: gcc is OK (I’ve compiled a kernel, though not in 8 seconds like they say it takes on a 32-processor SPARC beast), I can browse the web (Firefox), and I can do just about everything people did when a P233 was a high-end dream PC.
Now, after working a while with GNOME/KDE, run ‘top’, press M to sort by memory, and look at, say, the X server’s virtual memory demands. To me, it’s more than impressive. As developers, we have to do something about it.
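If you’d rather script that check than eyeball top, here’s a rough sketch in Python (my own illustrative code, not from any project mentioned here). It reads /proc directly, so it is Linux-only, and the function name is made up for the example:

```python
# Sketch: list processes sorted by resident memory, roughly what
# top shows after you sort by memory. Linux-only: parses /proc.
import os

def top_by_rss(n=5):
    """Return the n largest (rss_kb, name) pairs, biggest first."""
    procs = []
    for pid in os.listdir('/proc'):
        if not pid.isdigit():
            continue
        name, rss = '?', 0
        try:
            with open('/proc/%s/status' % pid) as f:
                for line in f:
                    parts = line.split()
                    if line.startswith('Name:') and len(parts) > 1:
                        name = parts[1]
                    elif line.startswith('VmRSS:'):
                        rss = int(parts[1])  # resident set size, kB
        except OSError:
            continue  # process exited while we were reading it
        procs.append((rss, name))
    return sorted(procs, reverse=True)[:n]

if __name__ == '__main__':
    for rss, name in top_by_rss():
        print('%8d kB  %s' % (rss, name))
```

On a GNOME or KDE desktop of that era, X and the desktop processes would typically sit near the top of this list.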
I installed Fedora Core 1 on my AMD 3000+ with 1 GB RAM recently and it was slow.
This is very sad = /
No, your brain is getting fat!
Why give him Mandrake 10, which was produced a good two years after Windows XP was released? Instead, give him something born around the same time: Mandrake 8, perhaps, Red Hat 7.3 or SUSE 8 would have been a better option, with (perhaps stepping forward for convenience) Firebird instead of Mozilla. Software grows with time, and if you want a fully featured, sexy “today” desktop then Mandrake 10 or Fedora is the answer, but it will cost you speed on a lower-spec machine. I still have SUSE 8 and Red Hat 7.3 CDs sitting here, and I would’ve given them to any pal asking for a recommendation for a P600 128 MB machine. I use SUSE 8 on a similar spec to your friend’s in the office every day; works fine for me 😉
Linux has become bloated. Developers constantly upgrade their computers and make the software for their own systems, so software constantly scales up its bloat in tune with Moore’s Law.
While I do agree that profiling, optimizing and debugging are extremely painstaking, costly and arduous, and hardly come for free, there are distros designed primarily for speed and optimization.
Today, the most popular Linux distro notorious for speed is Gentoo Linux, not Mandrake or Fedora. So if speed and optimization is a concern, you might reconsider selecting a Linux distribution designed with that goal in mind.
On the same note, I really don’t know of many open source projects that incorporate optimization and profiling into their development process. The trend today is to use languages that are convenient, safe and secure at the expense of speed and system-resource efficiency.
The next generation of software, unfortunately, will be extremely resource-inefficient, for excuses such as development time, safety and security (OS X and Longhorn, to mention a few). It’s unfortunate, but that’s reality.
The question of which operating system is least responsive is a matter of perception. Microsoft and Apple spend time and money on visual tricks to deceive users into thinking the system is responsive (when in reality it isn’t). Throughput for throughput, Linux is a speed demon compared to OS X or XP.
Linux and its accompanying open source projects are only beginning to focus on the desktop and GUI after years of focusing on raw throughput, multithreading, scalability and robustness (attributes required for server operating systems).
I do agree that developers need to incorporate the art and science of optimization and profiling into their projects. But if the trends I’m observing are correct, the next generation of software will be incredibly inefficient, resource-wasteful and slow! Even on a supercomputer.
In other words, the next generation of programmers will be Python addicts whose mantra is “Memory is cheap!” (hence, they have every right to abuse your resources) 🙂 I think programs will only get slower, because coders are lazy, don’t care about machines with lower specifications than theirs (a typical programmer working for Red Hat, Sun, Microsoft or Apple has at least 1 GB of RAM and 3 GHz of processing power), and only care about convenience and development time, because, according to them, “hardware is cheap”.
Contrast that with an era when real coders had to fit a whole operating system into 4 kilobytes of RAM. Heck, even Vim (one of the lightest editors on Linux) can’t run in 4 kilobytes of RAM. Moral of my rant: times are changing; adjust and deal with it!
I have been using Fedora Core 2 on my server for about two weeks now, and I must say: it has about 92 MB of RAM and a 300 MHz CPU, but it still runs well. While reinstalling my normal PC I even used it to play games (nothing big, though), and it still went very well.
So no complaints from me.
I just installed VectorLinux on a P233 MHz with 64 MB RAM, and it is actually very usable and not slow at all.
I have yet to see a computer with 128 MB of RAM run Windows XP fine, I mean with anything else installed. I have installed XP on a computer with that much RAM, and it would take forever to load and would not do much. I have tried it with a 300 MHz processor and 320 MB RAM, without much joy. I burned many a coaster making CDs there.
If usable means that things appear on your screen and you can click on them, then XP is usable. But if, like me and the company I work for, you think 128 MB RAM on a 633 MHz PC means they won’t put Windows 2000 on it, then it’s another ball game too.
Everyone knows Fedora is optimised for i686, so if you are running it on a 486, you are out of luck: it runs, or rather crawls. I am perfectly happy with it here. I hardly see any swapping unless I am compiling something. Right now, with Evolution open, typing in Epiphany, a galculator window open, an Inkscape project open and Rhythmbox playing in the background, I am using only 243 MB RAM and 8 KB of swap, basically not swapping. OK, it’s a 2500+ Athlon XP processor, but memory usage is generally in line with Windows XP. And I am pretty eye-candied to boot.
While, theoretically, there is some element of truth in your postulations, the fact still remains that I will need a lot more system resources to run programs written in VM-based/Interpreted languages.
Take Eclipse [1], for example. On a system with 512 MB of RAM, Eclipse is reasonably responsive and seamless to use. Try using Eclipse with anything less than 256 MB of RAM, however, and the challenges of managed environments and applications become painfully apparent.
The fact is, VM-based/interpreted applications are horrendously expensive to use, system-resource-wise. In practice, I doubt a desktop environment based on Mono or Java would be faster or more resource-efficient than the environments we have available today.
I’m quite positive that if all the applications on my desktop were written in Java today, it would render my 1.4 GHz Athlon with 256 MB of RAM obsolete. I already experience problems running just one Java application at a time, let alone if everything I ran were written in Java, or Python, or Mono, or whatever.
[1] Eclipse is an excellent open source integrated development environment written in Java that you can install and use for free.
http://www.eclipse.org
What’s the goal of Linux? Sure, modern distributions tend to demand big computers, but they are intended for such computers. Linux is the possibility of choice, so if you want to use an old computer, don’t try to make KDE/GNOME responsible for that computer being slow; that was never the goal of KDE/GNOME. Try using a different window manager, and stop bugging users with the same useless discussion. Linux will never be interesting for normal users if it isn’t preinstalled and if there aren’t new games for Linux. So…
this all is basically useless.
Hit the nail right on the head.
If we are actually having this discussion, we have a serious problem!
If Linux (OK, some distros) is actually close to rivaling a Microsoft OS as a performance hog, we have a serious problem!
Do the guys working on Gnome/KDE really think about performance when they code? How does the review process work?
I too feel that the bloating has gone too far. I’m not sure whether it’s because of bad software-architecture choices or something wrong at the source level. Maybe a combination. I have seen some horrible open source code: for example, using arrays instead of linked lists when there’s a lot of reorganization of elements is usually not a swell idea.
I think the bottom line is that we coders have been fooled into thinking that everything should be designed for portability, flexibility and reuse. This means we lose performance (not always, but most of the time). If we’re designing a module or application that has a well-defined purpose, why not just design it for that specific purpose?
I think many people feel that their code looks so much better with a bunch of inheritance levels, dynamic run-time memory allocation and, on top of it all, a fancy reuse pattern.
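To make the data-structure point concrete, here is a small illustrative Python sketch of my own (not taken from any of the projects being discussed): repeated insertion at the front of a contiguous array shifts every element each time, while a linked structure does it in constant time.

```python
# Front insertion: a Python list is a contiguous array, so
# insert(0, x) shifts every existing element (O(n) per insert).
# collections.deque is a doubly linked block structure, so
# appendleft() is O(1). Same result, very different cost as n grows.
from collections import deque

def front_inserts_array(n):
    xs = []
    for i in range(n):
        xs.insert(0, i)       # shifts all existing elements each time
    return xs

def front_inserts_deque(n):
    xs = deque()
    for i in range(n):
        xs.appendleft(i)      # constant time per insert
    return list(xs)

# Both produce the same sequence; only the cost differs.
assert front_inserts_array(5) == front_inserts_deque(5) == [4, 3, 2, 1, 0]
```

Time the two with a large n and the quadratic behavior of the array version shows up immediately; that is the kind of mismatch between structure and access pattern the comment above is complaining about.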
“X windows has always been bloated and slow, it is the culprit. Let’s face it: the community has to ditch this abomination and rewrite the ‘graphical interface’ from scratch.”
HAHA! Spoken like somebody who has yet to run X on DirectFB. As far as KDE goes, it is there as much as it can be for now: all of the necessary parts are in place. The 2.6 kernel sped up the entire system, the KDE 3.2.x series offers many speed improvements, and the GTK libs are being cleaned up and will eventually be faster. The only thing we are waiting on is X.org. I am very anxious to see the speed improvements as it develops.
I am an XP user who has heard for years from his friends about how great Linux is. Makes everything better, yada yada. The thing that got to me, though, was the whole distro thing. I have looked into switching, but everywhere I go it says “you must go get KDE/GNOME/WinX/an ice cream sundae in order to have a good experience and be able to use or do anything, but our thingy doesn’t come with this, so you need to go hunt down the right binary for it on some other site that we haven’t linked. But make sure it’s the right binary for the compiler our distro uses, or when compiled H4X0R5 will steal your MHz.” It’s a wonder you all haven’t complained about this before. It’s just about impossible for an average user to get started with Linux. Not only that, but if I switched, half my games and programs would not work, at which point I would ask myself, “why am I using this if I can’t run anything I want to?” You all want choice, and it has ended up with every Joe Friday making his own distro, making it impossible for someone to pick the one that will do everything he wants, because he has to see whether there is a distro that supports it all, and if not he has to learn to compile his own after getting some other limited one. The only way Linux will EVER compete with Windows is the day you simply put the disk in the drive, it installs, and it is ready to go with anything you want. Plug-and-play install, functionality in all programs and games, and speed. Lots of it. “But I want to be able to customize it.” So put stuff in that allows for customizing, but don’t make whole new distros based on new user settings and crap. If KDE and GNOME are required these days, mix them into the OS or something, so that people don’t have to figure out why they can’t do anything on this “better” OS. If I want a free OS that works and does what I want, I can get XP off the P2P or from a friend.
Until the day you turn it on and it just goes and does what it is supposed to, it will always be lacking. Crap, get the XP Pro source, make a secure OS from it that works better than the actual XP, and we will all be set.
Amen!
I’ve been noticing this for years, especially with KDE (mainly because that’s what I used to use, not that anything else is any better). I loved KDE 1… KDE 2 wasn’t bad… now KDE 3 is out, and it’s a dog sometimes. I switched to Fluxbox for a while, and just recently went back to KDE, but it’s beginning to get to me again and I think I hear Fluxbox calling.
Seriously, apps are getting bigger and bigger and bigger. I’ve often wondered if software writers get kickbacks from hardware manufacturers to keep requirements up so the hardware makers can keep selling new machines.
As in, “The proggy is very developed, the algorithms are blazing fast, the GUI looks good, it’s been tuned, tweaked, and chopped, so it only uses half the space it used to, and it can do twice as much as it used to!”
That used to be the mantra of a whole generation of coders. I guess they’re all dead now. My XP box isn’t big or fast by today’s standards, but it’s the best I can do. It isn’t much faster to use than the first 4.77 MHz XT I put together 20 years ago. It’s a lot prettier, and it does have some capabilities that just weren’t available 20 years ago (due to cheap hardware, not any great advancement in programming skills).
Menuet is refined: the entire install fits on a SINGLE FLOPPY DISK, not a couple of CDs or DVDs. QNX also fits into this category of “refinement”. OpenBEOS looks like it’s trying for this too.
Linux used to be for old machines, for people that couldn’t afford any better, for students, for people with limited resources. Now it looks like all the Linux authors have learned their lessons from Microsoft exceedingly well. I guess the poor will just have to drop back to using FreeDOS, since their machines won’t run “up-to-date, supported” versions of Linux.
*Sarcasm* I’m sure Linus would be proud. *End Sarcasm*
In no particular order:
Internationalization (non-ASCII locales).
Anti-aliased fonts.
Smooth scrolling effects.
Too-fancy themes and windows.
Unneeded monitoring daemons and the like (mdmonitor, portmap).
Journaling filesystems.
But it’s well known that a lot of work has gone into supporting non-ASCII, non-English environments, and one effect of that has been new bugs, distraction from optimization, and the slowing down of tools like grep, sed and awk, which used to be fast. I’m sure it’ll improve, but this is breaking new ground. Believe it or not, newer OSes are tackling more, and bigger, issues.
Having said that, I’ll stick with my now-ancient Mandrake 8.0 on my workstation for now; everything feels faster. On servers without X, the newer GNU/Linux releases are excellent.
I don’t agree with this article at all. My brother and I have the exact same machine; I run FC2 and he uses XP. I have yet to have any kind of crash, and rarely does anything ever slow down. I can’t say the same for when I head over to my brother’s computer: IE crashes left and right, the Start menu is terribly slow, he has to reboot constantly, etc. Sure, Fedora might not be right for your 233 MHz computer from 1995, but that’s where Slack comes in nicely. And if Linux won’t run speedily on it, I’m positive XP won’t either.
We can have the features as well as the speed, but we need an environment that makes efficiency more transparent to the developers.
We need an X replacement which includes a standard and efficient toolkit, network transparency as fast as MS remote desktop, and local efficiency that’s good enough to make all the embedded graphical layers redundant.
Linux has potential not only for older PCs but for an entirely new market: handhelds. If the embedded machines run the same graphics layer/windowing environment as the desktops, we unleash whole new possibilities.
It is possible.
Try Windows XP Embedded.
A customizable, module-based OS: just like Linux, pick what you want, and that’s what you get.
Don’t want the IE binaries and DLL support, or even the Explorer shell? Don’t pick it; put in only what you want.
It’s amazing how fast this is.
I set up a computer for my grandmother with 64 MB RAM and a 300 MHz Celeron. A total crap machine, but it runs Windows XP Embedded, with the Explorer shell, IE and Office 97, very well. I peeled out a ton of other stuff, like sound support, since it has no sound card and she just needs a basic word processor, email and internet. It has support for the modem that’s in there, and it has a software firewall. I built in CA eTrust InoculateIT 6 as well, in incoming-only mode, to enhance performance while providing decent AV.
It boots in about 17 seconds to the desktop, with all themes etc. disabled. Word 97 launches very fast (5 s), and so do IE and her webmail (3 s).
I’ve tweaked the memory management in the registry, set up the pagefile at the beginning of the disk on its own partition, and applied some other general XP tweaks.
Did I mention it uses about 350 MB of hard drive space for everything? I have an old 1.2 GB Caviar drive in there.
I daresay it was faster and easier, and you end up with a better result, than if I had tried to go with Linux.
Also, I can walk her through, over the phone, some simple clicking around to change settings in a GUI, rather than explaining how to type exact console commands that are case-sensitive, etc.
Even with more hot features in KDE and GNOME and the GUI in general, I see all these visual improvements as extras. They are eye candy; they do not add to the productivity of the environment. With all the silly features disabled, a future KDE should, in an ideal world, be as fast as KDE now or KDE yesterday, or faster through development.
I agree with all the stuff in the article. The big Linux-desktop-on-all-computers-out-there dream will burst quite easily if the Linux experience is slower and therefore less productive. Linux might be getting far in markets where the kernel’s qualities and capabilities are of importance (my guess; I’m a newbie), but the main desktops (GNOME, KDE) must be addressed soon. As Win98 support has ended, people are going to be looking for something cheap and fast as a replacement, and I would hope for it to be a Linux distro which can please the user and might even make them feel proud to be running Linux. And of course Longhorn is coming some time in the future, so that open source software can spread to desktops: offices and other non-geek, non-experienced places.
Maybe a new .org will spring up dealing with Linux GUI responsiveness…
After all this moaning: open source is a great idea and it has all come very far. I just hope it doesn’t derail due to too much enthusiasm for more apps to bombard Windows with. Quality, not quantity; there is a big enough choice, so now on to the quality.
GO LINUX 🙂
How about an adaptive user environment: automatically different settings and optimizations on different systems, with the GUI adapting to the system’s abilities?
But then, I’m lazy and love suggesting stuff without doing much myself.
xlynx, before talking up the excellence of Remote Desktop, try NX. It beats RDP flat. http://nomachine.com/
“I have looked into switching, but everywhere I go it says you must go get KDE/GNOME/WinX/an ice cream sundae in order to have a good experience and be able to use or do anything, but our thingy doesn’t come with this, so you need to go hunt down the right binary for it on some other site that we haven’t linked.”
This is completely false. ALL modern Linux distributions come with X/KDE/GNOME and everything you need for a fully functional GUI, and it gets installed automatically, too. You mustn’t have looked very hard…
And to those who say that X is slow: that’s not true at all. X is a great protocol. Some people feel the UI is not quite as responsive as Windows because X lacks desktop double-buffering, but this feature is coming pretty fast, with the compositing package developed by the fine folks at freedesktop.org.
On my 900 MHz Athlon, KDE on X is quite snappy.
I have run FC2 on machines with 256MB of RAM or less with no noticeable lag. Here's how:
1. When you do the install, do not use the graphical installer. Do the install with the text-only switch from the installer prompt; you get the older ncurses-based Red Hat installer, which still works fine and uses less RAM.
2. Never do a "desktop" install; always choose either "server" or "custom". This grants more leeway as to what packages get installed.
3. Your swap partition does not need to be as large as the installer says. Linux does need swap, and by common practice the swap partition should be at least the same size as physical RAM, but it does not have to be. Most of my systems run FC2 just fine with between 80 and 150MB of swap, and I have seen large swap partitions slow things way down. I do not know why, but it does happen.
4. I personally have never gotten GNOME to run at acceptable speeds on any PC regardless of specs, but I have gotten KDE (even KDE 3.2) to run fine by being careful with the bells-and-whistles slider when KDE runs for the first time: on a low-spec system, set the slider for fewer bells and whistles.
5. After the install, some GNOME junk will get installed anyway. Ruthlessly go through and remove it with rpm -e --nodeps gnome*. Leave the GTK and GTK+ libraries, though; you will need those later on.
6. Manually edit your .xinitrc file to run startkde. If you do not do this, Red Hat by default tries to start some GNOME stuff even if it has been removed.
I have done these steps even on a PC with 128MB of RAM, and KDE still worked fine.
7. For test purposes, install at least one light window manager, such as IceWM, Window Maker, Fluxbox or twm.
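As a concrete sketch of step 6: a minimal ~/.xinitrc (assuming KDE 3's startkde script is on your PATH; adjust if your distro names it differently) might look like this:

```shell
#!/bin/sh
# Minimal ~/.xinitrc: exec KDE directly, so the distro's default
# session scripts never get a chance to start any GNOME components.
exec startkde
```

X reads ~/.xinitrc when launched via startx; because exec replaces the shell with KDE itself, logging out of KDE also ends the X session.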
Recommendations?
Jeremy Friesner wrote “Any recommendations regarding distros to try for this?”
The folks at dynebolic will make a custom live CD distro for a very reasonable price:
http://dynebolic.dyne.org/index.php?show=cust
and my two cents on the article: He’s right that the main distros are getting fat. But there are some light distros like dynebolic which are pretty cool and light on hardware.
It's not the programmers' fault. Each of us who codes for a living knows the truth: it's incredibly boring to wait forever until the code has compiled, so in order to gain productivity in the work chain (code, compile, debug, test) there are only four options:
1. code faster
2. compile faster
3. debug faster
4. get a lot of bug reports, then go to 1
Point 4 is easy: release often (the OSS mantra).
Point 3 too: forget about memory management (Mono, Java, Python, Perl+GTK…), use widely-used libraries (glib, whatever), and forget about hand-designing clean GUI code (use tools like Glade or Eclipse).
Point 1 is hard: we have only two hands and 24 hours in a day, and some of us have a life.
Point 2 is easy: buy a lot of RAM, a new Athlon 3600+, a faster hard drive… and the whole chain is accelerated, so productivity increases tenfold.
Why is that? Because in the open source world there is no money in selling programs; the money is in support and other things, so development has to happen at the lowest possible cost to keep going. It's a vicious circle: everything is free, so there's no money to spend hours optimizing stuff. And for what reason? Every six months, speed doubles.
For the third world? We, via our governments, force them to borrow money from the World Bank to pay for the machines they will use, so they give it right back to us.
For students? In Thailand, the cheap Linux computer has managed to increase the copying of illegal Windows versions by a factor of 100 in six months. In the real world, only geeks care about Linux.
The only, only reason to switch to Linux is to get away from viruses and the like, and it's just a matter of time before we get screwed by that too.
On all my machines (P4 1.8GHz/512MB, P4 2.8GHz/512MB, K7 500MHz/320MB), Windows 98 and Windows 2000 are snappier. Further, you can run more applications on them with less memory compared to KDE 3.2/GNOME 2.6 (kernel 2.6).
I really love LINUX, so even if it is not snappy, I still use it now. Yup, LINUX distros are getting better: Mandrake 7 to 10, Red Hat 5 to 8, kernel 2.4 to 2.6. But I hope we could make it snappier than Windows. If we do this, in my opinion, we will kill Windows regardless of current LINUX application states.
So please work on the performance areas. I believe in the open source community.
But I hope we could make it snappier than Windows.
X development is quite dynamic these days. I believe that the XFree86 implosion has given it a new momentum, and there’s lots of exciting stuff on that end. On the DE front, I know that KDE keeps improving in performance (can’t speak for Gnome, I personally don’t use it). Anyway, expect a snappier Linux desktop experience soon.
If we do this, in my opinion, we will kill Windows regardless of current LINUX application states.
Well, I don’t think that’s going to change things that much, but it will certainly help. I know I’ll enjoy it (though I really can’t complain now).
Further, you can run more applications on them with less memory compared to KDE 3.2/GNOME 2.6 (kernel 2.6).
That's not quite true, actually. Memory usage works differently on Linux and Windows. Sometimes it may seem as though more memory is taken, but that doesn't mean all that memory is unavailable: a lot of it is cached data, kept in memory in case it's needed and flushed if the memory is required for something else.
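As a rough illustration of that point (the field names follow /proc/meminfo, but the values here are invented, not measured on any real machine), the memory Linux can actually hand to applications is closer to free plus buffers plus cache than to the raw "free" figure:

```shell
# Illustrative /proc/meminfo-style figures in kB (invented values).
MemTotal=262144; MemFree=12288; Buffers=24576; Cached=98304

# The kernel can reclaim buffer and page cache on demand, so the
# memory effectively available to new applications is:
available=$((MemFree + Buffers + Cached))
echo "effectively available: ${available} kB"          # 135168 kB
echo "genuinely in use: $((MemTotal - available)) kB"  # 126976 kB
```

This is why a Linux box that looks "almost out of memory" in a naive reading can still launch more applications without swapping.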
So please work on the performance areas. I believe in the open source community.
Keep the faith!
Everybody has an opinion on this.
Mine is that this is the fault of distros trying to be too much (just like MS does). GNOME and KDE are both too bloated, so don't use them.
I use IceWM, Firefox and Thunderbird (probably a bit memory-hungry).
I bypass the desktop environment; for a file manager I use Midnight Commander.
Now, not everybody is like me, so…
ROX + Firefox + Thunderbird + IceWM - too_many_daemons = runs_well_on_older_box
I think GTK2 needs some speedups too.
Using Linux these days gives the impression that your fast machine is already obsolete and left behind. Boot times, application execution and memory usage have become extremely terrible, and I started to notice this with Red Hat 6.0.
That is the article I really wanted to write myself, something I always wanted to put out but never knew how. Thankfully, Bob Marr did it.
It's funny, but I'm writing this at this very moment on a Pentium 200 MMX with only 160MiB of RAM, running KDE *and* compiling something, and getting decent performance.
Using Linux these days gives the impression that your fast machine is already obsolete and left behind.
I don't get this with my Athlon 900; to me it feels pretty fast. Not quite as fast as the Pentium 1.8 I've got at work, but really, the difference is negligible (unless I'm compiling; that takes less time on the Pentium).
Boot times, application execution and memory usage have become extremely terrible
Boot times are comparable to Windows if you measure both from power-on to the time when the OS has finished loading. MS does a nice trick of giving you a GUI quickly while it continues to load the system in the background. But is a few seconds here really worth all the hoopla? How often do you reboot? Looking at my current uptimes: 46 days for the server (power outage) and 5 days for the workstation (new kernel). Maybe Windows gives you a GUI, what, 10 seconds faster? Does that really make such a difference over such long periods of time?
I'm not sure what you mean by "application execution", but clearly you don't understand Linux memory usage. Just because memory is shown as used doesn't mean it's unavailable; a lot of it holds cache data that is flushed if the memory is needed.
and I started to notice this with Red Hat 6.0.
Actually, in KDE's case, performance has increased with KDE 3.2. Those guys are doing a lot of optimization.
We all know that Linux is a great OS for servers; that is its market for now. The old-day gurus came up with pretty good solutions and a great philosophy: KISS. But when we talk about Linux as a desktop system, we see another philosophy. Developers aren't setting their priorities like in the past: instead of focusing on resources, they focus on the user. Priority no. 1 must be efficiency, quality and simplicity. Each program does one thing, but does it well. Then, when everything is working really great, start adding fancy non-productive features.
ROX is great. I recommend ROX for any machine where Nautilus/Konqueror is too slow. It has most features a file manager needs…
I know it's been said, but I just want to point out a misperception: with GNU/Linux, the perception of speed in the DEs is not affected by CPU cycles on any machine above 233MHz; it's affected by available RAM. I am running a Celeron 400 with 512MB and it hums along quite nicely. I even noticed a significant "speed" increase UPGRADING from Mandrake 9.2 to Mandrake 10. If you have enough RAM, the optimizations in KDE and the 2.6-series kernel are really noticeable. Yes, the RAM footprint is larger, but I wouldn't hesitate to run KDE or GNOME on an older system as long as I had at least 384MB of RAM.
This is exactly why I use FreeBSD whenever possible. It is as lean and clean as you like. FreeBSD puts everything great about UNIX into a powerful and consistent package. And the ports system is miraculous.
Wow! Just wow.
I can run Linux quite comfortably on an old laptop with a P2 266 and 96MB of RAM, running KDE 3.1 and the old 2.4 kernel. I can run OpenOffice.org just fine!
I can have all the themes I want, and no viruses, spyware or stability issues!
But hey, Linux is huge, right? Get real! Have you guys looked at Longhorn? The thing needs gigs of RAM!
Even XP is damn slow unless you have over 500MB of RAM, and so is Win2k3!
More FUD from the M$ camp!
I'm running Mandrake 10.0 dual-boot with XP on four desktop PCs with the GNOME desktop, and I really don't notice any performance loss vs. XP on the Linux desktops. At work, where my Mandrake 10 desktop is the only Linux box on an 18-station Microsoft 2000 network, the Mandrake is blazingly fast compared to XP, which is horribly bogged down by McAfee's antivirus program running in the background on all the Windows desktops.
I'm also running Mandrake 9.1 on an ancient Sony Vaio laptop with a Pentium 366 and 128MB of memory. It works better than fine. Recently, at a conference where the hotel offered free wireless, my Vaio with an Orinoco card worked flawlessly; five out of six colleagues with XP laptops couldn't connect.
OpenOffice, granted, is a bit slow compared to Office, but it really is good enough once it finally opens. Evolution is infinitely faster and more intelligently designed than Outlook. Two other programs that I use to help put food on the table, Quanta and gFTP, are better than HomeSite and CuteFTP, their Windows counterparts.
K3b is as easy to use and as reliable for burning CDs, including images, as Roxio. Xine, Totem and MPlayer all work fine for multimedia.
When it comes to doing stuff in Windows that I can’t do in Linux, the list used to be horribly long, and now it’s pretty much down to Quicken, TurboTax, my USB flash key and my camera’s SmartMedia card. And all this stuff that does work does so without any hacking, which I am not very good at outside of PHP.
As systems grow more user-friendly, the engine under the hood gets more bloated. That's a tradeoff I'm more than happy to accept. With AMD 2800 or 2.8GHz Celeron machines available for $400 and change, how well some distros run on hardware more ancient than my old Vaio is not a critical issue.
OS X barely works with 128MB of RAM; try multitasking (I own mostly Macs, plus two PCs).
As someone noted, Win2k and WinXP barely run with that amount of RAM.
What's all the hubbub about? A program… using… RAM?
Get real.
Well, the fact that you have failed to run Windows XP with less than 500MB of RAM doesn't contradict the fact that others do run it comfortably on 256MB. I did run XP on 256MB, so I know it's possible. Perhaps you have better Linux skills than Windows skills, or your hardware fits Linux better. But please, don't generalize.
I assumed you didn’t lie.
Bob Marr says: “[Havoc] may have talent in writing a lot of code quickly, but it’s not good code”.
Bob Marr, you will note, “is a sysadmin and tech writer, and has used Linux”. Oy.
I think we all know exactly how good Bob Marr is at writing code.
Well, the great thing about open source is that it scales very well; bullshit just seems to walk.
I'm personally waiting for Marr's highly esteemed report on the matter; hopefully he can provide us with some patches for the codebase as well.
Read this: http://log.ometer.com/2004-06.html#10
I’m sure we’ll get a patch from this OS News guy who knows exactly where the bottlenecks are and how to provide equivalent features in 1/10 the memory, due to his extensive profiling of GNOME, detailed architecture review, and overall coding skills. –Havoc Pennington
Linux would-be “journalists” make sensationalist headlines just to get click-throughs. These articles have less and less meat to them. Often they have none at all. They cry about the sorry state of affairs, but don’t say what to do about it. Often they’re written by people who know nothing more about computers than where the power switch is; rarely are they written by people with any programming experience at all.
Linux articles need to get better! Because they suck! You need to write better! It’s bad, bad, bad, and we can’t let Microsoft win! And … yeah! Fix it!
Personally, I agree with the author's view. But what makes the Linux platform slow? The kernel? XFree86? KDE/GNOME? Apps like OpenOffice? Or some combination of them?
Then, what can be done to improve performance?
While I agree about the growing size of the two big DEs, I disagree totally about performance: GNOME 2.4 and KDE 3.2 run much better today on my Celeron 566 than their ancestors did three years ago, which were unusable on my machine. My computer hasn't changed in three years.
The author makes some good points about the bloat of the latest RH and Mandy offerings; in fact, that's why I switched to Gentoo a year ago. But let's look at what's really slowing the machine down: it seems to me it's the exorbitant number of services RH feels compelled to enable by default. Strip out all the cruft and you're left with a fairly decent box. X is not the problem, and neither is KDE (can't speak for GNOME 'cos I don't use it…)
As for replacing the OS on all the 95/98 boxes, presumably a company would hire an in-house or outsourced Linux professional to take care of the installation and tweaking for them.
The better issue is why so many linux users feel the need to ‘sell’ linux to others. Sure some big players such as IBM try to sell linux as the greatest thing since sliced bread, but that is not (nor ever was) the goal of Linus Torvalds, and a large majority of OS software developers. They just wanted to have their ‘own’ system to work and play with, rather than having the software they use dictated by a monopoly.
Is it just me that’s content to run my linux box and feel silently superior without pushing it on every windows user?
I’ve been using Linux for the past year or two and I have to agree that there aren’t that many benefits to using it in place of Windows. It’s a fun little hobby OS, but I think I’m gonna switch back. I’ll still keep it around, but I need to get some work done now. Maybe I’ll play around with a Macintosh in the future.