Amazing is the recent interest in full, live operating systems that can fit on a 50 MB CD-ROM. It’s totally astounding that they can cram so much onto such a tiny disc. But wait… let’s run back to the days of old… back to, say, 1988.
Note: This editorial is meant for discussion and rebuttal if any is to be had. I’m not here to pick on an OS in general. I’m here mostly because I don’t have the answers and I’m hoping that the smarter people out there can give me the answers I seek.
It’s an amazing and exciting time in computers here in 1988. Apple’s Lisa has caused a stir. Atari has a system that appeals to the graphics-hungry, as does a little company called Commodore with their Amiga. Hard drives are swelling to almost 20 MB in size, if you want to spend a couple grand, and there’s a new medium out that people think is so much better than those flimsy 5.25″ disks. Soon the world will rattle with the sound of 3.5″ 800 KB floppy disks being filtered through someone’s fingers like large square poker chips. Intel has the 386 chip out, and someone has broken the 640K memory barrier in real mode. It’s just amazing what computers are doing.
I’ll use the Amiga for a moment, since I owned one and know its internals well enough to speak with some authority. This machine had 1 MB of memory, half of it directly accessible to the graphics chip. I used 2 x 3.5″ floppies and hadn’t a hope of having the cash to buy a hard drive at the time. But I had a pre-emptive multitasking operating system, with a modern graphical user interface (GUI, for those that need it), that I could use to play games, write documents, ray trace, paint, do database stuff and spreadsheets, and even use a modem to dial up a BBS. I actually used it to access CompuServe for the longest time, because I’d learned the command-line commands on CompuServe long before it went GUI.
The operating system was easily as stable as Windows or the Mac’s OS at the time, which means it usually crashed at the most important moments: like 30 seconds before your 6-hour ray trace finished and would have saved the last bit of the file that would have made it readable. But such is life on the bleeding edge of technology.
Now we’ll zip ahead a bit, to 1994. Computers are expensive compared to today, but they’re still exciting. IBM-compatible computers, now generically referred to as PCs and PC clones, are the bulk of the computers in people’s homes and offices. Intel’s 486 has been the reigning king for almost 2 years as the primary engine of domestic computing, but the Pentium is ready to hit the streets and tear up the benchmarks we’ve all grown so used to. Windows 3.1 is the primary OS of choice, even though it crashes at a blink, has a clunky interface and needs at least 4 MB of RAM to run well; 16 MB will get you lots of zippiness if you have the bucks. It also needs a computer with a hard drive, though drives of 60-100 MB can now run you as little as $200.00. It’s still a lousy platform for games, and everyone still uses DOS to play theirs.
From the depths of Geekdom and its techno-rage crowd comes a cry hardly heard by the masses, save by those who still hope there is a better way: people who got hooked on multitasking. Multitasking like UNIX does. Like the Amiga did. That cry was a little-known UNIX operating system, FreeBSD, and its UNIX-like brethren, Linux.
What was that cry? I hear you ask. The cry was simply this: Windows is a slow, bloated pig of an operating system that can’t multitask to save its own life. This UNIX operating system (FreeBSD, Linux, NetBSD) can do what Windows does; look, we put X Windows on it, so you have a graphical interface too. The fact that you could download these operating systems via your 14.4k or speedy 28.8k modem was another plus: you didn’t have to pay for them. But really, at that point, being free wasn’t such an issue. A person could usually find Windows 3.1 and MS-DOS 6-something lying around the house, and piracy wasn’t really thick in the minds of lawmakers.
What mattered most about these early multitasking operating systems was that they made the most out of what you had, and they were more stable than anything out there. Computers running for months on end without a reboot was the norm, compared to the daily, sometimes hourly, reboots of Windows. And of course the real price for using these operating systems was that you had to sit, read and learn the operating system and how it interacted with your computer. It had a high tech-knowledge price, compared to the click-knowledge of Windows.
I’ve spent all this time dredging up history to rush you to my problem with where we sit today in the open source world, and in operating systems in general. Ready… set… point: can you imagine running a modern UNIX operating system in just 16 MB of RAM? I’ll go one better: 32 MB of RAM and a modern UNIX system with GNOME or KDE running as your GUI.
Windows XP SP1 boots and runs needing, on average, around 130-180 MB of RAM just to run smoothly by itself. Linux with KDE needs around 80-140 MB. From a memory perspective, the two aren’t really all that far apart. Cut down the auxiliary services and things, and you can get XP down to around 80 MB of RAM.
Code bloat has become the norm now that memory and hard drive space are reasonably cheap. But that same code bloat also slows down the overall speed of the operating system, since that much more data has to flow through the processor and RAM. ‘But!’ (you cry) ‘Look at all the wonderful capabilities you have in the desktop. Stuff almost just like Windows.’
‘Just like Windows’… ironic, isn’t it? They say all too often that in war, you become what you hate.
To that end, people in OSS have for the last 6 years or so really worked hard to try to make something to replace Windows on people’s computers. Something free, made with their own hands, and something that will always be available to the masses who simply want their little box of plastic, silicon, copper and steel to do something for them they think they need.
Is there any way we can reduce the memory footprint of what we build and use, to improve our UNIX of choice overall? Our desktop, our graphical user interface? I remember ages ago when I had a 1 GB hard drive and fit Windows, Slackware and FreeBSD on it, all with room to do what I wanted, and GUIs on every one. Today, if you’ve only got a GB of disk space, you can shoehorn in any one of these, but there won’t be much room for anything else. Does this mean we are better off? Are we actually doing more with all the space this bloat takes up? Or have we just become obsessed with doing something like someone else… cough… MS… cough. Or doing something better than someone else.
In my humble opinion, innovation was the most significant hallmark of OSS: doing something that was smarter, better, more efficient than the big guy. We still have some astounding innovators, don’t get me wrong. Fabrice is an amazing example: FFmpeg, TTC, QEMU, just as a sample. Blender is another project that’s just amazing. Its code is small, but its capability is HUGE! Audacity, slirp (the original userland NAT/PPP/SLIP emulator). These are probably the most amazing projects I’ve seen in the last 8 years of working with computers.
OpenOffice is a great example of duplicating effort for the sole purpose of being free. It’s not really much smaller than Microsoft’s Office 2000, and in general it matches most of Office’s functions pretty closely. OSS advocates say it’s great because it gives you a choice and it’s free. I say it’s still as big a resource hog as Office.
Two years ago people whined, ‘Mozilla is sooo huge! Let’s do something and cut out all this code bloat.’ Now people tout Firefox because it’s small, fast and free. But when you look at the results of this anti-bloat-driven craze, you realize that when you group Firefox, Thunderbird and Sunbird together, you get about the same code size as Mozilla had previously. Here we’ve divvied up functionality to try to be faster, smaller, better, but not really innovative.
Here are some questions I have for you to ponder, answer or gloss over:
- How small could we make a functional desktop with Microsoft Windows-like features, in memory footprint and file size? It doesn’t have to have every bell and whistle. It just needs to be intuitive and usable.
- What would be the real losses, functionality-wise, if someone built an office suite that only needed 10 MB (this is completely arbitrary) of disk space but interoperated, file-wise, with Microsoft’s Office?
- Could we really accomplish much of what we do today on our UNIX desktops on a machine that only had a 486 and 32 MB of RAM?
- Are these desktops and applications huge because we have the room, or because it couldn’t be done with smaller, better code?
- Would we see any real-world advantage by making the desktop, applications and toolkits more compact and efficient? Understand that ‘real world’ encompasses development time too. So if it took 3 months to code something tight that someone could write in huge code in 2 weeks, clearly we’d be looking at a disadvantage.
In closing: to anyone who wishes to comment on what I’ve said here, or who wants to answer the questions, I thank you sincerely for responding, no matter your view. I feel strongly that the soul of OSS is in hock for the sake of beating the other team by supplantation, instead of being better than the other team through innovation.
Maybe I’m on this little rant today because I remember the days when I wailed piteously as my TRS-80 Model I Level II machine, with its 4K of RAM, complained so often that I’d run out of room for the program I’d spent 10 hours writing, because I had no room left for strings when I typed RUN.
Till next time, if there is a next time. Thank you for your time. Sincerely,
Davon Shire.
If you would like to see your thoughts or experiences with technology published, please consider writing an article for OSNews.
First post.
Your descriptions of 1988 sounded a bit out of sync. 1983-4 was more like it.
Lots of typos and construction errors.
Ah, nostalgia. My first computer was a TRS-80. Later, I could be found playing solitaire on a 286 with a monochrome monitor. Sure, I yearn for the simpler old days.
We need to face reality. Neither Linux nor Windows is suited for old or skimpy hardware. They both need the equivalent power of a mainframe; that is what modern PCs offer, and that is why both are now excellent choices. Code and features have bloated because rewriting is expensive while memory and storage are cheap – economics rather than laziness. If you have taught your students to value “reusable” code, you have helped cause the bloat, but it was the right thing to do.
Didn’t I just read this article 3 weeks ago?
http://osnews.com/comment.php?news_id=9637
http://developers.slashdot.org/article.pl?sid=05/02/07/1916205&tid=…
Look at fluxbox, XFCE, etc. for examples of open source GUIs/WMs that are lightweight and fast. I guarantee you, if you set up a Linux machine with just fluxbox, Firefox, Gaim, Thunderbird, and whatever other apps you needed, not only would it fit in 1 GB, but it’d be much faster and better than your old stuff.
Linux/OSS is about choice. That said, I do agree with SOMETHING this author has to say.
OpenOffice.org is, in my opinion, a good replacement for MS Office, but it definitely isn’t innovative. He’s perfectly right to point that out. AbiWord is a beautifully done word processor, but even that isn’t _too_ innovative (though, I have to be honest, when it comes to which Word Processor I prefer, AbiWord is #1). Apple, meanwhile, a pretty closed-source and proprietary company, has at least come out with Pages, which looks like “something else” in the Word Processor arena.
I think one of the nice things about open source software’s history is being innovative. For example, say what you will about LaTeX (it’s harder to use than a word processor, etc. etc.), but it definitely is an innovative piece of software. With LaTeX, you can typeset beautiful books and articles in a PLAIN TEXT EDITOR and LaTeX does most of the work for you in making layout look good and clean. Want to automatically syntax highlight a section of code in your article? There’s a LaTeX package for that, so that all you do is paste the code, and line numbers and syntax highlighting are taken care of.
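For anyone who hasn’t seen LaTeX source, here is a minimal sketch of what that looks like, assuming the listings package (one of several packages that handle code):

    \documentclass{article}
    \usepackage{listings}   % provides the lstlisting environment
    \begin{document}
    The routine under discussion:
    % language, line numbers and a frame are all one-line options
    \begin{lstlisting}[language=C, numbers=left, frame=single]
    int add(int a, int b)
    {
        return a + b;
    }
    \end{lstlisting}
    \end{document}

You write plain text, and the layout, numbering and highlighting are worked out for you at compile time.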
I recently had to write a 50-page guide documenting how to maintain some pretty complex system I put together for a client, so that another developer could take over my job. And I simply couldn’t have produced as beautiful a document, as clear a document, in as short a time without LaTeX. If I had used MS Word or OpenOffice, I would have spent 50% of my time formatting it, and the other 50% actually writing it. With LaTeX, I could focus on the writing.
The other thing about OSS is that programs have a well-defined interface, which has traditionally been the command line. One thing I notice, and that others notice too, is that with the rise of the “Linux desktop”, this well-defined interface is disappearing. It’s hard to attach two GUI programs to each other via a “pipe” or something like it. Some developers expose an interface (for example, Evolution’s new eplugins framework is a good effort), and that’s a good step, but I think to be innovative we really need to start rethinking how to make different GUI applications work together the same way old-style UNIX command-line tools used to.
Once that happens, then the UNIX desktop (GNOME/KDE or whatever) can finally be _scriptable_, and then some more innovation can happen when developers find out clever ways of combining two or more desktop applications.
Well, for one thing, on my old PCs at home, I install a very “base” Gentoo, and then install KDrive (i.e. Xvesa). It runs in only about 7 MB of memory, as opposed to 16+ MB for a full X server.
However, once you want to start running apps in the X server, you run into problems. Both Firefox and Opera take up about 20 MB of memory. So, when running the browser, with KDrive, no window manager, and a couple of other services, I’m pushing 64 MB. I could get rid of some of those services, and probably will, but all I need running under X is the browser anyway.
1st PC was an RS w/16k RAM, I hand typed in my code of choice every reboot!
1988 and I had my beloved Amiga 2000, 3 FDDs… finally paid $$$ for a 100 MB SCSI HD and 8 MB RAM… but wasn’t it great to have a GUI & shell, multitasking and raytracing (off a floppy!) when the IBM-PC world was stuck in barbaric DOS/Win3.1
Back in 1993… Novell 3.1 server running on a 486 w/16MB RAM, 4GB HD!
After sadly ditching my Amiga after 12 years, I got into BEOS, and alas I was happy (for a while)… I loved the 15 sec boot time! Greatest OS!
Win2003 server boots fairly quick too, but then again it would with dual 3.6GHz CPU’s and 4GB RAM!
Here’s a Windows 98 installation occupying 8MB of hard drive space: http://embeddingwindows.com/8mb_windows98.html
It seems that people want software that [in no particular order]:
– Is free as in Beer
– Is free as in speech
– Is full featured
– Is fast [loading & running]
– Is bug free
– Requires & consumes very little memory
– Is Really cool looking
And they want it now.
Have you checked out KDE’s DCOP?
http://www-106.ibm.com/developerworks/linux/library/l-dcop/index.ht…
That makes the desktop pretty scriptable… 🙂
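For a taste, DCOP can even be driven from a plain shell prompt; a rough sketch, assuming a KDE session with Klipper running (object and method names vary by version):

    # list DCOP-aware applications in the running session
    dcop
    # read and set the clipboard through Klipper's DCOP interface
    dcop klipper klipper getClipboardContents
    dcop klipper klipper setClipboardContents "hello from the shell"

String a few of these together in a script and two desktop apps are suddenly talking to each other.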
Oh man, you *really* need to read Joel on Software: http://www.joelonsoftware.com
Bloated? WHAT bloated?
– Software today does lots of things it didn’t 20 years ago
– Hardware is soooo cheap
– Developer time is sooooo expensive (even in the FOSS world: just think of “scarce” instead of “expensive”)
It just does not make sense to waste a lot of time optimizing. Waste your time developing for systems like the ones you had in 1985 and you will finish like Lotus with 1-2-3.
The reasons for code bloat are typically not visible to the user. Cross platform libraries, OO programming, etc…
The scaffolding that is holding these large projects together is what is causing the bloat. You need the scaffolding because it is the work of disparate teams. There are better ways to do it, sure. But it is the development process, and the code scaffolding required to make it work, that is bloating our code.
This is even more the case in OSS because the people doing the programming are from all over the world, with different skill sets.
I use DSL (Damn Small Linux) on a 512 MB USB memory stick. I have plenty of room left on the stick, and it runs fast on just about any computer. It’s all a matter of preference. Some people want an OS with pretty colors, MP3 player integration, and all sorts of GUIs to make it easy to do things; other people want small and fast and do most of their work in a terminal window. The majority of the population want the bloated GUI OS; us geeks who want small and fast, because we know what we’re doing, are in the minority.
Well, as for this article, although it was a bit extreme, I feel the same way. We sometimes lose sight of how much bloat there really is. I’d like to show you something; some of you have already seen it, I imagine.
A current game right now may take anywhere from 500 MB to 800 MB, or even over a GB, for a full install. Most game demos run anywhere from 200-400 MB. What happens when somebody sets a goal for themselves and tries to attain it? This is what: http://www.theprodukkt.com/kkrieger.html
Now, that game demo is 96k with the readme.txt. Remember when games used to be that small? Only this one has real-time lighting and shadowing, colour blooms, bumpmapping, and all the other current video game features.
The people that made it had a goal and succeeded in making it. They could have just said, “Well, we’ll use this format to make the 3D models in, people have 120 GB HD’s now…they can handle it”, or maybe “Well, we have broadband, so if some people still use dial-up that isn’t our fault”. Why should we support that kind of thinking?
If anybody has ever tried MenuetOS, that OS with a GUI that fits on a floppy, they know what I’m talking about when I say that if people set a goal they can do miracles. In kkrieger’s case, they used scripts to generate the models and textures instead of saving the actual models themselves – innovation.
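The general trick is simple to sketch (this toy is of course nothing like theprodukkt’s actual code): regenerate data at load time from a tiny deterministic routine instead of shipping it. Something in the spirit of:

    #include <stdlib.h>

    /* Fill a 256x256 greyscale texture at load time: a dozen lines of C
       on disk instead of 64 KB of stored pixels. The fixed seed makes
       the result identical on every run. */
    void generate_texture(unsigned char tex[256][256])
    {
        srand(1234);                       /* deterministic "noise" */
        for (int y = 0; y < 256; y++)
            for (int x = 0; x < 256; x++)
                tex[y][x] = (unsigned char)((x ^ y) + rand() % 32);
    }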
In my game, which uses OGRE, Blender, and other amazing software, I can only hope to keep the size low. We are trying our best to keep things small. 96k is a miracle, but 50 MB isn’t outlandish. Why can’t we work together as a community, stop looking at the other side’s mistakes to justify our own laziness, and just do what should be done to show how wrong they are? (My game URL is http://finaldawn.innovatived.ca; if you have any suggestions on how I could improve it, I’d be thrilled to hear them. E-mail: [email protected])
Debian and fluxbox will run fine in 64 meg of ram. Forget about Firefox, but dillo will get you around most places.
The author has two main points –
1) size — we used to do more with less. The cost of RAM and drive makes this a non-issue, fun only for nostalgia. (And he’s conveniently forgetting the other costs of those old systems: learning assembly, having non-portable code, runnable only by geeks, etc.)
2) innovation — this is the more interesting point. It even gets raised on GNOME’s developer lists and other places. Don’t just look to MS for something to clone, do something that really rocks. Ok, that’s a fine point.
Cars haven’t really changed all that much over the past 40 years either. But when I upgraded from a ’69 VW van to ’00 Toyota Corolla, I was blown away by things like how well the brakes work, how easy it is to reach 60mph, how nice it is that the handles all work, how I have airbags instead of my feet for collision defense. Yeah, wing windows rock and I miss them. But I’m not going back. I bet you aren’t either.
Ah, 1988 … I was a sophomore in high school.
We just got our 286 and had an amber monochrome monitor … and I hated that stupid computer with a white hot passion because my dad, while a genius, can’t teach worth a damn and nothing made sense.
Then in 1992 we got a 486 with 8 megs of ram and Windows 3.1 and at last computing wasn’t 510% sucktacular!
And, on one level, while this article is making me nostalgic to the point that I’m going to go home tonight and bust out my LCIII and my 486/100 laptop and revel in their uberfast boot times, I’m also going to come face to face with all the things that System 7.1 and Windows 3.1 don’t do, and weren’t designed to do.
Yes, there’s certainly code bloat, but we also expect things from our computers that we didn’t really dream of in 1988 or 1991 … MP3s, the internet (as we know it today), and DVD playback to name a few.
Did your Amiga have:
Anti aliased text and graphics across all widgets
Bidirectional language support
Internationalization
smooth scrolling
themes
Crisp SVG and PNG icons and widgets
Multilingual text rendering
21″ monitors with 1600 X 1200 and above resolutions
Sexy animations
DVD and CD players
search technologies
automatic hardware detection
support a million and one devices
video conferencing?
Get over it! While I agree that optimizing software should be a key element of the software development cycle, I would never give up the luxuries I have in GNOME today for a 1980-something Amiga, or a 1953 punch-card computer. Yeah, a punch-card system needed less than 10 kilobytes of storage, but so what?
Here are some questions I have for you to ponder, answer or gloss over:
1. How small could we make a functional desktop with Microsoft Windows-like features, in memory footprint and file size? It doesn’t have to have every bell and whistle. It just needs to be intuitive and usable.
Answer:
AmigaOS 4… filling about 60 MB after a complete installation.
2. What would be the real losses, functionality-wise, if someone built an office suite that only needed 10 MB (this is completely arbitrary) of disk space but interoperated, file-wise, with Microsoft’s Office?
Answer:
The real loss would be all the eye-candy and the more special functions, like importing stuff, and of course the loss of all those fonts which are installed by default.
3. Could we really accomplish much of what we do today on our UNIX desktops on a machine that only had a 486 and 32 MB of RAM?
Answer:
No! Because stuff like instant messaging, video editing and music playback would be impossible… It is only the speed of the overall OS, and the size of the programs (in megabytes), that we can and need to change…
4. Are these desktops and applications huge because we have the room, or because it couldn’t be done with smaller, better code?
Answer:
It is on the brink of success… Look at AmigaOS 4.
The real answer would be that programmers today are lazy.
They look at the space and the power that modern computers have, and they just fill the code with eye candy plus stupid, un-optimized code… Shame on them 🙁
5. Would we see any real-world advantage by making the desktop, applications and toolkits more compact and efficient? Understand that ‘real world’ encompasses development time too. So if it took 3 months to code something tight that someone could write in huge code in 2 weeks, clearly we’d be looking at a disadvantage.
Answer:
By building tight code, you could offer a better system, and if people had the option of trying the system or application, they would be amazed by its speed and/or response time… Clearly it would pay off in the long run/term (whatever…) 😉
Please understand this: These are my opinions and reflections… And sorry for any mis-spelling….
I would recommend Davon to try Syllable.
http://syllable.sourceforge.net/
It’s small, clean and fast.
Anyway, I never really understand the problem people have with ‘code bloat’.
For example, Gnome fans not wanting KDE/QT to be installed on their machine. What do you think it’s going to do, start up behind your back and eat memory and disk space while you are not looking? Do apps and libs installed, but not being used, somehow slow the machine down?
The answer is: of course they don’t.
I always go for the full install on a Linux distro. A few gig of disk space is nothing, my largesse is infinite.
But I can understand why people (well Linux users at least) dislike ‘bloat’, and here’s why:
An experienced Linux user knows their machine.
They have a mental map of the filesystem, the daemons that init will start, the libs and their dependencies, the modules that the kernel will load and the location of any local application repositories. As the distro ages, however, and programs are installed and removed, it picks up cruft and complexity, and that mental map begins to disintegrate. The computer’s filesystem becomes too complex to grasp.
Now, some people really don’t care about this. Others identify their mental map of the computer with their own mental state, and its slow disintegration upsets them deeply. The only cure is a purging of bloat, a reinstall, and the warm feeling of being ‘in control’ of the computer will return.
Oops. Too much coffee. I’m not trolling though, computers are a human creation and reflect our needs and ideals, and identification is an important part of that.
I am all for a system that works within a small footprint, be it application or OS. It will come from OSS for embedded/mobile systems. Even Windows CE is small when you consider its roots. What amazes me is that this success doesn’t find its way back to the desktop or server environment. If an embedded system’s OS can access the Internet and do all the things we do (email, browsing, chat), why can’t I just load it on a 500 MHz machine with 128 MB?
BeOS did great things with media on much slower, less memory-capable machines. But it’s unlikely that PalmOne will ever do anything at the desktop level.
Maybe the solution can come from the compiler. A smart compiler with better IDE tools could encourage better reuse of code, optimizing even the most bloated of repetitive processes.
One: as recently as 1998, operating systems took very little disk space. The original NT4 took very little disk space, and also only used around 16-32 MB of RAM. Lean indeed.
ReactOS ( http://www.reactos.com ) is an attempt to write a Windows-compatible operating system. It currently consumes less than 50 MB of space when installed. The goal is to keep the system as lean as possible.
In order for the fat to be trimmed from Linux, we need to replace X + ALL desktop environments… possibly with some type of kernel windowing architecture. While you may say that’s bad, I don’t necessarily agree. This is the 21st century, folks. Find me someone (outside of a hardcore geek) that doesn’t have a graphical desktop. Yeah, console Linux has its uses… that’s why you keep things modular. We at 667Studios might be starting just such a project in the near future.
Disclaimer: I am a ROS developer (though I don’t do much these days due to time constraints and code burnout)
Did the Amiga have…
>smooth scrolling
Are you kidding ?
>automatic hardware detection
Are you kidding ? (BIS)
>Internationalization
Are you kidding ?
Yes, the Amiga had these 3 things (and a million and TWO others) long before any other desktop computer/OS…
That said, I agree with you that I wouldn’t give up today’s luxuries for it. But I don’t think that’s the point. The point of this article (as I see it) is to wonder whether it’s possible or not to have a modern operating system (with your so-beloved luxuries) but without the need for as many resources as Linux/OSX/Win demand (yes, no matter which one eats the most resources, all require HUGE resources).
Btw, FYI: the automatic hardware detection was called “AutoConfig” and automatically detected hardware such as SCSI controllers… and I mean “plug and play”, not “plug… bzzzz… hardware detected… please insert driver CD”.
Amiga had hardware for smooth scrollings.
The Amiga could also display several resolutions on the same (physical) screen with the copper, something today’s polygon gfx boards can’t do and fast processors can hardly emulate…
Leo.
I regretfully admit that I did not know about KDE’s DCOP, but that does give me some hope. That people are at least thinking about and implementing desktop scripting.
It’s great to remember the good ol’ days. But don’t forget that people have to get work done. I amused myself for countless hours with my C64 and 386SX growing up, BBSing, trying to program, doing simple graphics and raytracing. But that was it, I was amusing myself and learning things, I wasn’t doing the kind of work people get paid for.
Lots of people in pretty much any job back then didn’t use PCs: car designers, graphic artists and printers, police officers, salesmen … all they were for was accounting and word processing.
There wasn’t any Photoshop, Illustrator 88 didn’t really do color exactly, there wasn’t 1/1000th of the breadth of specialized apps we have today … all you could really *use* a personal computer for was word processing, Lotus 1-2-3 and corporate data entry apps. Movie special effects, newspaper reporters, the phone company — they all either didn’t use computers or they used some weird proprietary system involving dumb terminals and a mainframe. Something else we all take for granted these days is, uh, networking. Sure I had an e-mail address and access to gopher in like 1992 via dial-up to a local provider’s ASCII system. But Windows didn’t even support TCP/IP natively until Windows 95, as anyone who ever set up a 10-base-T card using DOS drivers will remember. Appletalk came close to today’s P2P LAN environments, but it worked by connecting these funny cables, and it was SO SLOW, even when you were only moving 10K Word 5.1 files around. Slower-than-sneakernet slow.
And the 2 days it would take my 386 to render a simple 640×480 anti-aliased raytrace of a simple scene? I might look back fondly on it now, but back then it just sucked, and I couldn’t play any games or do anything else while it churned away. When I got a 486 and the same thing suddenly only took 30 minutes I almost died I was so happy.
No doubt about it, the C64, Atari ST and Amiga were MHz for MHz and MB for MB far better machines with far better software than anything made today. But try writing a stable modern OS like Windows 2000 in x86 assembly code and see how far you get!
Back when I was in primary school (I’m from Quebec), I had an old Tandy 1000SX lurking in my room. I just loved playing King’s Quest II on it. No HD, 640 KB of RAM, a CGA monitor with 16 colors, DOS 3.20 (freezing from time to time), a keyboard and a joystick. I was happy.
I remember back in the early 90s I read a review of NeXTSTEP for Intel. The main criticism was that it was too “bloated”, mainly because it needed 16 MB of RAM to run well. It’s ironic that today it’d be considered extremely lightweight, especially compared with its direct descendant, Mac OS X.
Do much larger modern operating systems really offer that much more? It’s not like NeXTSTEP was an ultra simple OS without a GUI or networking.
I don’t have much to say as far as the code goes, but I do know that as a user I was far more satisfied with my computing experience on an old 486DX4 with Windows 3.1 and DOS 6.22.
I will have to nitpick – Windows was not an OS at that point – it was a SHELL. Just a purdy collection of icons to aid computer newbies who didn’t want to learn the nuts and bolts of a CLI.
In fact I will go as far as to say that this was the case (aside from the NT variant of Windows) until Windows 2000/XP. All other previous iterations of Windows STILL had DOS as the core component.
Meh. Ah well. Be that as it may, I will never see why Microsoft decided to change anything in the GUI with W95 – a Win3.1 box did everything I needed at the time and did it very well.
“They have a mental map of the filesystem, the daemons that init will start, the libs and their dependencies, the modules that the kernel will load and the location of any local application repositories. As the distro ages, however, and programs are installed and removed, it picks up cruft and complexity, and that mental map begins to disintegrate. The computer’s filesystem becomes too complex to grasp.
Now, some people really don’t care about this. Others identify their mental map of the computer with their own mental state, and its slow disintegration upsets them deeply. The only cure is a purging of bloat, a reinstall, and the warm feeling of being ‘in control’ of the computer will return.”
Good points, it’s just that some of them think that Linux is only for uber-geeks. They don’t realize some people have better things to do than spending their time studying computer manuals, don’t have a degree in CS, etc.
The industry followed the road of integrating more and more features into our machines. If it weren’t for that, we wouldn’t have anti-aliasing, themes, DVD playback, or many of the other features which have already been mentioned. There has been a price to achieving this goal so quickly, and that manifests itself as bloat.
But here’s the thing: while the industry has been chasing the holy grail of convergence, they ignored the people who don’t care about it. If I had the choice between fuzzy anti-aliased text and a crisp 300 dpi display, I would choose the 300 dpi display any day. If I had the choice between featureful bloatware and replacing hard drives with solid-state memory, I would take solid-state memory any day. If I had the choice between a system which responds instantly and one with more features, I would take the system which responds instantly. And the list can go on.
There isn’t one true direction to progress, and some of us would have rather seen the industry go in those other directions. And it would have been easier to build small, quiet, and energy efficient machines if we didn’t have to deal with software bloat.
” (And he’s conveniently forgetting the other costs of those old systems: learning assembly, having non-portable code, runnable only by geeks, etc.)”
Is it really that much better: learning insanely complex APIs, optimizing/adapting for updated versions of components and layers, runnable only by geeks.
Yes, Windows machines are only runnable by geeks. Before you disagree just consider how much malware is on the average machine. Same with unix-variants except perhaps OS X. At least my mom gets by with that.
That said I have AmigaOS 4 running on AmigaOne here. It’s far faster than any other machine I’ve tried and it gets the job done. No, it’s not ready for the consumer market, but I’m sure as hell having more fun with it than any other OS.
Bingo!
I completely agree with you that weirdly-capitalized NeXTSTEP is, by today’s standards, what should be emulated in terms of tightness, flexibility, lightweightness (is that a word in English? I’m from Quebec too, like that other guy above…), FUNCTIONALITY, and solid architecture. The object model is solid enough that it’s actually not dependent on a single graphic display engine, unlike GNOME/KDE/etc., which are still dependent on X11.
It’s true that the current needs of computer users couldn’t be served that well by a 640K PC, provided that you need at least to be able to interact with users who have more recent OSes, but one thing that has been lost in the hardware performance expansion is tightness of code. Embedded people still live in the age of 640K, 512K, 32K… and they are among the smartest coders in existence for being able to squeeze every bit out of their systems. In the so-called olden days, there _was_ a macho pride in doing much with less (remember the demo scene), but the ability to make great, tight OSes/applications has vanished from the mainstream.
I use Mac OS X at home and GNUstep on my Slackware machine at work, and I dream of having a world united in the OPENSTEP specification. The original architecture has been extended, but not patched to death and forever hacked like Windows. Which proves one thing: the OPENSTEP developers got it right, and they are still right. You can have a TCP/IP stack, a solid OO GUI, consistent UI, clear interfaces, a flexible object model (see what Apple is doing with it right now!), cross-platform applications, simple development, and easy dynamic library management, but NOT at the expense of tomorrow’s computing power.
Yes, the original NeXT OS demanded cutting-edge hardware at the time, but nowadays it does not. Windows, on the other hand, is still surfing on the upgrade wave, and the Linux GUI platforms are making that error as well.
Honestly, if we could abandon the X11 implementation we have now once and for all, and relegate it to a compatibility layer outside of the core needs of a modern desktop OS, by following the *STEP model we could capitalize on a great architecture that is still relevant 10 years later, and that WILL be relevant 10 years from now. For whatever goodness there is in the X11 protocol, there is comparable badness in the X11 implementation. And please, coding a GUI in C is not useful anymore: it sadly leads to more bloat than tightness.
I’m saddened that even after reading stuff like the “X-Windows Disaster” chapter in the Unix-Haters Handbook, which was written in the early ’90s, still no one has taken care of building a better display model, because that is perhaps the single most important cause of bloat and failure in the free *NIX desktops in 2005.
“Did your Amiga have:
Anti aliased text and graphics across all widgets ”
It does now.
“Bidirectional language support”
Afraid not.
“Internationalization”
Certainly. Modular too. Switch on the fly.
“smooth scrolling ”
Absolutely
“themes ”
To some extent. In AmigaOS 4 it’s at the same level as e.g. XP.
“Crisp SVG and PNG icons and widgets ”
Since those weren’t around at the time: duh, no. PNG icons are there now.
“Multilingual text rendering ”
Not sure what you mean specifically.
“21” monitors with 1600 X 1200 and above resolutions ”
If you wanted it.
“Sexy animations ”
Yes. And lots of them. Thankfully none were in the graphical user interface where you do your work.
“DVD and CD players ”
CD players – sure. DVD players had to be invented first. Now they are supported.
“search technologies ”
Yes.
“automatic hardware detection ”
True plug’n’play.
“support a million and one devices ”
Thankfully not. But enough.
“video conferencing? ”
Certainly.
BeOS. An OS with all the features you could ask for in 1999 that was more responsive than present-day OSes on hardware 1/3 as powerful. GoBeProductive is a nice office suite that’s tiny compared to Office and OO and contains some features they don’t have. I’m still waiting for the day when something new comes close to BeOS’ elegance and user-friendliness.
Some ideas I have why it may be happening:
1. More programmers with less experience. Too many Java undergrads, I presume. Add to this the lack of training in good design practices (although I doubt it was any better in the past). Anyone can write a program that does X, but how many can design well factored code and get it comfortably under 100 KiB?
2. Loads of standards to implement before your system gets to a basic level of accepted functionality and compatibility.
See: http://www.cs.bell-labs.com/who/rob/utah2000.pdf
3. GUIs are HARD. The design and use of them (extending to the application) is still very much problematic. The modern approach is to create massive monolithic class libraries – not good.
4. Open Source – many more developers (but not enough good ones), poor communication medium (Internet), varying skillset, lack of focus and big egos mean that the result of open source projects is hardly an improvement over the commercial approach. In reality, the small systems you want are only created by a small, tightly integrated team of developers.
5. Acceptability. “Wow, it’s only 10 Megabytes?!”
6. Lack of time to think things through and re-design when necessary.
7. Still too hard to share code. It’s less hassle to roll your own.
8. Systems and applications these days do so much more than they ever did. It’s arguable that most of these features are not needed or wanted.
I’ve been experimenting with getting my code as lean as possible. For example I have a very fast memory allocator (written in C) – 2.26 KiB relocatable object code with symbols. I’ll release it eventually.
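To give a flavor of how small such things can be, here is a minimal sketch of a fixed-block pool allocator in C (not the allocator described above, just the general idea; real allocators also handle variable sizes and alignment, which this ignores):

    #include <stddef.h>

    /* Carve a static buffer into fixed-size blocks and thread the free
       ones onto an intrusive singly linked list. Alloc and free are O(1). */
    #define BLOCK_SIZE  64
    #define BLOCK_COUNT 1024

    static unsigned char pool[BLOCK_SIZE * BLOCK_COUNT];
    static void *free_list;

    void pool_init(void)
    {
        for (size_t i = 0; i < BLOCK_COUNT; i++) {
            void *block = pool + i * BLOCK_SIZE;
            *(void **)block = free_list;    /* push block onto free list */
            free_list = block;
        }
    }

    void *pool_alloc(void)
    {
        void *block = free_list;
        if (block)
            free_list = *(void **)block;    /* pop the head */
        return block;                       /* NULL when exhausted */
    }

    void pool_free(void *block)
    {
        *(void **)block = free_list;        /* push it back */
        free_list = block;
    }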
My most recent foray is a system-neutral GUI interface library which attempts to take maximum advantage of the native GUI system available. Currently the Win32 version supports windows, buttons, listboxes, check boxes, radio buttons, OpenGL windows, full-screen windows, GDI drawing, various common dialogs, etc. Code size is 15 KiB.
It is far from feature-complete and uses a lot of built-in functionality, but is much easier to use than straight Win32. My goal is reasonable functionality for most applications within 50KiB.
1. How small could we make a functional desktop with Microsoft Windows-like features, in memory footprint and file size? It doesn’t have to have every bell and whistle. It just needs to be intuitive and usable.
If you copy bloat? You might just get bloat in return.
___________________________________________________________
2. What would be the real losses, functionality-wise, if someone built an office suite that only needed 10 MB (this is completely arbitrary) of disk space but interoperated, file-wise, with Microsoft’s Office?
The majority of people either want or need compatibility. I would love to use an office suite that was only 10 megs, either installed or running in memory. But the standard (sadly) is a proprietary format. I would love to be able to use OO.o or AbiWord. File formats and compatibility are paramount to the exchange of data. This could be more of a reality if standards were of an open nature. Look at TCP/IP, an open standard: using a Mac, Linux, BSD or whatever, you can still communicate.
Keep in mind that an Open Format vs. Releasing Source code are very different.
___________________________________________________________
3. Could we really accomplish much of what we do today on our UNIX desktops on a machine that only had a 486 and 32 MB of RAM?
I have no clue.
____________________________________________________________
4. Are these desktops and applications huge because we have the room, or because it couldn’t be done with smaller, better code?
More features = more code. Now, the question is one of efficiency and optimization. How about this:
1) What features does a person need?
2) Can it be streamlined?
____________________________________________________________
5. Would we see any real-world advantage by making the desktop, applications and toolkits more compact and efficient? Understand that ‘real world’ encompasses development time too. So if it took 3 months to code something tight that someone could write in huge code in 2 weeks, clearly we’d be looking at a disadvantage.
Developers, developers, developers.
Makes me remember those lovely little BBC Model Bs we had throughout our school. Fancy a game of “Granny’s Garden” anyone?
Whoa, that description of the mental map of the filesystem and its slow disintegration made so much sense. I had never completely thought of it that way, but it’s exactly what happens with me. Wow!
I started using Linux back in ’94, and indeed it has changed quite a bit over the 11 years since. I must admit I enjoy the OS I use today much more than the one back then.
My point is about office applications. Though I know little of spreadsheets and presentation software, as an academic I write a ton of papers. My word processor of choice has been LyX, from back in ’98 to today. It is a much more innovative way of handling documents, of writing vs. formatting text. I find it an indispensable tool in my arsenal. My wife uses it (on WinXP) for her papers, as does my next-door neighbor (on OSX), and other friends in graduate school (XP). They saw the output, they realized how insanely powerful the system was, and they wanted to start using it themselves.
OpenOffice is basically just a free copy. True. Very useful (I never leave home without it), but there are people innovating out there. They just don’t get a lot of press.
We used to play Podd (the red blob thing)!!!
Here’s an interview with GG’s programmer (includes screenshots!):
http://www.redkeyreddoor.com/index.php?p=14
People seem to forget exactly how slow old computers were. It’s real easy to say today’s computers are bloated when you compare them to, say, a C64 running GEOS. But when you compare the performance of today’s systems to ones from 20 years ago at the same tasks, any thought of bloat will go away.
Let’s take a graphical word processor: even the lowest end of modern computers will seem fast and lean next to the one in GEOS, as you wait for the letters to pop up as you type, or for the screen to redraw as you scroll.
You think, well, why don’t they make programs today that could fit in 32K of usable RAM? Well, they could, but it wouldn’t be any faster than MS Word and it wouldn’t have 1/100000 of the functions, so why bother spending the time to do it?
In all reality, programs today are not near as bloated as it seems. Even on the smallest hard drives around today you’d be hard pressed to fill things up with programs alone, when 10 years ago you really had to watch what you installed.
It all comes down to “why?”. Sure, I could find programs that only need a few megs of RAM to run, but why bother when I can get anything and my system will have no problem running it? It’s not like I’m hurting for hard drive space or RAM. Why should developers restrict RAM usage when 256 MB+ is the norm?
why not look at
http://www.aros.org
http://www.aros-max.co.uk
This is an Amiga-based OS; it’s very bloat-free and it’s OSS.
So, its
1. not bloated.
2. Free, and open.
3. Modern and in current development
4. Runs on standard PC hardware no Apple or PPC expensive-ware
5. Amiga-like for those diehard amiga fans.
I think this proves the point: an OS can be small, powerful, easy to use and still free. Now, why not support them by giving some money or programming time to the project?
Here I am. A K6-2 @ 300 MHz, 2 GB HD, 256 MB RAM. Ubuntu Hoary (kernel 2.6.10-386), Xorg 6.8.1 (I am using the VESA driver at 640×480, 24 bpp, since this monitor will crap out on average refresh rates), XFCE4, Firefox 1.0 with 4 tabs, apache2+php4+mysql running as a server, the sshd daemon, 3 xterms open and 2 console sessions. Also, I installed the Java 1.5.0 JDK.
HD: 872 MB used, 1.1 GB free.
RAM: 233 MB used, of those 124 are cache, and the rest is actually used. No swap used at all.
Load average: 0.63 (heavily browsing css pages with javascript).
uptime: 19 hours 38 minutes.
I can do remote X sessions to here using ssh -X (runs nice in a 10 mbps network). I can update from the net, on a 10mbps network it downloads from the ubuntu archives at like 400 Kbytes/s while the rest of the system runs happily.
Sorting processes by memory in top, the highest user is firefox, with up to 18.5% of memory.
Hell, I have a computer at home, an AMD 486@133MHz 32 MB RAM with a debian woody, compiled 2.4 kernel that can run blackbox on X, compile with gcc and play mp3s with mpg123 with no sound skips.
Do you really think the situation is that bad?
Unfortunately, it is still the old, well-known problem. In 1995, Niklaus Wirth wrote his famous “A Plea for Lean Software” article (http://www.cr.yp.to/bib/1995/wirth.pdf).
In the present day world, where marketing “rulez”, lean software is probably impossible.
But Wirth’s article is still worth reading…
Mark
Incorrect link. Should be:
http://cr.yp.to/bib/1995/wirth.pdf
Sorry.
Mark
It’s true that projects like MenuetOS, written in assembler, sport a tiny memory footprint, are blazingly fast, and whatnot, but the average programmer doesn’t usually learn or even want to learn assembler. He (or she) wants to write in a high-level language, preferably a scripting language or one utilizing a virtual machine, so that once the code is written, it can be run everywhere. Think calling a taxi versus first buying the parts for a car, then building it, and only after that actually driving where you originally wanted to go.
The point is, programmers are and always have been lazy. Now it’s possible, thanks to fast hardware, to write in very high level languages compared to the 1980s. Back then you could write programs for a DOS/Windows PC in assembler, Pascal or C, and that was about it. If you wanted to have a GUI and the OS didn’t provide it, you coded it yourself or didn’t use one. Now you’re given the option to choose whatever language you happen to like. If you’re given the task of programming a new word processor, you can do it in C, C++, Java, Perl, Python, Objective C, Visual Basic, Object Pascal, Scheme/Lisp, etc., not even talking about the different GUIs or frameworks (GTK, Qt, .NET/MONO, …). Because programmers are lazy and because there is a huge amount of code and libraries ready to be used, programmers take shortcuts and thus spawn bloat (usually).
Reusability is a two-edged sword (or three-edged, if you’re a Vorlon). On one hand it saves both time and money, enforces modular programming and lets the programmer concentrate on the task at hand, not worrying about how to implement the stuff that’s beside the point. On the other hand, if your program uses a bloated library, your program is bloated. Complex libraries often have complex APIs, thus negating the time saved by using them; break one up into a set of simple libraries which don’t depend on each other and you might both reduce bloat and simplify the API. Then again, if you have n+1 small libraries each doing only one thing, you have bloat again…
Let’s all hope we run out of silicon next year so that we have to use the same hardware we have now for the next twenty years. Maybe that teaches us how to optimize code.
Read the subject.
Correction: Fabrice’s compiler is TCC not TTC. My apologies my fingers got away from me.
http://fabrice.bellard.free.fr
Thank you for your comments, ideas and suggestions. There’s a lot to look at and consider. I had thought AROS was actually only for the new Amiga hardware. My bad; I’ll have to give it a look.
I guess I shouldn’t be surprised by how many people commented on the nostalgic aspect of my article. My focus had been to show some examples of how, given very limited resources, amazing software and applications could be built. But it’s always nice to think back to those days.
Special thanks and honors should go out to those brave mad men and women who created the emulators that still give us a chance to relive the old days, even though our hardware may have gone to digital heaven.
I was aware of Syllable, which I haven’t checked in on since it showed up on Slashdot back when it was called AtheOS. An amazing effort indeed.
MenuetOS I have looked at a time or 3. Definitely something to keep my eye on. Curiously, every time I try to run it on QEMU it chews up every iota of CPU time and renders slowly.
kkrieger is an amazing example of getting the most out of every byte of code. It took me a while to get the hardware needed to actually run this demo, but it blew me away.
Bloat: as referenced earlier. Mental map of your filesystem? I’m not really sure how that correlates with this discussion. Clearly you have a vast mind and I’m sure you feel a disturbance in the force when something erases a library.
Someone discussed how we couldn’t edit video or play back MP3s with hardware like what I mentioned as I wandered down memory lane.
The article was more about getting the most out of the hardware we have today. It seems to me much more is done by brute force than by finesse. However, I think it’s not entirely the developers’ fault, time constraints and all aside.
With the Amiga (Yes I’m back to this magic box) you had specialized media chips and busses to perform feats that would have slain the poor 68000 processor if it had been left to do all that work.
You also had most of the graphic interface and code embedded in ROMs, so the software itself didn’t carry the added size and weight that a statically linked program would have.
On a more recent tangent: I used Windows on a Pentium 200 MMX to watch DVDs without so much as a glitch. Of course, I cheated; my DVD kit contained a Hollywood+ MPEG decoder card. So here the software and OS again used specialized hardware to accomplish what brute-force coding couldn’t.
There are lots of balances and checks between hardware, software, application and OS design and development.
Embedded systems are amazing things. I haven’t really had the spare cash to try out some of the hardware, but it does amaze me how much you can get away with if you’re clever. PicoBSD, for instance, rocked my world when I tried it.
Booting a multitasking system off a single floppy that could route my data and do telnet and such was pretty freakin’ amazing to me after being away from my Amiga for so long. Still, it wasn’t graphical and certainly didn’t play MP3s. It did, however, give me hope that tight, functional code was still in the future of computing.
The responses to this article have given me lots to consider and when the discussion ends I’ll probably be spending a great deal of time going over the links and information generously given here. Thank you one and all.
Windows 3.1 is the primary OS of choice, even though it crashes at a blink, has a clunky interface and needs at least 4 MB of RAM to run well.
Boy, your memory is shot, or you didn’t have much knowledge of Windows back then. Windows 3.1 and 3.11 were well ahead of their time and ran flawlessly on top of DOS 5-6.2, even on my old POS 386SX with 1 MB of RAM. The interface was fairly easy to use for its time.
I can’t comment on operating systems, but I am personally stunned that Acrobat Reader was a 3 MB install 4 years ago and is now 21 MB.
All I want to do is view PDFs! I am not trying to take over the world!
I would also like to slap anyone who bundles a spyware-ridden download manager with their product, or who adds anything to startup without my permission (RealPlayer and iTunes included).
Talk about one of the most bloated software companies out there. Adobe products are slow and huge! It’s funny to see that OSS developers seem to do a much better job at making a product than those you pay beaucoup bucks for. You’d think the more you paid for the product, the better it would be. In most cases I’ve seen, I personally prefer the OSS software over a pay product because of the community, the size of the final product, and the documentation behind it. I may have dual 120 GB HDDs, but for some reason I crave to use as little space as possible.
You are trolling, right? Windows 3.1 ahead of its time… hilarious!
It’s nice to be nostalgic.
Nothing wrong with that.
I sometimes think about the days of supporting 150 users on a single 33Mhz 68030 with “smart” interface cards with 32MB of RAM.
But I like my GUI painters, wizards, powerful database systems, sophisticated kernels, high level languages being interpreted and compiled on the fly.
I hate slow code as much as the next guy. I loathed Half-Life 2 with its eternal loads and pauses, but boy, was it a gorgeous game.
I scream and yell at Outlook Express whenever I go in to get my emails. But then I note that I have several 100 THOUSAND emails in my assorted folders, and I cut it some slack.
When my drive gets too full, I replace it and load the old one’s contents onto the new one. It’s easier than actually going through and cleaning stuff off of it, and gets me more space more quickly.
The overpowered behemoths we run every day let us worry less and less about them. I worry about performance and volume so rarely that I can write the code I need to get the job done much more quickly, and then move on to other jobs. While the work is being done “less efficiently”, the net result is that it gets done faster than if it were done efficiently.
Only on edge cases is it worth the time to really worry about tweaking and speeding up the code.
I’m a bigot. My time is worth more than the machine’s time. Since most of my work is one-off stuff, I’ll run the most bloated and engorged tools I can get my hands on if it saves me time in getting the job done.
“I will have to nitpick – Windows was not an OS at that point – it was a SHELL. Just a purdy collection of icons to aid computer newbies who didn’t want to learn the nuts and bolts of a CLI.”
You are just plain wrong.
Windows 3.1 was a great platform for developing applications. Since its 1.0 version, Windows has had a thing called GDI (the Graphics Device Interface), which allowed programmers to use the same code to output graphics and text to screens and printers. Besides GDI, Windows 3.1 had other great APIs still used today, like MCI (multimedia), TAPI (telephony and RAS), MAPI (mail), et cetera.
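To make that concrete, here is a minimal sketch of what device independence buys you, using only standard Win32 calls; the drawing routine itself has no idea where its output goes:

    #include <windows.h>

    /* The caller picks the destination: hand this an HDC from BeginPaint()
       and it draws in a window; hand it an HDC from CreateDC() for a
       printer, and the very same code prints. */
    void draw_report(HDC hdc)
    {
        TextOut(hdc, 100, 100, "Quarterly Report", 16);
        MoveToEx(hdc, 100, 130, NULL);
        LineTo(hdc, 400, 130);
    }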
Before Windows, every PC (DOS) application shipped its own bunch of display, printer, joystick and sound drivers. That sucked: hardware was supported per application instead of by the OS.
On the other side, since UNIX always lacked a decent GDI-like API, Mozilla has an entire “PostScript generator” just to be able to print on UNIX/Linux. This code is so completely fucking different from the code used to display HTML pages on screen that every printed page on UNIX looks different from the screen version.
Windows 3.1 was not just a SHELL.
I use Linux exclusively on my computers, BTW.
Back in the good ol’ DOS/Win3.1 days I used a program called Tempest. It was a very quick GUI that kicked 3.1’s butt. I also used an “OS” called GEOS, from Geoworks. It’s still around as a product called Breadbox. Quick and small, it has everything a home user might need… I miss my NEC 386SX now!
I feel it is always best to apologize before you do anything – no one can say much of anything bad about that. What I am actually apologizing for is that I did not read all 55 comments before posting my response, and I am not nearly the techie many of you are – but maybe someone can help resolve something for me.
I do like a lean & mean OS, which implies I am becoming more & more dissatisfied with Microf–k Winbl–s with each new version they make. Regarding PCs, I started out using MS-DOS – before that I did college projects at terminals connected to mainframes.
Yet I don’t know if the real problem is bloating per se, or the things Windows does which a user has no control over. It always seems to be writing to a file, logging your activities; PSTORES.EXE opens every time I enter anything in a dialogue box (including this). After it is open, I find I have to wait to connect to web sites (with fast ethernet DSL), wait for files to open, etc. Same thing with SPOOL.EXE, which often pops up for no apparent reason (it may have to do with the printer software – I discovered my Lexmark X74/75 includes spyware of sorts). What Windows does is not much different from what spyware does – it is like the cops of PCs, allowed to spy on you, while everything else is spyware and a no-no.
So my point: yes, I feel some programs are probably written very inefficiently and probably run a bit slower because of that (I only know FORTRAN 77 and write simple but very efficient scientific programs), and yes, there are MBs upon MBs of system files, drivers, etc. that sit on the hard disk and are never used. Yet they don’t get in the way, and these do not seem to be the major problems. The major problem is that it always seems to be doing some kind of crap with USER.DAT, INDEX.DAT, etc. files that is unnecessary. There always seems to be some unnecessary program running and slowing things down. Are the newer Linux-based OSes, FreeBSD, and the other alternatives really any different in that respect? (I refer to those with a reasonably user-friendly GUI, similar to some extent to Winblows, so they are not such a pain in the ass to use.)
The system I am typing on (laugh): Compaq Presario 5245 with AMD K6-2 at 400 MHz, 320 MB RAM, Windows 98. I don’t want to upgrade to any higher Windows – I am considering going back to DOS
There’s your entertainment for the night
Hi;
After reading this article, I could not help but conclude that the author is really frustrated with his computer system. As such, he used the article to channel his frustration in a very honest way.
As for me, I do agree that Linux has become bloated (I will not say Windows, because it is not nice to say bad things about a retarded product). But the best part is, you can make it run with no bloat.
Here’s my story: in 2000 I got hold of SuSE Linux 8.0 and installed it on my Pentium 100 with 32 MB of memory. I ran X with FVWM and it worked very fast compared with a Windows system. I was very happy with it.
Fast forward to 2004. I have a Pentium 4. I installed Mandrakelinux on it. The system is not as fast as the one from 2000, but I can still do my work on it, and I have no complaints except that loading the system takes a long time, and the system slows down when trying to run something.
Maybe I’m just lazy. I know that I could make it faster, but I don’t want to try, because that box is the one I use to experiment with the many things Mandrake likes to offer. I just let it run slow, as long as I can try something new. The Pentium 100 is for my project assignments and other important stuff; I don’t have time for anything else. If there is ever something very important I’d like to do on my Pentium 4 box, I’ll strip out all the unnecessary stuff.
Or maybe not. 🙂
I remember doing amazing things with DESQview under DOS. Yes, it wasn’t a GUI, but it ran all the programs I needed to run, plus a two-line Fidonet BBS.
I remember seeing a demo of DESQview/X at one of the big computer shows back around ’92 or ’93. It rocked (mostly because it was a DOS port of X Windows), but Quarterdeck dropped the ball and couldn’t get it out for 4 or 5 years. By then Windows 95 had taken hold. Sometimes I wonder what would have happened if they could have gotten it out right away.
If you hate bloat, then use this pdf reader –
http://www.foxitsoftware.com/pdf/rd_intro.php