Computers are getting faster all the time, or so they tell us. But, in fact, the user experience of performance hasn’t improved much over the past 15 years. This article takes a look at where all the precious processor time and memory are going.
Nothing new, really, but well worth the short read. I especially like the action item and his rant on program complexity.
Short: Nice!
That as soon as the “Usability Experts” started showing up on the scene, the performance of computers in terms of software and hardware went downhill?
Take a look at what you could do on a first- or second-gen Mac, Amiga 1000 or Atari ST and compare it to today’s machines.
Today’s software and hardware isn’t very impressive when all things are considered, is it?
This week’s action item: Launch a few applications simultaneously and time their start-ups. Try it again in five years to see whether the time has improved.
Cool idea. Will do that.
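Something like this should give rough numbers on a Linux/Unix box (just a sketch; the application names are examples, I’m running them one at a time rather than simultaneously to keep the timing simple, and for GUI apps you have to close each window as soon as it has finished drawing so the elapsed time approximates the start-up):

#!/bin/sh
# Launch each application and report wall-clock time until it exits.
# Close each window as soon as it appears to approximate start-up time.
for app in mozilla oowriter gimp; do        # example application names
    /usr/bin/time -f "$app took %e seconds" "$app"
done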
My 40MHz Acorn computer is more responsive for many tasks than my Athlon 1800+ in Windows and Linux. However, the Acorn really crawls when rendering HTML and displaying JPEGs; so we are seeing raw performance increases that today we may take for granted. You can’t beat the raw throughput of the latest hardware.
Unfortunately, a lot of the systems in widespread use today are second systems.
Heh, Windows NT anyone? Longhorn is looking like a second second system 😉
…but users get used to faster computers.
Just switch on your old computer, put your old Windows 3.1 and MS Word, AmiPro or WordStar on it, and tell me if it is really that fast. I bet you have the impression it got slower over the years. It’s really slow, you just don’t remember how slow it was.
Computers get faster faster than software gets slower…
Please use it when publishing things, even in forums. Is it too much to ask?
He is in many ways right, but I think some of the gains have been worth the performance losses. Personally though, I find my athlon 1800 every bit as responsive as the day I bought it, and it’s much better than the slower machines I work on.
…but users get used to faster computers.
Have you seen the NeXT video with Steve Jobs in it that was posted recently-ish? He drags the window around saying how fast it is, which made me laugh out loud.
My 233MMX running BeOS keeps up very nicely with my 1.4GHz Athlon Linux box. Well, that is until I launch Firefox; luckily there is NetPositive.
About 10 years ago I remember people complaining that Microsoft Word was too slow on the Mac. You could type faster than the processor handled input on such a large application. Imagine my disappointment when I recently discovered that the same thing still holds true.
Well, I don’t know what type of machine this bloke is using, but I know for a fact that a 366Mhz Celeron w/256MB of RAM running Windows XP and Office 2003 is quite capable of keeping up with a “fast typist”.
Methinks his computers are very broken.
Hell, I sit in front of a ca. 1999 dual Pentium 3 system running Windows 2003 with a fairly heavy load every day and very rarely do I consider my system to be “slow”.
It depends on what your references are: if you ask here, you’ll hear that BeOS booted very fast and was much more responsive than other OSes are on much faster hardware.
So being slow or fast depends on what you’re used to!
As for Office, maybe it depends on the setup of Office (the article discusses automatic correction and the like)?
Or more likely on the strange way Office treats input: even typing slowly, sometimes the display is not refreshed for a little while and then it is updated. No characters are lost, but it is pretty annoying…
When I compare my experiences with my Commodore 64 (~1 MHz) and Amiga 500 (~7 MHz) computers, there is one really noticeable difference from the modern 2 GHz monster computers. But from a user perspective it’s not performance, it’s noise!
Using the old computers you hardly heard any noise other than the floppy disk spinning. But with modern hardware the user experience is completely different. It’s like using a hair dryer or listening to an airplane take off near the local airport. Nowadays your wife/girlfriend no longer complains only about the time spent with the Amiga but also about the noise it makes.
BTW, the Commodore 64 booted in about two seconds. Today even my mobile phone takes longer to boot…
I was playing around with my Amiga 3000 (25MHz) last year comparing it to my BeOS machine (Dual 366Mhz).
First thing I noticed was that it was not as snappy as I remembered it being, but not bad considering the difference in CPU power.
Then I tried multitasking. Big difference: unless I was running a very simple program, more than three copies running caused a noticeable slowdown. Of course, in the old days just being able to switch to another program to solve a problem and still have the old programs crunching away was great. But my BeOS machine does not seem to slow down at all unless I am running 12-20 programs at the same time.
Unless I run Firefox; Firefox sucks CPU power. If I run light programs with it, no problem, but don’t expect to do heavy crunching if Firefox is also running.
But the other thing I noticed is the size of my data files. On my Amiga few files are greater than 100K, but on my BeOS machine 2-10 meg files are common. And that seems to make the big difference.
Most of my old Amiga pictures are 320*240 or 640*480, and in IFF format only 5-6 bits deep. On my BeOS machine 1024*768 and 1280*1024 are common and 24-bit depth is standard. If I transfer these pictures in GIF or JPEG format to my Amiga they take ten or more seconds to display – the worst (largest) easily taking a minute to come up. In comparison, almost all of them come up in a flash on my BeOS machine.
The same for many other files. The few PDFs on my Amiga are all less than 50K in size, on my BeOS machine the largest is 33 Meg.
Sound files on my Amiga are again in WAV or IFF format, which does not need much power to decode. On my BeOS machine almost all are MP3s, which would tax the old machine to decode in real time (I think there was an MP3 player for the Amiga, or was it a converter).
So a major difference to me is the size of the files being processed today.
Methinks his computers are very broken.
You think everyone who complains their “Windows PC is slow” has a broken computer. Microsoft can do no wrong, only users can.
Hell, I sit in front of a ca. 1999 dual Pentium 3 system running Windows 2003 with a fairly heavy load every day and very rarely do I consider my system to be “slow”.
I don’t find MacOS X on my computer slow either, but you’ve argued otherwise.
According to you, your perception is reality and everyone must conform to it; everyone else’s is broken. You need to learn that UI performance, and tolerance thereof, are relative.
Still an interesting theme for after-dinner talk though. A great deal of power is being wasted when a Linux distro compiles everything for ancient hardware, aka i386. Why not raise the bar to, let’s say, i686? Well, at least for everything media- and graphics-related. It’s amazing how little info you can find about optimisation in the sense of setting gcc FLAGS and build options for most distributions. For example, for my Debian box I could find one site that describes a “make world”-like procedure for Debian with apt-build. An “apt-build install <app>” downloads the sources for the app (dependencies included) for your specified platform and uses them to build the app optimised for your arch (see the sketch at the end of this comment).
For MS this isn’t relevant at all. Just like security, media and everything related is big business on the MS platforms. The devs do everything to “innovate”/“improve” looks and features. You need increasingly more performance just to keep your current performance experience on the same user scale. A new threat results in new apps with a potential solution. A new successor OS inherently comes, most of the time, together with increased hardware requirements. Personally I see it as an everlasting rat race. Like cars, etc., just to make an analogy.
I don’t think the apps haven’t changed, and I do think the performance experience is tied to the hardware cycle combined with the software cycle. For desktop/home/multimedia/whatever SOHO PCs the apps are more feature-rich and shiny than ever (nothing wrong with that, though :-).
Unfortunately, a lot of the systems in widespread use today are second systems. Worse yet, the bloat introduced by a second-system design is often preserved in future revisions to preserve compatibility.
Thank god I’m able to compile everything relevant from source with the appropriate FLAGS and build options.
‘No load no bloat’
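For illustration, this is roughly what I mean (a sketch from memory; the package name is just an example, and the exact flags depend on your CPU and gcc version):

# Debian: apt-build rebuilds packages from source with your own gcc flags
# (the optimisation level and target arch are asked for when you install it
# and stored in /etc/apt/apt-build.conf).
apt-get install apt-build
apt-build install mplayer        # example package: fetches the source plus
                                 # dependencies and builds it for your arch
apt-build world                  # the "make world"-like rebuild-everything run

# For a plain source tarball the same idea is simply:
CFLAGS="-O2 -march=i686" CXXFLAGS="-O2 -march=i686" ./configure
make && make install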
You think everyone who complains their “Windows PC is slow” has a broken computer.
No, I think people who say things like a 2.8GHz P4 is slower than a 500MHz G3 have broken computers.
Or people who say machines with Ghz+ processors and a gig of RAM can’t keep up with simple text input have broken computers.
I don’t find MacOS X on my computer slow either, but you’ve argued otherwise.
I can find a hell of a lot of people who agree with me as well.
Do you think you can find many people who think text input on the sort of machine he’s talking about is slow? There’s a rather large thread about this article on /. as well and I don’t recall actually having read a single message yet that agrees with that possibility.
According to you, your perception is reality and everyone must conform to it; everyone else’s is broken. You need to learn that UI performance, and tolerance thereof, are relative.
If someone’s Ghz+, gig of RAM machine can’t keep up with their typing, it’s either *severely* broken, or their name is Wally West. The mind boggles that you would consider such behaviour acceptable.
… but not totally, in my view.
Can’t ever compare to boot-up times with DOS, but I do remember working on my 386, P133, P400, P800 and PM 1.2. Although I never had any performance issues related to typing, I can recall differences in application start-up times, spooling, and rendering, using the corresponding OS and apps.
I have noticed improvements, but yeah, not at the leaps and bounds that I would expect with the hardware advancements.
Support code, shared libraries – they certainly use lots of memory. I’m an admirer of KDE, which is my main desktop. Clearly, KDE could not have reached the mature state that it now enjoys without having used the nested layers of support code that the author mentions, including the excellent Qt library; but it still comes as a surprise to see how big Qt and KDE are. It does not help the system’s responsiveness if one also runs applications that use other large libraries (e.g. Firefox, OpenOffice).
Here are the figures for a KDE app in Red Hat 9. These do not include KParts components that are loaded on request.
$ ls -HShs $(ldd $(which kdeinit)|cut -d ' ' -f 3)|cat
6.4M /usr/lib/qt-3.1/lib/libqt-mt.so.3
2.6M /usr/lib/libkio.so.4
2.3M /usr/lib/libkdeui.so.4
1.5M /usr/lib/libkdecore.so.4
1.5M /lib/tls/libc.so.6
892K /usr/X11R6/lib/libX11.so.6
700K /usr/lib/libstdc++.so.5
452K /usr/lib/libGL.so.1
328K /usr/X11R6/lib/libXt.so.6
324K /usr/lib/libfreetype.so.6
272K /usr/lib/libmng.so.1
244K /usr/lib/libkparts.so.2
212K /lib/tls/libm.so.6
192K /usr/lib/libDCOP.so.4
164K /usr/lib/libkdefx.so.4
148K /usr/lib/libfontconfig.so.1
144K /usr/lib/libpng12.so.0
132K /usr/lib/libexpat.so.0
124K /usr/lib/libjpeg.so.62
108K /lib/ld-linux.so.2
104K /usr/lib/libkdesu.so.4
92K /usr/X11R6/lib/libXmu.so.6
92K /usr/lib/libart_lgpl_2.so.2
88K /usr/X11R6/lib/libICE.so.6
84K /lib/tls/libpthread.so.0
80K /lib/libresolv.so.2
76K /usr/X11R6/lib/libXft.so.2
60K /usr/X11R6/lib/libXext.so.6
56K /usr/lib/libz.so.1
32K /usr/X11R6/lib/libSM.so.6
32K /lib/libgcc_s.so.1
28K /usr/X11R6/lib/libXrender.so.1
16K /lib/libdl.so.2
16K /lib/libutil.so.1
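If you just want the total, something like this works (a rough sketch; note these are on-disk sizes, not what actually ends up resident in RAM, since pages are shared between processes and only loaded on demand):

$ du -chL $(ldd $(which kdeinit) | awk '/=> \// {print $3}') | tail -1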
Take a look at what you could do on a first- or second-gen Mac, Amiga 1000 or Atari ST and compare it to today’s machines.
I saw a NeXT cube demo in 1988. It seemed to run fast with a 25MHz processor. Windows and graphics were especially fast. Very usable, with practically every feature you would expect on a modern GUI. Of course, the OS was based on BSD.
What I don’t get is: nearly 20 years later, why is the gui environment so sluggish on Linux?
I can find a hell of a lot of people who agree with me as well.
I can find more that agree with me.
Do you think you can find many people who think text input on the sort of machine he’s talking about is slow? There’s a rather large thread about this article on /. as well and I don’t recall actually having read a single message yet that agrees with that possibility.
Do you know how to read??? Do you possess basic comprehension skills?
He is making a relative comparison, not a literal one. I read the same /. thread and I don’t recall anyone having your view that his machine was broken either. Actually the general trend on /. seems to agree with the author.
If someone’s Ghz+, gig of RAM machine can’t keep up with their typing, it’s either *severely* broken, or their name is Wally West. The mind boggles that you would consider such behaviour acceptable.
Let’s say said GHz+ machine is constantly thrashing; it will hiccup while typing. Namely attributed to the bloat that the author is talking about. It is obvious that in such a case more RAM would fix it. But the author’s point is why system requirements have gotten so high over the years for simple things like email and word processing. Even if said machine got a boost in RAM, it would be inadequate for the general set of apps in a year or two.
Take OS X and the soon-to-come Longhorn. For all the graphics they use just for the user experience, they need a graphics card that yesterday only high-end games required. Longhorn was said to need a minimum of 64MB of graphics memory, and at least 128MB to work well.
All the author is saying is that apps are getting more complex as the days go on, and current hardware trends and software complexity at any given time make for a similar user experience in terms of subjective performance metrics.
For example, when I got my Athlon XP 1700+ two or three years ago, apps on Windows 2000 would pop open quickly; now Mozilla and similar apps take a few seconds, and the difference is very noticeable.
if someone’s Ghz+, gig of RAM machine can’t keep up with their typing,
Where in the article did the author quote the specs of his machine were GHz+ and a Gig of Ram? Did you just pull those numbers out of your ass, like you usually do?
Most users, including myself, have 512MB or below in their PCs. My workstation at work has dual CPUs and 4GB of RAM, and depending on my load 4GB isn’t adequate sometimes.
Apps like Mozilla grow to 550+ MB RSS given a few weeks with 30000+ IMAP email headers loaded. Of course, this is on Solaris.
You’ve only got yourself to blame for buying cheap hardware. My Athlon 2500+ system is overclocked to around 2.2GHz and runs noticeably quieter than my PlayStation 2; it’s only audible from about a foot away. Use good cases, good heatsinks and good fans and you can make modern PCs very quiet indeed.
When an average user tells me that their computer is slow, I take a look at their PC and it’s crawling with adware/spyware. I’ve seen 3GHz systems end up slower than a 500MHz machine just from spyware.
I tend to think of this as the ‘hidden box’ problem.
A function/library has working code, it has had working code for a long time, the code has no bugs. People use the library like a black box, relying on the library to provide key functions for their applications.
Problem is that nobody ever goes back to look at the library code, and because nobody ever looks at the library code it never gets optimised; it’s just sitting in development limbo. So developers can optimise the hell out of their current code, but since most of the time is spent in these hidden boxes they don’t actually make many gains.
Good examples would be things like Glibc, the Gnome libraries etc. It’s not that the developers are lazy, they’re just too busy bug-fixing and working on new features to go back and optimise old code that is working even if it is inefficient.
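If you’re curious how much of an application’s time actually goes into those boxes, a rough way to see it (a sketch; the program name is just an example, and ltrace slows the program down considerably while it measures):

$ ltrace -c some_app     # per-library-call summary: time, call counts, function names
$ strace -c some_app     # the same idea, but for system calls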
I can find more that agree with me.
That high-end machines of five years ago can’t keep up with a typist?
Do you know how to read??? Do you possess basic comprehension skills?
Yes.
He is making a relative comparison, not a literal one. I read the same /. thread and I don’t recall anyone having your view that his machine was broken either. Actually the general trend on /. seems to agree with the author.
“About 10 years ago I remember people complaining that Microsoft Word was too slow on the Mac. You could type faster than the processor handled input on such a large application. Imagine my disappointment when I recently discovered that the same thing still holds true.”
I find this assertion hard to swallow. It is the only point of contention I have raised thus far in this discussion. Since you’ve attacked me for making that criticism, I can only assume you disagree with it.
I have not disagreed with the general theme of his article, although I do think he is exaggerating the problems, dramatically understating the improvements in modern software and looking back with rather rose-tinted glasses.
I’ve got a good mix of machines at home that I regularly use. I couldn’t *dream* of running the workload I do now on a machine ten years old, let alone fifteen (and I have examples of both high end and low end machines from those eras).
Let’s say said GHz+ machine is constantly thrashing; it will hiccup while typing.
I’ll admit I was assuming a basic level of journalistic integrity and that he wouldn’t have deliberately set up a benchmark to fail just so he could write a column about it.
Namely attributed to the bloat that the author is talking about.
The author isn’t talking about bloat, he’s talking about more modern software. Added to that, bloat isn’t going to cause stuttering in text input on that class of machine – unless it’s being *absolutely hammered* by something else (which hardly makes for a fair criticism).
But the author’s point is why system requirements have gotten so high over the years for simple things like email and word processing.
Because they do a lot more.
You can get acceptable performance for basic word processing, email and web tasks, even using current OSes and software, on hardware over 5 years old.
Even if said machine got a boost in RAM, it would be inadequate for the general set of Apps in a year or two.
I doubt that. A comfortable level of RAM for typical use has been pegged at the 256MB – 512MB mark for several years now. I can’t see a gig of RAM being “not enough” in much under five years for a typical user.
All the author is saying is that apps are getting more complex as the days go on, and current hardware trends and software complexity at any given time make for a similar user experience in terms of subjective performance metrics.
Something I haven’t actually disagreed with.
Of course, something the author seems to have missed is that one of the main reasons the system doesn’t seem to get faster is because most of the time the bottleneck is the person using it.
Where in the article did the author quote the specs of his machine were GHz+ and a Gig of Ram? Did you just pull those numbers out of your ass, like you usually do?
“How is it possible that a machine with a full gigabyte of memory can run out of room to run applications just as quickly as a machine with six megabytes of memory did 15 years ago?”
“The worst days of this trend seems to be behind us now: most word processing programs started to keep up with even good typists somewhere around the 1-Ghz clock-speed mark.”
Because they do a lot more.
Gee. Wasn’t that the whole point of the article.
You can get acceptable performance for basic word processing, email and web tasks, even using current OSes and software, on hardware over 5 years old.
Define acceptable. You don’t find OS Xs performance acceptable, I do. Get the point.
Added to that, bloat isn’t going to cause stuttering in text input on that class of machine – unless it’s being *absolutely hammered* by something else (which hardly makes for a fair criticism).
The author’s comment is on word processor performance, not text input in general. He contends that all the automated systems like spell checkers and Clippy sucked up all the extra cycles. One might say that the little annoying office assistant is bloat.
I doubt that. A comfortable level of RAM for typical use has been pegged at the 256MB – 512MB mark for several years now. I can’t see a gig of RAM being “not enough” in much under five years for a typical user.
Almost everyone’s predictions of future RAM requirements have always proven wrong.
Of course, something the author seems to have missed is that one of the main reasons the system doesn’t seem to get faster is because most of the time the bottleneck is the person using it.
What???? The user is the bottleneck?? Oh like how I said a few articles ago while we had that whole OS X performance debate, that you were the bottleneck and not OS X. So you tend to agree with me, thanks.
“The worst days of this trend seems to be behind us now: most word processing programs started to keep up with even good typists somewhere around the 1-Ghz clock-speed mark.”
Why did you leave out the subsequent line that gave the above statement context?
These days, it’s the automatic features on these programs that can slow down your system.
Gee. Wasn’t that the whole point of the article.
Sort of. The point of the article seemed to be saying the computers aren’t really doing much more at all from a user perspective.
Define acceptable. You don’t find OS Xs performance acceptable, I do. Get the point.
Well, then, a 366 Celeron running XP is going to fly.
“Acceptable”, in this circumstance, is meeting the requirement I was talking about – namely keeping up with a typist.
The author’s comment is on word processor performance not text input in general.
“About 10 years ago I remember people complaining that Microsoft Word was too slow on the Mac. You could type faster than the processor handled input on such a large application. Imagine my disappointment when I recently discovered that the same thing still holds true.”
“Recently” implies within at least the last few years. If a machine that was current within the last few years (which would be faster than the 1Ghz machine mentioned later) can’t keep up with text entry, it’s broken.
He contends that all the automated systems like spell checkers and Clippy sucked up all the extra cycles. One might say that the little annoying office assistant is bloat.
One might, if there weren’t people who find it useful.
The point I’m trying to make is that even *with* all that rubbish going on in the background, if a 1Ghz, 1GB machine can’t keep up with a typist, it’s broken.
Almost everyone’s predictions of future RAM requirements have always proven wrong.
That depends on how far ahead they’ve been predicting.
What???? The user is the bottleneck??
Most of the time.
Oh like how I said a few articles ago while we had that whole OS X performance debate, that you were the bottleneck and not OS X. So you tend to agree with me, thanks.
No.
Why did you leave out the subsequent line that gave the above statement context?
Because they don’t. Word processors – even with spelling, grammar, etc. checkers running in the background – have been able to keep up with fast typists since long before 1GHz machines hit the market. Additionally, that’s with *today’s* most recent OS & software – go back to software that was current when 1GHz machines were current and his assertions are even sillier.
These days, it’s the automatic features on these programs that can slow down your system.
If a machine of the speed he’s talking about can’t keep up with a typist, even with the various background checkers turned on, it’s seriously broken.
Well, then, a 366 Celeron running XP is going to fly.
Only if you throw it off a space shuttle. My 700MHz Celeron is nowhere close to acceptable running XP; it is powered off and has been for more than a year.