I can remember seeing my first calculator in the 1970s. A teacher showed it to us in school. Up to then, all we had seen were mechanical adding machines. The calculator amazed us: it was silent, it was instantaneous, and it even had a square root key, a function no adding machine I had seen could perform. The teacher explained that soon every home would have a computer. I couldn’t believe it; computers were huge and filled rooms. Surely even a home computer would take up a living room. He was right, though: by 1977 we had home computers that weren’t much bigger than a keyboard.
The first “home computers” (the Apple II, Commodore 64, and TRS-80) usually had a command-line interface and came with the BASIC programming language. Programs came on a disk, for those who had a drive, or on a plug-in cartridge. Those first home computers were used for a variety of tasks: word processing, sound analysis, decoding weather satellite pictures; some were even turned into heart monitors. The BASIC programming language (which doubled as the OS, in a way) let users make their computers do whatever they wanted. It seemed there was nothing you couldn’t do with a computer, and users were constantly coming up with new ideas.
Then in 1983 Apple introduced the Lisa. The Lisa brought the mouse, the GUI, icons, an office suite, and the ability to run several programs at once, with true preemptive multitasking. Then in 1985 came the Amiga, which introduced the concept of multimedia.
That was 1985; what about now? Is there anything truly new in operating systems these days? Before you say “of course!”, consider that most of the features of Windows XP and Mac OS X could be found on the Lisa or Amiga, way back in the mid-1980s. Yes, there have been continual changes, improvements, and updates, along with gains in OS stability and graphics. Still, is there anything really new? Has there been a “quantum leap” like the home computers themselves, or the GUI, in the past 17 years? When you think about it, there really hasn’t. We’re still using the same desktop-based GUI that the Lisa had in 1983. Yes, graphics have improved, but computers like the Amiga and Atari ST had multimedia and sound processing in 1985, and basic 3D games existed by 1984.
Suppose you could get in a time machine and take one of today’s computers back to a computer user of 1985. What would they think? They would likely be impressed with the gaming and graphics capability, but I think they would be disappointed to find that we’re still using the desktop-based GUI, along with the keyboard and mouse. Back in 1985 there was much talk about artificial intelligence, voice recognition, and the like; the possibilities for future computers seemed endless. Given the progress from the command-line computers of 1980 to the graphical computers of 1985, I think most people would have expected much more progress in the 17 years since.
Part of the problem was mismanagement by the companies that made the Lisa/Mac and the Amiga. Although these computers were advanced, they were never given the chance to become a market standard, mostly due to the mistakes of the companies that made them. They were also ahead of their time: there were no PowerPoint presentations yet, so businesses didn’t appreciate the graphics capability of the Mac and Amiga.
Although the Macintosh found a place in schools and businesses, internal problems and changes at Apple kept it from progressing much after 1985. Apple had planned to ship a next-generation OS (along the lines of OS X or BeOS) for the Mac by the early 1990s, but numerous mistakes and problems, along with management changes, kept that from happening. So Mac users had to wait until 2001 to see the new OS.
Second, with all the incompatible operating systems in existence, businesses were waiting for a standard, and the IBM PC became that standard. The problem was that the IBM PC was introduced as a business machine, not a graphics machine. It was mostly a text machine with limited graphics capability. (There was a GUI for the IBM PC called GEM, made by Digital Research. It came out shortly after the Mac did and basically provided a Mac-like GUI for the PC. However, it never took off.)
With the IBM PC and DOS becoming the standard by the late 1980s, much of the progress represented by the Lisa/Mac and Amiga was lost. The IBM PC needed to catch up. It wasn’t until OS/2 and Windows 95 that the PC finally had a desktop GUI, multitasking, decent multimedia capabilities, and so on; in other words, features that had existed on the Lisa, Mac, and/or Amiga ten years before.
Still no progress?
So here it is, 2002, and we’re still using operating systems based on 17-year-old concepts (actually, the GUI, the mouse, Ethernet, and more were developed at Xerox even earlier). It seems the computer industry has lost its vision. Computers have become “business machines,” and the status quo prevails. Where is the imagination and vision of the 1980s? The possibilities seemed endless. Computers were being programmed to talk, and to communicate in plain English via text.
A few hopeful signs have appeared recently. Work is being done on a full-fledged pocket-sized computer; add LCD glasses as your display and a compact or virtual keyboard, and we’ll have computers that go with us anywhere. Microsoft has also been working on a 3D-style GUI intended to update the desktop GUI for the modern Internet/email/multimedia world.
So, what should we have by now?
You might ask, “Well, what did you think we should have by now?” A good question. How about:
An improved GUI. The desktop-metaphor GUI was developed when most people used their computers for office functions: writing letters, doing spreadsheets, etc. Much of the time users ran just one program at a time, and the Internet and email were virtually unknown. Today that has all changed. It is typical to have several programs running at once: to be on the Internet, checking email, and writing a letter all at the same time, each program perhaps with several documents open. The desktop GUI still works, but it needs an update.
You would think that by now multiple desktops would be standard. These allow users to have a desktop for each task group: games, office work, graphics work, etc. It would also be nice to have desktops with zoom and pan capability. In fact, instead of being forced to do all the pointing and clicking, it would be nice if the GUI worked like a game: you could “travel” through folders and zoom in on documents until they fill the screen.
How about a 3D analogy, where you “turn” to your right or left to access an Internet panel or an email/fax panel? A 3D spatial layout could make it easier to keep track of many items, even those off screen. Window management could also be improved: it would be nice to be able to rotate and angle windows to fit more on the screen at once.
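To make the multiple-desktops idea concrete, here is a minimal sketch, in Python, of desktops as named task groups, each holding its own set of open windows. The class and method names are invented for illustration; no real windowing API is involved.

```python
# Hypothetical sketch: multiple desktops ("workspaces") as named task
# groups. Switching workspaces changes which group's windows are visible.

class Workspace:
    def __init__(self, name):
        self.name = name
        self.windows = []          # titles of windows open in this workspace

    def open_window(self, title):
        self.windows.append(title)

class Desktop:
    def __init__(self):
        self.workspaces = {}
        self.current = None

    def add_workspace(self, name):
        self.workspaces[name] = Workspace(name)
        if self.current is None:
            self.current = name    # first workspace becomes the active one

    def switch_to(self, name):
        self.current = name

    def visible_windows(self):
        # Only the active task group's windows are shown.
        return self.workspaces[self.current].windows

desk = Desktop()
for name in ("office", "graphics", "games"):
    desk.add_workspace(name)

desk.workspaces["office"].open_window("letter.doc")
desk.workspaces["graphics"].open_window("logo.psd")

desk.switch_to("graphics")
print(desk.visible_windows())    # prints ['logo.psd']
```

The point of the design is that each task group keeps its own clutter: switching from “office” to “graphics” hides the letter and shows only the graphics windows.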
I would also have expected the OS and applications to be more seamless by now. Instead of OLE links between word-processing and spreadsheet programs, I would have expected programs to “morph” into a word processor or spreadsheet as needed. [Gobe Software does have an office application that does this, and Apple did try to introduce the concept as OpenDoc a few years ago.]
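The “morphing” idea above can be sketched as a compound document, in the spirit of OpenDoc: one document holds parts of different kinds, and the editor is chosen by the active part rather than by the file type. The part kinds and editor names below are invented for illustration and don’t correspond to any real API.

```python
# Hypothetical sketch of a compound document: the application "morphs"
# into whichever editor the active part calls for.

EDITORS = {
    "text": "word processor",
    "table": "spreadsheet",
    "image": "drawing tool",
}

class Part:
    def __init__(self, kind, content):
        self.kind = kind
        self.content = content

class CompoundDocument:
    def __init__(self, parts):
        self.parts = parts

    def edit(self, index):
        # The editor follows the part, not the file type.
        part = self.parts[index]
        return f"Editing {part.kind} part with the {EDITORS[part.kind]}"

doc = CompoundDocument([Part("text", "Dear Sir..."),
                        Part("table", [[1, 2], [3, 4]])])
print(doc.edit(1))   # prints "Editing table part with the spreadsheet"
```

Click into the letter and you get word-processing tools; click into the embedded table and the same document is suddenly a spreadsheet.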
As for command lines, I would have expected them to have gotten past arcane, hard-to-remember commands. Computers should have enough AI capability to recognize plain English to a degree. I think by now I should be able to type “place the folder called ‘bills’ in the folder called ‘finances’” without typing the whole directory path for each folder, and the computer should know how to do it. Or how about typing “change my internet access number to ……….” and not needing to go to a special panel to change the number?
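The first of those commands is within reach even without real AI: the shell just has to match a sentence pattern and locate the folders by name itself. Here is a minimal sketch in Python; the phrasing pattern and helper names are invented for illustration, and a real implementation would need to handle ambiguous names.

```python
# Hypothetical sketch of a "plain English" file command: find folders by
# name anywhere under a root directory and move one into the other, so the
# user never types full directory paths.
import os
import re
import shutil

def find_folder(root, name):
    """Return the path of the first directory under root with this name."""
    for dirpath, dirnames, _ in os.walk(root):
        if name in dirnames:
            return os.path.join(dirpath, name)
    return None

def handle(command, root="."):
    # Recognize: place [the] folder called "X" in the folder called "Y"
    m = re.match(r'place (?:the )?folder called "(.+)" in the folder called "(.+)"',
                 command)
    if not m:
        return "Sorry, I don't understand."
    src = find_folder(root, m.group(1))
    dst = find_folder(root, m.group(2))
    if src is None or dst is None:
        return "Couldn't find one of those folders."
    shutil.move(src, dst)   # moving onto an existing dir places src inside it
    return f"Moved {src} into {dst}."
```

A session might look like `handle('place folder called "bills" in the folder called "finances"', root="/home/roger")`: the program walks the directory tree, finds both folders wherever they live, and moves one into the other.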
There you have it: several ideas that could have been implemented in Windows or the Mac OS years ago, a few of which are only being considered now. As I said, it seems the problem is that the computer industry has lost its visionaries, and maybe its “wonder factor.” We no longer see computers as tools that can do anything we want, but as machines that do the standard jobs: word processing, graphics, email, etc. Without the visionaries, there has been little progress.
About the Author:
Roger M. is an engineer who works in Southern California. He has owned computers since the days when 64K was considered a lot of memory. He is interested in computers, cars, and pizza. Roger can be contacted at firstname.lastname@example.org