10ZenMonkeys has interviewed Steve Wozniak. When asked about Bill Gates, he replied: “I’ve only spoken with him briefly a couple of times. I admire him, he admires me. Good lord, I’d never written a computer language when he had written a BASIC in the early days of hobby computers. And I thought, ‘Oh my gosh – a computer with BASIC finally makes a computer that people can use for things’.”
Many credit Woz as being the father of personal computing. And while he is a brilliant man and a decent human being, he really does not deserve that title. I think Chuck Peddle, designer of the MOS Technology 6502, deserves it. If it weren't for the 6502, the Apple II would have been delayed or would have cost more, since the equivalent MC6800 was drastically more expensive. I am not sure why Peddle's name is so often overlooked by those who document computer history.
The 6502 is a processor, not a personal computer.
I don’t think gdanko said it was.
Perhaps, but that also invalidates his claim that Peddle should be called the “father of personal computing”…I do think that Woz and Jobs are more deserving – as much as anyone can receive that kind of credit – because they used the technology available but assembled it and packaged it in a way that made it accessible to everyone (who had the money for it – the Apple II was great, but it was quite expensive).
Yes, the 6502 series was designed and intended for "embedded systems"; therefore a maximum of 256 bytes of stack space, no ready-made 16-bit instructions, etc., was considered appropriate… Also, it's Zilog, not Zylog.
Don’t take this the wrong way; the 6502 certainly was elegant in its simplicity, using very few transistors (the 65C02 has more, however).
The 6502 was a work of genius. It was created by a clutch of disgruntled engineers who had quit Motorola. It is a processor made by engineers, for engineers, without the corporate bullcrappen that interfered at Motorola.
Anybody who has programmed it knows this. It is beautifully designed, logical yet fun, free of over-engineering.
The proof is simply that people still code for it today on Commodore 64s, pushing the chip that little bit further each time, even 32 years later.
Edited 2007-07-06 21:26
Unfortunately, as these things go, the title of ‘mother’ or ‘father’ of technology often goes not to the first to do it, but the first to foster it, promote it until it worked.
I am slightly biased here (the Woz has been a personal hero of mine since I was about 10), but I disagree. The shift from do-it-yourself kits to pre-made boards was something that wouldn’t have happened for quite a while longer if it hadn’t been for the Steves. And while the 6502 definitely played a part, it was the Woz’s downright genius at computer design that let the Apple ship at the price it did. I really can’t link to all the various bits of Woz lore I have read over the years, but he was doing things back then that no one had even begun to think of (like software device drivers instead of dedicated controller circuitry).
I think it is important to distinguish between the original inventor of an item and the first one to make an item popular. Far too often, the original inventor gets little or no credit.
Here is a photo of the Xerox Alto which first appeared in 1973: http://toastytech.com/guis/altosystem.jpg
It sure looks like a PC to me. Maybe the box is a little bigger than typical, and the price might have been prohibitive. However, in light of the 1973 Alto, how can anyone maintain that Steve Wozniak/Jobs was the “inventor” of the computer that has a monitor, a keyboard and a box that sits under/on-top-of a desk?
In addition, the 1973 Alto used a GUI! — Steve Jobs and Steve Wozniak didn’t offer a GUI until ten years later. And, incidentally, the Alto GUI had icons, a three-button mouse, floating, hierarchical menus (identical to drop-down menus except for their screen position), etc. Here are a couple of screenshots:
http://toastytech.com/guis/altorainbow.jpg
http://www.digibarn.com/collections/software/alto/alto-cedar-enviro…
I tend to favor the original inventor over the profiteers.
They weren’t; such computers existed long before as soldering kits. What they did was make a product that was usable for the average Joe, and get the general public using something that hadn’t existed outside of big business and universities.
It is often implied that Apple ripped off Xerox; they didn’t. They licensed the ideas after being shown some of the prototype work Xerox was doing at the time.
And comparing the Alto to the Mac OS is apples and oranges. It was the work of one of the fathers of usability, Jef Raskin, and later the Tog (Bruce Tognazzini), that made the UI such a joy to use. They did not invent the desktop metaphor, but they did invent the billion and one features that set Mac OS apart from it.
As for the three-button mouse, Jef explained why he went with one button in The Humane Interface. Back then, the mouse was a foreign interface that no one knew how to use. The Alto had two buttons: one for select, the other to activate. Jef figured that introducing such a radical new way of interfacing with the computer was going to take a lot of adjustment for people anyway, and wanted to simplify the device so that it could be more easily manipulated. Thus, select became single click, and activate became double click.
Looking back, he says that it was a mistake. First off, double clicks probably made the mouse more difficult to use than two buttons would have. He says his main problem with two buttons was people getting mixed up about which did what, but now he would have just labeled them. Sure, the labels would have worn off, but by the time they did, the user would be proficient.
There are reasons behind the changes; it wasn’t an intentional step backwards. And there were plenty of things in the design that were flat-out crazy, like device drivers for the disk drive, or intentionally using artifacts in the display to reduce the framebuffer size required for the number of colors shown.
They weren’t; such computers existed long before as soldering kits. What they did was make a product that was usable for the average Joe, and get the general public using something that hadn’t existed outside of big business and universities.
Okay. So we agree that neither Steve Jobs nor Steve Wozniak invented the personal computer.
However, we seem to disagree on the form in which earlier personal computers were offered. Undoubtedly, there were “Heathkit-like” sets around in 1975, but I am pretty sure that the Alto of 1973 was not offered as a “soldering kit.” Also, the 1973 Alto had a GUI, so it was designed to be used by the “average Joe.”
By the way, here is an early Heathkit computer: http://www.heathkit-museum.com/computers/hvmec-1.shtml
It is often implied that Apple ripped off Xerox; they didn’t. They licensed the ideas after being shown some of the prototype work Xerox was doing at the time.
In regards to my original point, Xerox developed the GUI long before Apple, and, thus, Xerox should get credit in the eyes of the world. It is another matter as to whether Apple licensed the GUI or ripped it off.
And comparing the Alto to the Mac OS is apples and oranges. It was the work of one of the fathers of usability, Jef Raskin, and later the Tog (Bruce Tognazzini), that made the UI such a joy to use. They did not invent the desktop metaphor, but they did invent the billion and one features that set Mac OS apart from it.
These assertions are subjective.
In regard to usability and operation, I see very few significant developments in computer GUIs since the early 1980s (pre-Lisa and pre-Mac).
Jef Raskin’s usability contributions involved only computer GUIs, so I would not call him one of the “fathers of usability.” Likewise with Bruce Tognazzini, although I don’t recall anything particularly important coming from him. A better candidate for such a general usability title would be someone like Donald Norman.
Please be more specific as to what Raskin and/or Tognazzini did “that made the GUI such a joy to use,” and please be more specific about a few of “the billion and one features that set Mac OS apart from” the “desktop metaphor.”
I can think of several usability problems with Apple software and hardware.
As for the three-button mouse, Jef explained why he went with one button in The Humane Interface. Back then, the mouse was a foreign interface that no one knew how to use. The Alto had two buttons: one for select, the other to activate. Jef figured that introducing such a radical new way of interfacing with the computer was going to take a lot of adjustment for people anyway, and wanted to simplify the device so that it could be more easily manipulated. Thus, select became single click, and activate became double click.
This is just one in a long line of examples of Apple making bad design decisions because they think they know what is best for the end user. They over-think things to the detriment of the user. A good designer field-tests for usability before making design commitments. Of course, a mouse with two or three buttons is better than a one-button mouse, as many Mac users are just now learning.
And, by the way, the mouse was not a “radical new way of interfacing” — it was invented at least 15 years earlier, in 1963, by Dr. Doug Engelbart: http://www.afrlhorizons.com/Briefs/Mar02/OSR0103.htm
In addition, the Alto mouse had three buttons, not two: http://www.netclique.net/oldmouse/mouse/xerox/alto.shtml
He says his main problem with two buttons was people getting mixed up about which did what, but now he would have just labeled them. Sure, the labels would have worn off, but by the time they did, the user would be proficient.
A “father of usability” would use a label on mouse buttons?
There are reasons behind the changes; it wasn’t an intentional step backwards. And there were plenty of things in the design that were flat-out crazy, like device drivers for the disk drive, or intentionally using artifacts in the display to reduce the framebuffer size required for the number of colors shown.
I do not understand to what you are referring.
Edited 2007-07-08 21:14
Steve Wozniak certainly didn’t invent the PC as a concept. Miniaturization had been going on for a number of years with the transistor and the silicon chip, so it was only a matter of time before someone produced a computer that could sit on a desk. However, I do give credit to Woz for his technical ability. The Apple II was one of the first home computers that a regular person could buy and use without needing to solder on their own keyboard and transformer (I also thought it was the first colour PC as well, though I might be wrong on that).
The Xerox Alto was a revolutionary machine, and something of a landmark in computing; however, it was never for sale to the general public, and only a limited number were made (something like 50), so I don’t think it can be classed as a PC as such, even though it has many properties of a modern PC.
From what I understand, the parts alone cost something like $25,000, and the GUI wasn’t complete. Some of it was “proof of concept,” and a number of the applications were still launched from the command line.
Steve Wozniak certainly didn’t invent the PC as a concept.
And Steve Wozniak definitely did not invent the PC as a physical, operational, ready-made, complete device — Xerox already had such a device several years before Wozniak/Apple.
Miniaturization had been going on for a number of years with the transistor and the silicon chip, so it was only a matter of time before someone would produce a computer that could sit on a desk.
Agreed. And the complete computer that could sit on the desk had been around years before Wozniak/Apple were making computers.
… however I do give credit to Woz for his technical ability.
No doubt, Wozniak was a master of computer electronics.
The Apple II was one of the first home computers that a regular person could buy and use without needing to solder on their own keyboard and transformer (I also thought it was the first colour PC as well, though I might be wrong on that).
The first computer that had a monitor, keyboard, a small box and a GUI with a mouse was unveiled by Xerox in 1973. The Apple II was first shown in 1977, and it lacked a GUI and a mouse.
The Xerox Alto was a revolutionary machine, and something of a landmark in computing…
Interesting. The Alto was “revolutionary,” but only “something” of a landmark?
…however it was never for sale to the general public, and only a limited number were made (something like 50), so I don’t think it can be classed as a PC as such, even though it has many properties of a modern PC.
So, if Steve Wozniak had created a single G5 Mac in 1964 and kept it in his closet until now, the G5 in his closet could not be considered a PC?
How an invention is offered/sold/marketed has nothing to do with the nature of the device. The Xerox Alto had all of the basic properties of the later, mass-marketed PCs, with the added bonus of a GUI! The Alto was a PC, far ahead of its time.
From what I understand, the parts alone cost something like $25,000, and the GUI wasn’t complete. Some of it was “proof of concept,” and a number of the applications were still launched from the command line.
So what? The Alto was still an advanced PC, regardless of its price. “Proof of concept” proves that something works, and, thus, the “proof of concept” parts were already invented — long before Wozniak/Apple.
Even if the Alto required some command-line interaction, it was nonetheless the first to demonstrate hierarchical menus, icons, floating windows, etc. — many years before Apple even had a GUI.
Edited 2007-07-09 16:52
How an invention is offered/sold/marketed has nothing to do with the nature of the device. The Xerox Alto had all of the basic properties of the later, mass-marketed PCs, with the added bonus of a GUI!
Good point, but I don’t see the Alto as an “invention” as such, and if we’re going to be absolute, then the PDP-8 could be considered a personal computer (minus the GUI), or the Imlac PDS-1. They were contained in relatively small cabinets, could be used by a single person, and were both around before the Alto. Hell, there was probably some obscure computing device from the 1950s that could be described as “personal,” but it’s not what I’d consider to be a PC.
I have enormous respect for the Alto, but I can’t consider it a “personal computer” in the way that the PET/Apple II/TRS-80 etc. were.
Yes, it had a mouse, GUI, Ethernet, and various other technologies, but it was a high-concept minicomputer used in a research center, rather than a personal device that could be used in the home. I wouldn’t even say that it initiated the home computer market as such.
Kits like the Altair and Mark-8 were what kickstarted the hobbyist movement that grew into today’s industry.
Perhaps my definition of PC is somewhat different from yours, but to me it implies affordable and accessible to the home user; otherwise it’s another tool that remains the preserve of universities/governments/corporations etc.
Edited 2007-07-10 15:54
…I dont see the Alto as an “invention” as such,…
The Alto was not the first computer, but it is probably the first computer to use a true GUI, which would make it a significant invention.
My point throughout this thread has been that credit that should go to the original inventors is usually taken by those who make the invention popular. Steve Wozniak/Jobs may have sold a lot of kits and Apple IIs, but they weren’t even close to being the inventors of the small computer with a keyboard and monitor that could sit on/under a desk. Undeniably, Xerox had that years before Apple, and the Xerox Alto also had a modern GUI.
…and if we’re going to be absolute then the PDP-8 could be considered a Personal computer (minus the gui), or the Imlac PDS-1. They were contained in relatively small cabinets, could be used by a single person, and were both around before the Alto.
Agreed. And all of this progress came many years before Apple computer.
However, the Alto seems to be the only one of the three that used a CPU small enough to fit under a typical desk, and it was the only one to use a modern GUI with icons, hierarchical menus, and floating windows.
Hell, there was probably some obscure computing device from the 1950s that could be described as “personal,” but it’s not what I’d consider to be a PC.
A Univac could have been personal, but I think that most will agree that what constitutes a personal computer is a keyboard, a monitor and a small CPU box that can fit on-top-of/under a desk (and these elements can be separate or combined).
I have enormous respect for the Alto, but I can’t consider it a “personal computer” in the way that the PET/Apple II/TRS-80 etc. were. Yes, it had a mouse, GUI, Ethernet, and various other technologies, but it was a high-concept minicomputer used in a research center, rather than a personal device that could be used in the home.
I agree that the 1973 Alto was advanced, far ahead of anything Jobs/Wozniak offered until 1983.
However, how it was regarded and marketed in the early 1970s has no bearing on what it actually is. Do the users of this Alto seem like research scientists in a laboratory formulating “high concepts”? http://toastytech.com/guis/altokids.jpg
I wouldn’t even say that it initiated the home computer market as such. Kits like the Altair and Mark-8 were what kickstarted the hobbyist movement that grew into today’s industry.
Again, how a device is marketed/offered/sold (and its success or lack of success) has no bearing on the true nature of the device. The Altair and Mark-8 kits may have furthered the computer hobbyist movement, but the fact is that the Xerox Alto had a monitor, a keyboard, a small CPU box and an advanced GUI long before the existence of those kits and long before Apple computer.
Perhaps my definition of PC is somewhat different from yours, but to me it implies affordable and accessible to the home user; otherwise it’s another tool that remains the preserve of universities/governments/corporations etc.
What is affordable and accessible? It sounds as if your definition of a PC would exclude Steve Jobs’s NeXT computer — those units were prohibitively expensive and used mainly by institutions/corporations.
Edited 2007-07-10 18:14
My point throughout this thread has been that credit that should go to the original inventors is usually taken by those who make the invention popular. Steve Wozniak/Jobs may have sold a lot of kits and Apple IIs, but they weren’t even close to being the inventors of the small computer with a keyboard and monitor that could sit on/under a desk. Undeniably, Xerox had that years before Apple, and the Xerox Alto also had a modern GUI.
Oh, I absolutely understand that view; there are many examples in the industry where success is bestowed on the person who exploits the technology rather than the person who created it. (For the record, I was a Commodore kid and cut my teeth on a VIC-20. I don’t regard the Apple II as something that was dropped from the heavens by Saint Woz, but I do have a lot of respect for it as one of the first microcomputers that came pre-assembled for the home market.)
However, the Alto seems to be the only one of the three that used a CPU small enough to fit under a typical desk, and it was the only one to use a modern GUI with icons, hierarchical menus, and floating windows… Again, how a device is marketed/offered/sold (and its success or lack of success) has no bearing on the true nature of the device.
If we were talking about the jet engine, I’d agree with you. If we were talking about a Commodore 64 that was built in 1927, I’d probably agree too, but I think this is where I differ on the Alto. I find it’s in more of a grey area technologically. It may be one of the best examples of an early small computer, but it wasn’t the first. If I were to accept the Alto as a PC, I’d have to accept the PDP-8 as well, and somehow that doesn’t feel accurate.
Technically, the GUI isn’t relevant to whether the Alto is a PC or not, as many PCs didn’t have graphical interfaces in the early days, and even today there are enthusiasts who favour the command line.
If the Alto had been the size of a warehouse, then there’s no doubt it couldn’t be classed as a PC because of its enormous size, regardless of how sophisticated its interface was.
The fact that it was contained in a single cabinet is remarkable, and makes it a contender for the title, but I think defining a PC is about more than how many components can be shoved into a small cubic space. As a matter of fact, I think how it’s shipped to market is important to consider, because any large company with sufficient funds could have poured money into making a one-off desktop computer for the sake of it.
Any car company can pour money into making a one-off vehicle with 1000 bhp; however, the Bugatti Veyron is the only production vehicle with that kind of power (as opposed to some dragster with a rocket on the back).
What is affordable and accessible? It sounds as if your definition of a PC would exclude Steve Jobs’s NeXT computer — those units were prohibitively expensive and used mainly by institutions/corporations.
Well, yes, actually; I think of them as workstations with an advanced OS. I know that, physically speaking, they were PCs contained in a box under the monitor, but they were exclusive to professionals and academics. (Though I’m sure they were capable of being used by children to paint pictures, with the right software.)
The problem is that my definition of “Personal computer” almost intrinsically includes “Home computer”, which is something the Alto missed out on because of bad management decisions at Xerox.
And while the 6502 definitely played a part, it was the Woz’s downright genius at computer design that let the Apple ship at the price it did.
Re-read your history. If memory serves me, the PET shipped for about a third of the price of the Apple II and could be used out of the box, whereas the Apple II gave you a blank stare unless you bought a disk drive, etc. This isn’t to say the Apple was a bad computer… it was not. However, the 1541 was a much better drive than Apple’s offering. The 1541 was crippled in speed because of a stupid mistake that made it to the assembly line. Didn’t someone cut the wrong trace or something? Anyway, if you had two 1541s copying a disk, you could unplug them from the computer and the copy would continue unhindered, because the 1541 actually had a CPU of its own.
Commodore brought affordable computers to the masses. Apple did not. I am a Mac user to this day, but their prices have never really been what I considered competitive.
Woz is a brilliant engineer. His alacrity is only surpassed by his pranks.
The PET existed before the Apple II and was much easier to use. The original Apple II was slightly easier to use than the KIM-1. So Peddle and his PET still beat Woz and his Apple II to the punch.
Actually, the PET and the Apple II both came out in 1977, with the Apple II being available in June and the PET only being sold in September of that year.
Both were predated by the KIM-1, the Apple I and (my favorite) the Altair 8800. Admittedly, these were not quite PCs yet…
Ah, sweet memories…
Actually, both the Apple ][ and the PET came after the Apple I. The Apple I was built in early ’76, or maybe late ’75 (around then)… The Altair was the inspiration, however…
In his book iWoz, Woz acknowledges the Altair and others, although it turns out he’d built a computer even before that, and was even featured in a local newspaper article at the time. Sadly, the guy doing the interview stepped on it, and Woz couldn’t be bothered fixing it; he didn’t see much use for such a low-powered device at the time, I believe.
What Woz says is that his was the first with a QWERTY keyboard and monitor.
Before the Apple I, computers like the Altair had only switches or punch cards to enter data. Larger machines had keyboards and monitors, of course…
I consider it a moral right-ness. I don’t know how to speak for everybody in society about necessities. But I think it’s very honorable and it’s very good for the customers.
I like that quote. Some people who think free software is more moral manage to really turn people off by speaking in absolutes. Woz does a nice job of presenting his opinion and explaining why he holds it, without demonizing those who don’t agree. Tact is a lovely thing.
[ot]Also, always nice to see my alma mater get a mention.[/ot]
“””
Some people who think free software is more moral manage to really turn people off by speaking in absolutes. Woz does a nice job of presenting his opinion and explaining why he holds it, without demonizing those who don’t agree. Tact is a lovely thing.
“””
I absolutely agree. Advocacy is as much about the avoidance of making enemies as about persuading people. Just as it is generally easier to destroy than to create, it is easier for detractors to score points than it is for an advocate to score a convert.
Remember that every enemy created through thoughtless, bad advocacy, is as fully capable of persuading people to his side as the advocate is to persuade people to his. And once he has created more than one detractor, he is outnumbered.
That doesn’t mean that it’s bad to feel strongly about a cause that one believes in. It just means that it is most effective not to come off as a self-righteous, fanatical pest.
Free Software would be in a stronger position today if it had not been for the “help” of its most “enthusiastic” supporters.
Edited 2007-07-07 13:52
Let’s not forget that laser printing, Ethernet, the modern graphical user interface (GUI), and object-oriented programming were invented at Xerox PARC.
Apple didn’t steal the GUI; in fact, Xerox got a large amount of Apple stock.
Edited 2007-07-07 02:47
Somehow the fact that Apple gave Xerox a large portion of Apple stock is always overlooked, unlike what Microsoft did. Xerox was going to dump the research anyway; the company was so bureaucratic (anti-engineer). I feel bad for the engineers at Xerox whose work was never appreciated. No good deed goes unpunished.