It came out in 1974 and was the basis of the MITS Altair 8800, for which two guys named Bill Gates and Paul Allen wrote BASIC, and millions of people began to realize that they, too, could have their very own, personal, computer.
Now, some 40 years after the debut of the Intel 8080 microprocessor, the industry can point to direct descendants of the chip that are astronomically more powerful. So what’s in store for the next four decades?
Forty years old. The industry and technology sure have changed since then.
Speaking of the Intel 8080 microprocessor…
Last year I read the book Code: The Hidden Language of Computer Hardware and Software by Charles Petzold. It was absolutely fascinating. The book begins over 100 years ago with two tin cans connected by a string and continues “upgrading” it until it’s a modern microprocessor (including the history of the 8080). Computers went from being “a big box of magic” to something I can actually understand.
It was enlightening and an enjoyable read. I highly recommend it.
The Z80, the 8080's successor (though from a different company), lives on. Zilog is still around and producing them, along with any number of clones and second sources.
The 8080 isn’t the only CPU that Intel abandoned only for it to live on. The single most popular microcontroller in use today, the 8051, also came out of Intel.
I don’t necessarily see that as a doomsday scenario at all, but more of an extension of evolution. If we can’t manage to get our shit together and stop destroying ourselves (and the planet as well) like a bunch of goddamn barbarians, perhaps the best we could do is to create successors who are smarter than we are. What a way to leave a legacy behind, eh? A legacy of sentient machines that could potentially unravel the mysteries of the universe, even if they have to put us out of our own misery first.
Well, so I’m kind of fond of the whole ‘ancient astronaut’ theory. So basically, if some beings from another planet/solar system colonized this earth, needed some workers, created us, were either overthrown by us, or killed off due to their own inner conflicts…
That leaves us to be destroyed by our own inner conflicts, except instead of creating organic slaves, we create robotic ones that end up overthrowing us.
So it’s just “history” repeating itself, which happens all the time.
I miss old-school microprocessors that work under a simple CPU design. Or is it just me?
The fact that modern processors can only be created and understood by using a computer gives me the willies.
That’s nothing!
What is truly scary is that at some point computers will become too complicated for humans to program them. Guess what will be the solution to that …
muhahaha, MUHAHAHAHAHA !
Then you may be in a constant state of getting the willies throughout the day: when you take a plane, when you get in a car, when you cross the street at a traffic light, when you buy something at the supermarket…
We do, until we realize that the things we now take for granted are far too complex for those to handle.
I had an 8086 PC, and those computers were simple enough to program rather easily, because you knew what was going on. No complex OS to get in the way. Drivers? Pfff. Either your hardware followed the standards or you wrote your own.
However, there are a lot of things you would probably miss.
CGA. In text mode it wasn’t too bad: 16 foreground colors, 8 background colors, and text up to 80×25.
In graphics mode you had 320×200 with four of the worst color combinations, or 640×200 in two colors (well, you could choose which two of the 16-color palette to use).
No multitasking. You ran one program, and that was it.
And that CPU was slow. Even in text mode, refreshing was slow: you could see text draw itself. Processing data, even just a few thousand calculations, would take a fair amount of time.
The 640K memory barrier. That’s right, 640K was all you would ever need for an application. 640K is actually a fair amount of data, but you had to be very careful as you approached the limit.
16-bit. Numbers in the millions and billions meant extra programming.