Linked by Thom Holwerda on Mon 12th Jan 2015 20:15 UTC
Intel

It came out in 1974 and was the basis of the MITS Altair 8800, for which two guys named Bill Gates and Paul Allen wrote BASIC, and millions of people began to realize that they, too, could have their very own, personal, computer.

Now, some 40 years after the debut of the Intel 8080 microprocessor, the industry can point to direct descendants of the chip that are astronomically more powerful. So what's in store for the next four decades?

Forty years old. The industry and technology sure have changed since then.

Comment by drcouzelis
by drcouzelis on Mon 12th Jan 2015 20:59 UTC
drcouzelis Member since:
2010-01-11

Speaking of the Intel 8080 microprocessor...

Last year I read the book Code: The Hidden Language of Computer Hardware and Software by Charles Petzold. It was absolutely fascinating. The book begins over 100 years ago with two tin cans connected by a string and continues "upgrading" it until it's a modern microprocessor (including the history of the 8080). Computers went from being "a big box of magic" to something I can actually understand.

It was enlightening and an enjoyable read. I highly recommend it. ;)

Edited 2015-01-12 21:00 UTC

Reply Score: 7

And the Z80 lives on
by jockm on Mon 12th Jan 2015 22:09 UTC
jockm Member since:
2012-12-22

The Z80, the 8080's successor (though from a different company), lives on. Zilog is still around and producing them, and so are any number of clones and second sources.

The 8080 isn't the only CPU that Intel abandoned only for it to live on. The single most popular microcontroller in use today, the 8051, also came out of Intel.

Reply Score: 3

Doomsday scenario
by WorknMan on Tue 13th Jan 2015 05:00 UTC
WorknMan Member since:
2005-11-13

"There's a lot of concern that we are developing the race that will replace us," adds Enderle, fears that have been articulated by scientists and others, from tech entrepreneur Elon Musk to renowned physicist Stephen Hawking. "We could create something so smart that it could think that it would be better off without us," Enderle adds. "It would see that we're not always rational and fix the problem, either by migrating off-planet as many hope, or by wiping out the human race."

Not everyone agrees with the doomsday scenario.


I don't necessarily see that as a doomsday scenario at all, but more as an extension of evolution. If we can't manage to get our shit together and stop destroying ourselves (and the planet as well) like a bunch of goddamn barbarians, perhaps the best we can do is create successors who are smarter than we are. What a way to leave a legacy behind, eh? A legacy of sentient machines that could potentially unravel the mysteries of the universe, even if they have to put us out of our own misery first.

Reply Score: 3

RE: Doomsday scenario
by leech on Tue 13th Jan 2015 05:44 UTC in reply to "Doomsday scenario"
leech Member since:
2006-01-10

Well, I'm kind of fond of the whole 'ancient astronaut' theory. Basically: some beings from another planet or solar system colonized this Earth, needed some workers, created us, and were then either overthrown by us or killed off by their own inner conflicts...

That leaves us to be destroyed by our own inner conflicts, except that instead of creating organic slaves, we create robotic ones that end up overthrowing us.

So it's just "history" repeating itself, which happens all the time. ;)

Reply Score: 2

Comment by neticspace
by neticspace on Tue 13th Jan 2015 09:43 UTC
neticspace Member since:
2009-06-09

I miss old-school microprocessors with a simple, understandable CPU design. Or is it just me?

Reply Score: 4

RE: Comment by neticspace
by drcouzelis on Tue 13th Jan 2015 13:07 UTC in reply to "Comment by neticspace"
drcouzelis Member since:
2010-01-11

The fact that modern processors can only be created and understood by using a computer gives me the willies.

Reply Score: 4

RE[2]: Comment by neticspace
by levi on Tue 13th Jan 2015 14:22 UTC in reply to "RE: Comment by neticspace"
levi Member since:
2006-09-07

That's nothing!

What is truly scary is that at some point computers will become too complicated for humans to program. Guess what the solution to that will be...

muhahaha, MUHAHAHAHAHA !

Reply Score: 3

RE[2]: Comment by neticspace
by tylerdurden on Tue 13th Jan 2015 17:20 UTC in reply to "RE: Comment by neticspace"
tylerdurden Member since:
2009-03-17

Then you may be in a constant state of getting the willies throughout the day: when you take a plane, when you get in a car, when you cross the street at a traffic light, when you buy something at the supermarket...

Reply Score: 0

RE: Comment by neticspace
by theTSF on Tue 13th Jan 2015 20:10 UTC in reply to "Comment by neticspace"
theTSF Member since:
2005-09-27

We do, until we realize that the things we now take for granted are far too complex for those chips to handle.

I had an 8086 PC, and those computers were simple enough to program rather easily, because you knew what was going on. No complex OS to get in the way. Drivers? Pfff. Your hardware followed the standards or you wrote your own.
However, there are a lot of things you would probably miss:

CGA. In text mode it wasn't too bad: 16 foreground colors, 8 background colors, and text up to 80x25.
In graphics mode you had 320x200 with 4 of the worst color combinations, or you could go 640x200 in two colors (black plus one foreground color chosen from the 16-color palette).
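
(For the curious, here's a rough sketch in plain C, with names of my own invention, of how that text-mode color scheme was packed into the CGA attribute byte: four bits of foreground, three of background, and the top bit for blink. On the real hardware you wrote the character/attribute pair straight into video memory at segment B800h; this just computes the byte.)

#include <stdint.h>
#include <stdio.h>

/* CGA text-mode attribute byte: bit 7 = blink (or a bright background if
   blink was disabled), bits 6-4 = background (8 colors), bits 3-0 =
   foreground (16 colors). Each character cell in video RAM was two bytes:
   the character code followed by this attribute. */
static uint8_t cga_attribute(uint8_t foreground, uint8_t background, int blink)
{
    return (uint8_t)((blink ? 0x80u : 0x00u)      /* bit 7: blink          */
                   | ((background & 0x07u) << 4)  /* 8 background colors   */
                   |  (foreground & 0x0Fu));      /* 16 foreground colors  */
}

int main(void)
{
    /* Bright white (15) on blue (1), no blink -> 0x1F, the classic DOS look. */
    printf("attribute = 0x%02X\n", cga_attribute(15, 1, 0));
    return 0;
}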

No multi-tasking. You run one program and that's it.

And that CPU is slow. Even in text mode refreshing is slow; you can see text draw itself. Processing data, even just a few thousand calculations, takes a fair amount of time.

The 640K memory barrier. That's right, 640K is all an application gets. 640K is actually a fair amount of data, but you have to manage it very carefully or you will hit the limit.

16-bit. Numbers in the millions and billions mean extra programming.
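
To illustrate that last point, here's a rough sketch in C (the struct and the add32 helper are names of my own choosing, purely for illustration) of what you, or your compiler's long type, had to do to add numbers bigger than 65,535 on a 16-bit CPU: split them into high and low words and carry by hand.

#include <stdint.h>
#include <stdio.h>

/* A 32-bit value held as two 16-bit words, the widest thing an 8086 register could hold. */
struct u32_words {
    uint16_t lo;
    uint16_t hi;
};

/* Add two 32-bit values one 16-bit word at a time, propagating the carry by
   hand, which is what the 8086's ADD/ADC instruction pair did for you. */
static struct u32_words add32(struct u32_words a, struct u32_words b)
{
    struct u32_words sum;
    uint32_t lo = (uint32_t)a.lo + b.lo;            /* low words, may carry  */
    sum.lo = (uint16_t)lo;
    sum.hi = (uint16_t)(a.hi + b.hi + (lo >> 16));  /* high words plus carry */
    return sum;
}

int main(void)
{
    struct u32_words a = { 0xE360, 0x0016 };  /* 1,500,000 (0x0016E360) */
    struct u32_words b = { 0x25A0, 0x0026 };  /* 2,500,000 (0x002625A0) */
    struct u32_words s = add32(a, b);
    printf("sum = 0x%04X%04X = %lu\n", (unsigned)s.hi, (unsigned)s.lo,
           ((unsigned long)s.hi << 16) | s.lo);     /* 4,000,000 */
    return 0;
}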

Reply Score: 3