The doctrine that computing power doubles every 18 to 24 months has been considered gospel for the past three decades. Now it may be time for a new look.
Does this guy even know what Moore’s law is? He claims:
“The doctrine that computing power doubles every 18 to 24 months has been considered gospel for the past three decades.”
Moore’s original observation was that transistor count doubles every 12 to 24 months, yet somehow this 18-month figure keeps popping up. To quote Moore on the matter:
“I never said 18 months. I said one year, and then two years”
The article itself seems to run around in circles on issues which don’t really relate to Moore’s law. However, the research paper linked to in the article is interesting. It includes this graph of transistor counts for Intel processors, and notes that the rate at which transistor counts have doubled isn’t quite what Moore originally proposed:
http://firstmonday.org/issues/issue7_11/tuomi/figure3.gif
To quote the paper:
“During the first decade of microprocessors, the doubling rate was approximately 22 months but also very irregular. After the introduction of the 80386 processor family, the doubling speed was closer to 33 months.”
Nowadays most seem to agree that transistor count doubles approximately every two years.
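As a rough illustration of how such a doubling time gets estimated, here is a small Python sketch. The dates and transistor counts below are commonly cited approximations, not figures taken from the paper, so the results won’t match its 22/33-month fits exactly.

import math

def doubling_time_months(n1, n2, years_apart):
    # months per doubling, assuming steady exponential growth between the two points
    doublings = math.log2(n2 / n1)
    return 12 * years_apart / doublings

# 8086 (1978, ~29k transistors) -> 80386 (1985, ~275k): roughly 26 months per doubling
print(doubling_time_months(29_000, 275_000, 7))
# 80386 (1985, ~275k) -> Pentium (1993, ~3.1M): roughly 27 months per doubling
print(doubling_time_months(275_000, 3_100_000, 8))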
A more interesting take on the end of Moore’s law comes from Intel chairman Andy Grove in this article:
http://www.theinquirer.net/?article=6677
The major problems lie in power dissipation and leakage. For example, the Itanium 2 has a power dissipation of 130 watts. This fits the type of processor Grove talks of, with a power leakage of 40%, or 60 to 70 watts (the Itanium 2 seems to be a bit less than this).
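A quick sanity check on those numbers (plain arithmetic only, nothing taken from Grove’s article itself):

tdp_watts = 130          # the Itanium 2 power dissipation quoted above
leakage_fraction = 0.40  # the ~40% leakage figure attributed to Grove
print(tdp_watts * leakage_fraction)  # 52 W, i.e. a bit under the 60-70 W range mentioned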
It seems that in the future, with power leakage skyrocketing, we won’t see transistor counts continuing to double; instead, processor designers will have to make more efficient use of transistors to keep increasing processor power.
I thought the article hit the nail right on the head. It doesn’t matter that the interpretation of Moore’s law is not always consistent. What is true is that Intel has led CPUs down a path where marketing benchmarks run 2x faster every 18 months or so, but MS/PCs take half of that away. My 1G Athlon is certainly much faster than my first QL or Beeb, but it isn’t any more responsive when I enter text. It just does 100x more work if I drag windows around.
In addition, memory chips, especially DRAM, were once faster than CPUs back in the 70s; they are now two or more decades behind.
Example:
A 1MHz 6502 vs a 3GHz P4; 4K DRAM with a 600ns cycle vs 256M at an 80ns cycle; hard disk access times of 30ms are now maybe 5ms. Of course the increase in data size has followed Moore’s law very nicely indeed, thank you.
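Putting rough numbers on that gap (these are just ratios of the figures quoted above, not benchmarks):

cpu_clock   = 3_000_000_000 / 1_000_000    # 3 GHz P4 vs 1 MHz 6502   -> 3000x
dram_cycle  = 600 / 80                     # 600 ns vs 80 ns cycle    -> 7.5x
disk_access = 30 / 5                       # 30 ms vs 5 ms access     -> 6x
dram_size   = (256 * 2**20) / (4 * 2**10)  # 256M part vs 4K part     -> 65536x
print(cpu_clock, dram_cycle, disk_access, dram_size)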
For real computing needs those drag factors cut Moore’s law to the bone. In bioinformatics the computing requirements have been shown to be increasing at triple Moore’s law, i.e. DNA data is doubling every 6 months. Waiting for Intel is no longer going to cut it. The answer is to perform computing with direct HW if and when possible.
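To see how quickly that gap opens up, here is straightforward compounding, assuming a 24-month doubling for transistor counts (per the two-year figure above) against the 6-month doubling claimed for DNA data:

years = 5
data_growth = 2 ** (12 * years / 6)   # DNA data: ~1024x over 5 years
cpu_growth  = 2 ** (12 * years / 24)  # transistor counts: ~5.7x over 5 years
print(data_growth / cpu_growth)       # a shortfall of roughly 180x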
end of rant
the future improvements lie not so much in shrinking the die process as in technologies like strained silicon and silicon on insulator, and larger wafers (300mm)
amd is going SOI
intel is going Strained Si (SS)
both are implementing low-k dielectrics
it would be cool to see a 90nm process with both SOI and SS, and both SOI and SS can themselves improve. the combination would be equivalent to a full process generation (or even 2 process generations)
way back in 2000, in addition to copper (i wonder if we will eventually need silver, as silver is clearly a superior conductor), SOI, SS and low-k, there was a lot of talk of isotopically pure silicon running significantly cooler, and of amd buying pure isotope-28 silicon from isotonics. supposedly this would result in a 10-20% heat reduction.
whatever happened to that?
No, soi & strained silicon or any other buzzwords picked up in EET etc will have only a minor impact on improvements. What larger wafers do is increase relative yield and give a lower overall cost per die, but they also push most of the semiconductor companies out of the fab business and drive up fab costs at Moore’s law rate.
If you believe that Intel will eventually give you a 10GHz Px, then also believe that they will be the only fab left in town at that level. The soi, ss, low-k stuff is the price each competitor must pay to stay in the race to keep shrinking.
The no of ASIC design starts has been falling quickly recently; each ASIC requires more & more EEs to design & check it to death before fabbing. Each mistake costs $M in mask costs (damn Moore’s law again), & many ASICs only get 6 wafers made for that design.
AMD & Intel are in a pointless speed race, pushing cpus so far beyond the rest of the PC HW that it is becoming silly.
It would be real cool IMHO if Intel/AMD would start putting a good amount of FPGA onto the die instead of extreme complexity. I could program that FPGA to do my critical loops far more effectively than trying to fine-tune asm code. The ability to include FPGA would actually equal several generational skips, ie in some cases 10-100x speed ups.
end of rant
It is not a law. A law is a theorem or postulate that has been proven mathematically or through controlled experimentation. It should be called a theory, or better yet, a goal. CPU companies keep trying to make this goal year after year.
Don’t get me wrong, trying to keep up this goal is a good thing… calling it a law just irritates those in the scientific community.
Well of course you’re right, Moore’s law can’t be a law in the scientific sense, but science & engineering often lose control of word meanings in the media. So what about Murphy’s laws? All of them are just humorous observations.
I don’t think that the industry is consciously trying to track Moore’s law at all. After all, the rate has varied dramatically over the last 30 yrs. Before PCs were common, the pace was almost snail-like and less obvious. If AMD drops out, Intel will slow down.
Before giving it the term law there should be some kind of theory to support it. What is the so-called Moore’s law based on? Numbers, I guess. Let me give an example of my law. Be patient here: 2+2=4, 2^2=4, 2*2=4, therefore n+n = n^n = n*n. That’s my law, and it makes sense the same way that Moore’s law does. btw was mr moore working for intel?
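(A one-line check shows why a single data point makes a poor law; the pattern already breaks at n = 3:)

for n in (2, 3, 4):
    print(n, n + n, n * n, n ** n)
# 2: 4, 4, 4    -- the coincidence
# 3: 6, 9, 27   -- already broken
# 4: 8, 16, 256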
Well, what about the Law in general? There is nothing scientific about that, just cultural stuff.
“btw was mr moore working for intel”
Gordon Moore & Bob Noyce were 2 of the founders of Intel coming out of the original Fairchild and before that Shockley’s Lab. Way way back at the beginning of semiconductor time.
I met Moore once at the microprocessor design conference in SJ, but so what.
I think everyone who posted their comments here knows that Moore was one of the founding members of Intel. It’s a tragedy that what he said is accepted as a law and even discussed in engineering colleges. Are we not reading and quoting this theory because it’s by someone famous, not because it makes any sense? Do you think that if I or you had come up with this theory instead of Moore, it would be accepted and known as Vikram’s Law or JJ’s Law? For a theory like Moore’s law, Moore gets appreciated and I would be ridiculed. Think about it.
Vikram, you probably won’t see this, but I think Moore’s law was never intended to be taken so seriously. It was just an observation made at a time, in the early 70s IIRC, when things seemed to be following a particular pattern, i.e. a straight line on a log scale. At first people were actually surprised & wondered how far it would go, but this sort of thing happens all the time. Later on it became a yardstick that everyone thought they should measure themselves by, to improve on what had gone before. If you could do better, you were likely to move to the front of the pack.
As for whether Moore should be credited, I think he deserves it. He was in the field when the industry was so tiny you could count the no of significant people & companies in your head. Now the industry is maybe 1000x bigger.
If you are early in any field and notice something about the universe & tell the world, you too will have your name immortalised.
just be early