Linked by Thom Holwerda on Fri 10th Apr 2009 19:44 UTC
IBM "Moore's Law is maxing out. This is an oft-made prediction in the computer industry. The latest to chime in is an IBM fellow, according to a report. Intel co-founder Gordon Moore predicted in 1965 that the number of transistors on a microprocessor would double approximately every two years - a prediction that has proved to be remarkably resilient. But IBM Fellow Carl Anderson, who researches server computer design at IBM, claims the end of the era of Moore's Law is nigh, according to a report in EE Times. Exponential growth in every industry eventually has to come to an end, according to Anderson, who cited railroads and speed increases in the aircraft industry, the report said."
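For scale, the prediction is plain compound doubling; a quick sketch in Python (the Intel 4004 starting figures are an illustrative choice, not taken from the report):

```python
# Moore's Law as simple compound growth: transistor counts double
# roughly every two years. The 4004 figures below are illustrative.
def transistors(start_count, start_year, year, doubling_period=2):
    """Projected count, assuming a doubling every `doubling_period` years."""
    return start_count * 2 ** ((year - start_year) / doubling_period)

# From ~2,300 transistors (Intel 4004, 1971) to 2009: 19 doublings,
# i.e. on the order of a billion transistors.
print(round(transistors(2300, 1971, 2009)))
```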
:)
by poundsmack on Fri 10th Apr 2009 19:58 UTC
poundsmack
Member since:
2005-07-13

I was so tempted to go and find every link over the last 6 years where Intel/IBM said "oh, it's ending now" only to have some breakthrough keep it going ;)

It will end when it ends, and when it does we either need to start making programs that use what we have more efficiently (though processors are so powerful these days that it allows coders to be lazy), or we need to switch to quantum computing/laser HDDs/_______ (insert future tech here)

Reply Score: 3

Scotty
by Moredhas on Fri 10th Apr 2009 21:28 UTC
Moredhas
Member since:
2008-04-10

It sounds an awful lot like "She canno' take no more capn'! She's breaking up!"

Reply Score: 5

Atoms don't scale.
by JoeBuck on Fri 10th Apr 2009 21:51 UTC
JoeBuck
Member since:
2006-01-11

In a way, Moore's Law as it had been practiced for about 30 years ended in 2003. Up until that time a relatively simple formula was used to resize the features, the doping, and the voltages in a consistent way, that allowed old designs to be easily cloned and made faster, smaller, lower power, better in every way. But when the formula got to the point where transistor gates were a few atoms thick, leakage power went through the roof, and scaling changed. You could still make feature sizes smaller and get more on the chip, so we still had Moore's Law in one sense. But you had to either let power go through the roof or back off on speed, and more and more complex tricks had to be used to keep scaling going. And state-of-the-art fabs cost billions (yes, with a B) to build.
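The "relatively simple formula" described here is classic Dennard scaling; a back-of-the-envelope sketch of the textbook ideal factors versus the post-2003, leakage-limited situation (illustrative factors, not measured data):

```python
# Back-of-the-envelope view of one process shrink by factor k.
# These are the textbook ideal (Dennard) factors, not measurements.
def dennard_step(k):
    """Classic scaling: shrink features and voltage together by 1/k."""
    return {
        "feature_size":  1 / k,   # gates get smaller
        "transistors":   k * k,   # k^2 more fit in the same area
        "voltage":       1 / k,   # supply voltage drops with size...
        "power_density": 1.0,     # ...so watts per mm^2 stay flat
    }

def leaky_step(k):
    """Post-2003 reality: voltage can no longer drop (leakage floor)."""
    return {
        "feature_size":  1 / k,
        "transistors":   k * k,
        "voltage":       1.0,     # stuck near the threshold voltage
        "power_density": k * k,   # watts per mm^2 go through the roof
    }
```

Under the classic step everything improves at once; once voltage stops scaling, you can still pack in more transistors, but power density climbs with each shrink, which is exactly the "back off on speed or let power go through the roof" trade-off.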

That's why the processor vendors are selling multicores. No matter that we don't really know how to program 1000-way parallelism, that's what we're getting. And even with the new tricks, atoms don't scale. Sure, we can go 3D, but how are we going to cool a device like that?

We'll see a couple more doublings, maybe more, but they will be much harder to achieve, and unless there's really a market for the resulting chips, who's going to bother?

Reply Score: 6

I've never understood...
by thavith_osn on Fri 10th Apr 2009 23:35 UTC
thavith_osn
Member since:
2005-07-11

Moore's Law.

I mean, why did they call it a law? A law is something that is real and proven. A theory is something unproven. I always thought it should be Moore's Theory (or Moore's Observation)...

I'm sure it was a little tongue in cheek, but anyway...

Reply Score: 5

RE: I've never understood...
by PlatformAgnostic on Sat 11th Apr 2009 02:35 UTC in reply to "I've never understood..."
PlatformAgnostic Member since:
2006-01-02

Your distinction between Law and Theory is not right. A theory is a systematic explanation of something, which may have a body of evidence confirming it (an accepted theory pretty much must have a solid grounding in evidence; a rejected or half-baked theory has flawed evidence or none whatsoever).

There is no such thing as a 'law' in scientific discourse, so the word 'law' is really a qualitative label that people put on theories or ideas they think are solid. Or it can be used in another sense, as more of a rhetorical device. In this case, Moore's Law is more a rule of thumb than even a theory.

From everything I've read and the interactions I've had with professors in the microelectronics field, designing a commercial chip is as much art as a science. Given the competition in the field and the high degree of specialization of the various people involved in producing a chip, everyone looks for rules of thumb, industry consensus, generic simplified models, etc, to help figure out what to build, how to build it, and what design techniques are likely to give good results.

I don't remember who said this, but one of the senior members of a chip company compared building a chip to Russian Roulette: "When you start building a chip, you pull the trigger... you find out five years down the line whether you've blown your head off."

Reply Score: 5

RE: I've never understood...
by Pawel Ciupak on Sat 11th Apr 2009 10:41 UTC in reply to "I've never understood..."
Pawel Ciupak Member since:
2009-04-04

Hm, I think that you're wrong. Something that is unproven is a hypothesis, not a theory. So the so-called "Moore's law" should really be "Moore's hypothesis" ;P.

Reply Score: 1

RE: I've never understood...
by Alleister on Sat 11th Apr 2009 17:03 UTC in reply to "I've never understood..."
Alleister Member since:
2006-05-29

Newton's law of gravity doesn't deal with relativity, and thus breaks down as speeds approach the speed of light.
That doesn't make it worthless or incorrect; it just loses precision beyond a certain point.

Law isn't a defined term in science and you are mixing up Hypothesis and Theory.

Reply Score: 2

RE: I've never understood...
by zaine_ridling on Tue 14th Apr 2009 21:52 UTC in reply to "I've never understood..."
zaine_ridling Member since:
2007-05-13

Disagree. Scientific theories are facts supported by mountains of evidence, sometimes over centuries (evolution, for example). To say something is a theory -- in science -- means it's waiting to be disproven and no one has done it yet.

Reply Score: 1

Just a rule of thumb
by chemical_scum on Sat 11th Apr 2009 04:39 UTC
chemical_scum
Member since:
2005-11-02

Yes it should really be Moore's rule of thumb.

Reply Score: 3

RE: Just a rule of thumb
by OSGuy on Sat 11th Apr 2009 05:47 UTC in reply to "Just a rule of thumb"
OSGuy Member since:
2006-01-01

How about Moore's Opinion? ;) ;)

j/k I think Moore's Theory is the right wording.

Edited 2009-04-11 05:47 UTC

Reply Score: 1

RE[2]: Just a rule of thumb
by bosco_bearbank on Sat 11th Apr 2009 10:37 UTC in reply to "RE: Just a rule of thumb"
bosco_bearbank Member since:
2005-10-12

Once upon a time, one of my professors characterized a law as a hypothesis/theory/SWAG expressed as a mathematical statement. As such, Moore's Law is a valid term.

Reply Score: 1

Didn't we already lose it?
by 3rdalbum on Sat 11th Apr 2009 11:03 UTC
3rdalbum
Member since:
2008-05-26

I thought we lost it some time between the later Pentium 4s and Core? It seemed to me like it took Intel a long, long time to scale their P4s from 3GHz to 3.6GHz - a mere 20% improvement, with a resulting increase in heat.

Reply Score: 1

RE: Didn't we already lose it?
by abraxas on Mon 13th Apr 2009 11:19 UTC in reply to "Didn't we already lose it?"
abraxas Member since:
2005-07-07

"I thought we lost it some time between the later Pentium 4s and Core? It seemed to me like it took Intel a long, long time to scale their P4s from 3GHz to 3.6GHz - a mere 20% improvement, with a resulting increase in heat."


It really has nothing directly to do with processor speed. Doubling transistors just usually meant more speed. Transistors are still being doubled approximately every two years. We are just getting more cores now instead of higher speeds. This doesn't negate Moore's Law as it says nothing about speed.

Reply Score: 2

chip guy
by transputer_guy on Sat 11th Apr 2009 17:00 UTC
transputer_guy
Member since:
2005-07-08

As an actual chip guy, I have lived with the law (and retired from it too), so I'd say it is fair to call it a law, although it is no more a law than many other uses of the word. If we can have Murphy's nonsense laws, and Wirth's or Reiser's law (the software reverse of Moore's law), then we are really observing something that always seems to hold quite true, up to a point.

Even in English law there are many bad uses of the term too; many petty acts with ridiculous consequences are still on the books from olde tymes!

It is worth reading up on the Nehalem chip to see what Intel really changed between the Core 2 architecture internals and the i7. The changes relate not so much to processor architecture as to logic and circuit design.

The older P4 and Core families used a domino type of circuit design that stubbornly used more power than necessary and did not produce the expected speed improvements, mainly because an extra Vt worth of power supply was needed to keep SRAM working with 6 transistors. The entire industry had used 6T RAM cells since the earliest days of discrete bistable circuits. Intel had the guts to change that to 8 transistors, which allows one Vt to be cut from the power supply across the board, and domino circuits now give way to plain old-time symmetric CMOS circuits. While these are not faster, they scale much better and can use much less power. The consequence is that we are now mostly on a massively parallel path and much less on a speed path. As long as Intel can sell higher-count multicores, Moore's Law stays in effect for as long as Intel can scale the processes.

Reply Score: 2

RE: chip guy
by gustl on Sat 11th Apr 2009 20:51 UTC in reply to "chip guy"
gustl Member since:
2006-01-19

I totally agree with you.

GUIs and the like can usually be handled on a single core easily, and the number-crunching stuff needs to go multi-core as fast as possible.

But that is in itself a problem. How many algorithms really scale well as you add cores?
Graphics, yes, but matrix inversion is hard to scale well.
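The diminishing returns from adding cores can be sketched with Amdahl's law (the 95%-parallel figure below is just an illustrative assumption):

```python
# Amdahl's law: the speedup from n cores when a fraction p of the
# work can be parallelized. The p values here are illustrative.
def speedup(p, n):
    return 1 / ((1 - p) + p / n)

# Even a 95%-parallel workload tops out below 20x, no matter how
# many cores you throw at it:
for n in (2, 8, 64, 1024):
    print(n, round(speedup(0.95, n), 2))
```

Embarrassingly parallel work like graphics has p close to 1; something like dense matrix inversion carries enough serial dependency and communication overhead that its effective p, and hence its ceiling, is much lower.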

Reply Score: 2

RE: chip guy
by PlatformAgnostic on Sun 12th Apr 2009 07:58 UTC in reply to "chip guy"
PlatformAgnostic Member since:
2006-01-02

I'd like to read more about this. Would you kindly point me to any papers in public or in the journals about Intel's transition away from dynamic circuits?

Reply Score: 2

RE[2]: chip guy
by transputer_guy on Sun 12th Apr 2009 08:23 UTC in reply to "RE: chip guy"
transputer_guy Member since:
2005-07-08

Well, that's easy. I lost the link, but Google around for Nehalem, domino, 8T, static CMOS, etc. I recall reading the article on Tom's Hardware when the i7 first came out.

Myself, I was in the market to build a quad-core OS X box, and the sales idiot in Microcenter kept pushing me to buy the latest and greatest Intel i7, saying it was 3x faster, etc. As an engineer I don't like that kind of dumb oversell, but I was curious to know why Intel had to change the whole thing again. The Core processor architecture doesn't change much, but they finally caught up with AMD by bringing the memory interface on board. The new design methodology changes the game to better support more low-power cores on chip, using wider, more parallel logic rather than deeper, faster logic. It is like a return from quasi NMOS-CMOS right back to real CMOS. I believe AMD also favoured wider, slower logic that does more per cycle over fewer, racier circuits, going back to the early Athlons, so Intel and AMD concur again.

Anyway, OS X runs fine on a regular quad core and isn't ready for the i7.

Reply Score: 2