It depends on the continued use of a component called the “transistor” as the fundamental building block of processors. By the time two decades are out, I find it unlikely that processors will still use transistors at all, given the theoretical potential of things like quantum computing, such as executing massively parallel operations by passing a single specially modulated nanolaser through a number of soliton-based quantum phase registers, for example.
A quantum tunneling transistor:
http://www.sandia.gov/media/quantran.htm
Or a spintronic transistor.
I like both of these ideas. Spintronics seems to have the most advantages.
Quantum computing is one option; switching to optical gates and laser/optical switching could be another way to improve performance without the electron-tunneling problem, bypassing the silicon transistor altogether. Large leaps forward in processor power will come when the fundamental way data is processed changes. Quantum and optical are the two I’d bet on right now, but who knows, someone may come up with something completely new that blows them both away, especially since both of these approaches are currently bulky and early in their development.
How many GHz do you need? What about in 100 years from now? 1000? Can’t we be happy with what we have?
How much WASTE has Moore’s Law generated? Instead of writing good code that runs fast on existing machines, we have all sorts of scrubs writing crap code that requires new machines to work at a reasonable speed.
Microsoft has made a giant business out of writing code that is too crappy to run on the machine you have, Windows XP being the latest example of Microsoft bloatware.
And one could look at almost the entire Linux system and see how poorly it is written — and how damn slow most of it works, especially anything to do with graphics.
The amount of waste generated by the computer industry is vast. And it is nothing compared to how much damn electricity these new machines take to run every day. All because Intel needs to make more money by selling you a new chip every year — that takes even more power to run.
Moore’s Law is just Intel’s con game. Maybe if we didn’t power the planet using fossil fuel and leaky nuclear power plants, Moore’s Law would be more beneficial. But for today, the world we live in, Moore’s Law is nothing more than a curse.
3.2 GHz ought to be enough for anybody.
I’m positive that’s what they would have said 10 years ago.
“I’m positive that’s what they would have said 10 years ago.”
I agree with the original poster. The environmental and social impact of Moore’s Law outweighs the benefits. Moore’s Law has a second side, and that is encouraging a wasteful society where everything is replaced quickly. The waste from semiconductor plants and the disposal of old PCs is considerable, and the replace-replace-replace behavior extends beyond PCs.
“640K ought to be enough for anybody”
Well, for today’s applications, for sure.
You want smaller computers with more computational power…
You want faster machines to simulate the weather…
You want faster machines for space exploration…
You want ~100% accuracy in voice recognition…
You want home robots… (Matrix, anyone?)
I mean, for today’s home applications, you are right.
For tomorrow, I’m not sure.
Is there a law to predict increasing rates of predictions of Moore’s law dying? It seems to have been doing this for longer than BSD!
> “640K ought to be enough for anybody”
It’s funny to still see folks use this one.
I think we are at the tail end of the explosive growth phase of the computing revolution. It’s hard to see, since we’ve lived with Moore’s Law for so long, but computers just don’t need to go that much faster. We just don’t really need robots/voice-recognition/face-recognition/whatever badly enough to provide the vast funding that further major speed advancements would require. Sorry.
I used to work at a company that makes litho tools. Litho is an old idea that’s been squeezed and squeezed for all it’s worth over the years. Today’s tools are so sensitive to heat/moisture/vibration/contaminants that it’s a wonder they work at all. The calcium fluoride (for the UV lenses) is horrendously expensive. The alternative technologies (like “EUV”) are very far-fetched.
The article mentions:
Manufacturers will be able to produce chips on the 16-nanometer manufacturing process, expected by conservative estimates to arrive in 2018,
I really doubt they’ll even get that. Happily, I don’t think we really need it either.
I don’t understand why more and more transistors are put in CPUs and GPUs – currently about 100 million. What are they all doing? Is it for firmware that would otherwise be software?
I think the trend is driven by an increasing demand for more RAM – also for online databases – with the speed increase, lower power consumption and cheaper manufacturing as desirable, competition-driven side effects.
I don’t understand why more and more transistors are put in CPUs and GPUs – currently about 100 million. What are they all doing?
Mainly cache and branch prediction, with others used for more (complicated) execution units, (ridiculously long) decoding pipelines (sometimes two of them), etc.
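To put rough numbers on the cache part (my own back-of-the-envelope arithmetic, not something from the article or the posts above): a standard SRAM cell costs six transistors per bit, so the data array of an on-die cache alone eats a big chunk of a 100-million-transistor budget. A minimal sketch, assuming the 6T cell and ignoring tag arrays, decoders and redundancy:

```python
# Back-of-the-envelope: transistors spent on on-die SRAM cache.
# Assumes the standard 6-transistor (6T) SRAM cell and ignores tag
# arrays, decoders and redundancy, so real counts run a bit higher.

TRANSISTORS_PER_SRAM_BIT = 6

def cache_transistors(cache_kib: int) -> int:
    """Approximate transistor count for `cache_kib` KiB of cache data."""
    bits = cache_kib * 1024 * 8
    return bits * TRANSISTORS_PER_SRAM_BIT

for size_kib in (256, 512, 1024):
    millions = cache_transistors(size_kib) / 1e6
    print(f"{size_kib:4d} KiB cache ~ {millions:5.1f} million transistors")
```

By that arithmetic, 1 MiB of L2 is already around 50 million transistors, which is why a “100 million transistor” CPU is mostly memory rather than logic.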
If your code is one level worse than it could be, it takes anywhere from 2X to 10X hardware to get the system back to par.
The idea of making up in hardware what you do wrong in software is absurd. It is an idea put together by the two most powerful monopolies in computers — Microsoft and Intel. These two companies have forced an entire industry into a mad and crazy upgrade cycle, laying waste to all sorts of wonderful things that would have flourished in a non-monopoly open market.
Moore’s Law is just the computer industry’s version of a “Planned Obsolescence Law”. And this “law” is a curse, nothing more. Moore’s Law has cost trillions of dollars in waste and pollution. All so Microsoft and Intel can party with their monopolies.
For hardware to speed up bad code… you need massive hardware. This is why the modern CPU has so much cache and why the increased cache results in “smoothness” of operation. Windows, for instance, is purposely written to not work well on older hardware.
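To make the cache point concrete, here is a minimal sketch of cache-friendly versus cache-hostile access order (Python for brevity; the interpreter hides much of the gap, and the same change in a compiled language is far more dramatic; the grid size is made up for illustration):

```python
# Same arithmetic, two traversal orders.  Row-major order walks
# memory roughly sequentially; column-major order hops between
# lists and defeats the cache.
import time

N = 2000
grid = [[i * N + j for j in range(N)] for i in range(N)]

def sum_row_major(g):
    total = 0
    for i in range(N):
        for j in range(N):
            total += g[i][j]
    return total

def sum_col_major(g):
    total = 0
    for j in range(N):
        for i in range(N):
            total += g[i][j]
    return total

for fn in (sum_row_major, sum_col_major):
    start = time.perf_counter()
    fn(grid)
    print(f"{fn.__name__}: {time.perf_counter() - start:.3f} s")
```

The cache only papers over the hostile version; writing it the friendly way costs the programmer nothing.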
We used to have great software — lots of it — when people had to understand the machine to make it work. Now developers turn out mostly crap because they have no idea what the machine does or how it works. But mostly because they’ve been taught to be sloppy and careless with their code.
Moore’s Law is a tool used by amoral companies to achieve market dominance… through the generation of waste. It is a farce and it is not worthy of anything other than scorn.
This question is in itself useless. It is not about what we need, and it never was. It is about evolution — which is a force. We are building, and the basis of our current building is the foundation of the past. If anyone seriously believes that scientists and companies will all of a sudden go “Ah, this is where we want to be. We’ll stop now.”, you have a reality distortion. It is not about getting into the 100 GHz range or getting 10000^100 GB of RAM; it is about accomplishing new things.
Another person just mentioned calculating the weather and the processing power needed in serious space travel (radiation prediction, etc.). For games and entertainment, gamers wish for Maya-class in-game graphics. Heck, back in the Amiga 500 days, I wondered when games would have graphics equal to Vista Pro 3 — we are soon arriving at that level. Now we want something approaching what World Construction Set can render.
It will always evolve, until a big hammer hits the earth. This is why people are speculating about the future. And this is why many people don’t speculate, as they cannot imagine a world more “advanced” or more “complicated” than it already is.
Good posts from everyone, but I’d like to add something.
I agree that crap software forces us to upgrade, and this is where MS and Intel want to go. Apple, with the 800 MHz or 1.25 GHz iMac, has demonstrated that raw speed is not everything compared to the perceived responsiveness of the whole system.
I use MS, Linux, and Apple. Apple seems the most responsive at the same MHz/GHz speed, with the same hardware (video and HDD).
We need a fork, because home users may be satisfied with what they have, while research institutions need to simulate real worlds in silico; that’s where the demand for power comes from. Indeed, it should be done by writing good software from the start.
Just as my final thought: nobody is pushing anyone to upgrade or follow; it depends on what you expect from your PC and what you need to do. Browsing the Internet and writing documents does not need much power; a PIII/500 with 256 MB of RAM is more than enough…
The oil will run out in however many years, and now they won’t be able to fit any more transistors on a chip in ten or twenty years. Well, there is always the opportunity for parallel performance increases: running more than one CPU and asynchronous buses. Actually, if Moore’s Law holds for another decade or two, then we will have very fast systems. Somehow I’m not worried.
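The “more than one CPU” route is easy to sketch at the software level; here is a minimal example (the prime count is just a made-up, CPU-bound stand-in workload, and the chunk sizes are arbitrary):

```python
# Split a CPU-bound job across worker processes instead of
# waiting for a faster single core.
from multiprocessing import Pool

def count_primes_in(bounds):
    """Naive prime count over [start, stop) -- deliberately CPU-bound."""
    start, stop = bounds
    count = 0
    for n in range(max(start, 2), stop):
        if all(n % d for d in range(2, int(n ** 0.5) + 1)):
            count += 1
    return count

if __name__ == "__main__":
    limit, workers = 200_000, 4
    step = limit // workers
    chunks = [(i * step, (i + 1) * step) for i in range(workers)]
    with Pool(processes=workers) as pool:
        total = sum(pool.map(count_primes_in, chunks))
    print(f"primes below {limit}: {total}")
```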
“By the time two decades are out, I find it unlikely that processors will still use transistors at all, given the theoretical potential of things like quantum computing, such as executing massively parallel operations by passing a single specially modulated nanolaser through a number of soliton-based quantum phase registers, for example.”
As far as I understand, so far quantum computing is more of a solution looking for a problem. It makes sense only for a very narrow set of problems, for which it does indeed yield a huge gain.
With the PCI bus the system can transfer something like 150 MB per second to the screen, but with ISA it was something like 5 or 10 MB per second, so the bus has a lot to do with performance, not just the CPU. Now, Intel designed the PCI bus and made it an open standard. If they really want or need more power after Moore’s Law fails, then I’m sure there are other solutions; businesses are dirty, and they will pull something out of the mud and sell it to you, just like always.
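Those numbers are roughly right: peak bus bandwidth is just width times clock, so 32-bit PCI at 33 MHz gives about 133 MB/s, while 16-bit ISA at roughly 8 MHz tops out around 16 MB/s in theory (and 5 to 10 MB/s in practice, since each transfer took several cycles). A quick sketch of that arithmetic:

```python
# Peak bandwidth = (bus width in bytes) x (bus clock).  Real-world
# ISA throughput was lower still because transfers took several
# clock cycles each.

def peak_bandwidth_mb_s(width_bits: int, clock_mhz: float) -> float:
    return (width_bits / 8) * clock_mhz  # MB/s

buses = {
    "ISA (16-bit, ~8.3 MHz)": (16, 8.33),
    "PCI (32-bit, 33 MHz)":   (32, 33.33),
}

for name, (width, clock) in buses.items():
    print(f"{name}: ~{peak_bandwidth_mb_s(width, clock):.0f} MB/s peak")
```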
Moore’s Law will be saved by quantum computing. Scientists have built devices that use quantum entanglement to perform calculations. Such a device’s power would scale as a square (correct me if I’m wrong; I read the article a few years ago in Popular Science), so two entangled particles would have the power of 4 transistors, 4 would give 16, 16 would give 256, and so on…
If you don’t know what the %^#@^# I’m talking about:
http://plato.stanford.edu/entries/qt-entangle/
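For what it’s worth, the scaling being described (my reading, not a claim from the cited article) is that n entangled two-state particles span 2^n basis states, so doubling the particle count squares the state space: 2 particles give 4 states, 4 give 16, 8 give 256.

```python
# n two-state particles (qubits) span 2**n basis states, so
# doubling the qubit count squares the state space.  Whether that
# turns into a useful speedup depends on the algorithm, as the
# reply above about "a very narrow set of problems" points out.

def state_space(n_qubits: int) -> int:
    return 2 ** n_qubits

for n in (2, 4, 8, 16):
    print(f"{n:2d} qubits -> {state_space(n):,} basis states")
```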
The graphics chip manufacturers have the correct method for performance increases: processing more bits.
AMD has moved on to 64 bits.
Linux has moved on to 64 bits.
Graphics cards have moved on to 256 bits.
The performance-demanding public won’t wait for Longhorn.
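On the graphics-card side, the “256 bits” is mostly about memory bus width, and the payoff is bandwidth. A rough sketch (the 600 MHz effective clock below is illustrative, not the spec of any particular card):

```python
# Memory bandwidth scales with bus width: bytes per transfer times
# effective memory clock.  The clock figure is illustrative only.

def mem_bandwidth_gb_s(width_bits: int, effective_clock_mhz: float) -> float:
    return (width_bits / 8) * effective_clock_mhz / 1000  # GB/s

for width_bits in (64, 128, 256):
    gbps = mem_bandwidth_gb_s(width_bits, 600)
    print(f"{width_bits:3d}-bit bus @ 600 MHz effective: {gbps:.1f} GB/s")
```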
I think there is a difference between extending Moore’s Law (a doubling of transistors every 18 months) and continuing progress. You can progress at a slower pace.
The fundamental problem with Moore’s Law is that there is no economic value yet in maintaining it into the future. The additional power has become less necessary, and the cost of providing those gains has become exponentially more expensive.
The death of Moore’s Law is a good thing.
The end of all this can’t come soon enough for me. But to continue to rant… I might as well copy and paste all of Goldstein’s comments here…
As a home for displaced cats and elderly computers… I’m eager to see an end to this grotesque cycle of waste and excuses.
Elderly computers… what is a computer’s lifespan compared to a cat’s? Cats live about 15 to 20 years (rarely longer). A computer? Two? Three? For people like me, maybe they can scrape by to an absolute maximum of 4 years (I would have made it to 4 had my board not crapped out on me three months ago). I wonder how old my brand-new Asus P4C800-E Deluxe motherboard is in “computer years.” 36? Older?
Anyone ever been to a computer dump? At this school we get rid of monitors and computers every year; this year, about 100 computers or more. The monitors are usually dead (it costs more to repair than to replace them), but the PCs are just fine. They only get replaced because of Windows and office suite “upgrades” that make the existing PCs seem like elderly, crippled morons.
What is most disgusting about this industry is that these things are known facts. And knowing that doesn’t do anything to change things.