Linked by fran on Tue 23rd Nov 2010 22:26 UTC
Hardware, Embedded Systems The CPU industry is working on 16nm chips to debut by around 2013, but how much smaller can it go? According to the smart guys, not much smaller: at 11nm they hit a problem related to a phenomenon called quantum tunnelling. So what's next? Yes, they can still add core after core, but this might reach a plateau by around 2020. AMD's CTO predicts the 'core wars' will subside by 2020 (there seems to be life left in adding cores, as Intel demonstrated a few days ago with the feasibility of a 1000-core processor). A Silicon.com feature discusses some potential technologies that could enhance or supersede silicon.
Thread beginning with comment 450951
quantum tunneling
by onetwo on Wed 24th Nov 2010 00:42 UTC
onetwo
Member since:
2009-01-22

It's quantum tunnelling. Not the best article, sure, but do change the body of your summary (http://en.wikipedia.org/wiki/Quantum_tunnelling).

Reply Score: 1

RE: quantum tunneling
by thavith_osn on Wed 24th Nov 2010 05:12 in reply to "quantum tunneling"
thavith_osn Member since:
2005-07-11

From what I gather, electrons, for instance, can always pass through any barrier, since the probability of tunnelling is always non-zero. Interesting...

Also, tunnelling only seems to affect barriers of 3nm or less (other than the odd electron that might get through, in theory)...
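The reason barrier width matters so much is that the tunnelling probability falls off exponentially with it. A rough back-of-the-envelope sketch in Python, using the standard rectangular-barrier approximation and an assumed 1 eV effective barrier (real gate oxides will differ):

# Rough sketch: tunnelling probability through a rectangular barrier,
# T ~ exp(-2*kappa*L) with kappa = sqrt(2*m*(V - E))/hbar.
# The 1 eV effective barrier height is an assumption for illustration only.
import math

hbar = 1.054571817e-34   # J*s
m_e  = 9.1093837015e-31  # electron mass, kg
eV   = 1.602176634e-19   # J

barrier_height = 1.0 * eV                       # assumed effective barrier
kappa = math.sqrt(2 * m_e * barrier_height) / hbar

for width_nm in (1.0, 2.0, 3.0):
    L = width_nm * 1e-9
    T = math.exp(-2 * kappa * L)
    print(f"{width_nm:.0f} nm barrier: T ~ {T:.1e}")

With these assumed numbers the probability climbs by several orders of magnitude as the barrier shrinks from 3nm towards 1nm, which is why very thin insulating layers start to leak.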

I guess we will have leaky CPUs soon enough if we don't come up with something else (which I'm sure we will).

I remember reading in BYTE magazine years ago that the maximum speed of a CD would be 10x. I remember thinking then how fast that would be...

Reply Parent Score: 2

RE[2]: quantum tunneling
by Neolander on Wed 24th Nov 2010 06:41 in reply to "RE: quantum tunneling"
Neolander Member since:
2010-03-08

Well, this is not the same as the CD-ROM issue, interestingly enough.

If you reach the maximum speed at which a CD-ROM drive can spin, the fight is over. There's no way to make a standardized optical storage technology go any faster; you have to introduce a new optical medium with higher data density, which reduces the need for a drive that spins fast. That new medium will be incompatible with existing drives, so its adoption will be quite slow.

With processors, on the other hand, once you've reached the speed limit of an individual processor, all you have to do is put several of them on the same chip. This way, you can reliably claim that you have packed N times the usual processing power into that chip.

If normal people, whose tasks don't scale well across multicore chips, start complaining that they don't get N times the performance, you can then blame the software developers for running into the limits of the human mind's algorithmic way of thinking.
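That gap between N cores and N times the performance is exactly what Amdahl's law quantifies (the comment doesn't name it, but it's the standard formulation). A minimal sketch in Python, with the parallel fractions below being illustrative assumptions, not measurements:

# Minimal sketch of Amdahl's law: speedup = 1 / ((1 - p) + p / n),
# where p is the fraction of the work that can run in parallel.
def amdahl_speedup(parallel_fraction, cores):
    serial = 1.0 - parallel_fraction
    return 1.0 / (serial + parallel_fraction / cores)

for p in (0.5, 0.9, 0.99):
    for n in (4, 64, 1000):
        print(f"p={p:.2f}, {n:4d} cores -> speedup {amdahl_speedup(p, n):6.1f}x")

Even with 99% of the work parallel, a thousand cores buy less than a 100x speedup; with 90%, you top out below 10x no matter how many cores you add.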

Then, as N grows and CPUs can't shrink any further, buses will increasingly become the bottleneck of computing performance. Issues with the speed-of-light limit and congestion in memory access will become more and more serious. So hardware manufacturers will adopt a decentralized memory model in which cores don't even share memory with each other, basically becoming independent computers except for inter-core I/O. The amount of software that can't scale well across multiple cores will grow even further. HW manufacturers will still be able to claim that they've reached a higher theoretical performance and that SW manufacturers are to blame for not being able to reach it.
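For a feel of what that shared-nothing model looks like to a programmer, here is a minimal sketch using Python's multiprocessing module as a stand-in for independent cores connected only by inter-core I/O (the chunk size and the sum-of-squares workload are arbitrary assumptions):

# Shared-nothing sketch: the worker has its own memory; all data moves
# over explicit message queues rather than through shared RAM.
from multiprocessing import Process, Queue

def worker(inbox, outbox):
    while True:
        chunk = inbox.get()
        if chunk is None:                       # sentinel: no more work
            break
        outbox.put(sum(x * x for x in chunk))   # purely local computation

if __name__ == "__main__":
    inbox, outbox = Queue(), Queue()
    p = Process(target=worker, args=(inbox, outbox))
    p.start()

    data = list(range(1000))
    chunks = [data[i:i + 100] for i in range(0, len(data), 100)]
    for chunk in chunks:                        # ship work as messages
        inbox.put(chunk)
    inbox.put(None)

    total = sum(outbox.get() for _ in chunks)
    p.join()
    print(total)

The point is that every piece of data the worker sees has to be shipped to it explicitly, which is exactly the kind of restructuring a lot of existing software would struggle with.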

Unless we find a new way to reach higher performance in normal software (not server software in the ideal situation where tasks are CPU-bound), its performance will stagnate. We'll then have to learn again how to write lean code, or design new programming models that work across the new hardware but imply totally different ways of thinking. Or we'll create new CPU architectures that allow higher performance without improving silicon technology, but which will take years to be widely adopted.

One thing is for sure: for the next decade, improvements in the performance of usual software won't come from improvements in silicon technology. Actually, I think that's a good thing.

Reply Parent Score: 3