Linked by fran on Tue 23rd Nov 2010 22:26 UTC
Hardware, Embedded Systems The CPU industry is working on 16nm chips to debut around 2013, but how much smaller can it go? According to the smart guys, not much smaller: at 11nm they hit a problem related to the quantum tunneling phenomenon. So what's next? Yes, they can still add core after core, but this might reach a plateau by around 2020; AMD's CTO predicts the 'core wars' will subside by then (there does seem to be life left in adding cores, as Intel demonstrated a few days ago with the feasibility of a 1000-core processor). A Silicon.com feature discusses some potential technologies that could enhance or supersede silicon.
RE[3]: quantum tunneling
by onetwo on Wed 24th Nov 2010 08:18 UTC in reply to "RE[2]: quantum tunneling"

@thavith_osn: As humankind we have only ever had "leaky" transistors; we never had any other kind. Leakage has never been solved: it is due to the manufacturing process on one side and, on the other, to our understanding of quantum phenomena (quantum phenomena, not "quanting" phenomena as the blurb originally had it, which was my point in linking the Wikipedia article, which in its own right is rather poorly written). The problem is now exacerbated by the power/heat you put in and give off for the retention (or spread) of information-entropy you get, per bit per unit time. For example, for the next generation of FLASH cells you need a retention budget on the order of one electron per year. I could have said 1 year, 2 years, or 10 years: with any modern VLSI process it still sounds, let's say, a jot absurd.
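
To get a feel for how absurd that number is, here is a rough back-of-the-envelope sketch (my own illustrative figures, not taken from the comment or from any datasheet) that converts an electrons-per-year loss budget for a hypothetical scaled flash cell into an equivalent leakage current:

ELECTRON_CHARGE = 1.602e-19    # coulombs
SECONDS_PER_YEAR = 3.156e7

def leakage_current(electrons_per_year):
    """Equivalent DC leakage current, in amperes, for a given electron loss rate."""
    return electrons_per_year * ELECTRON_CHARGE / SECONDS_PER_YEAR

# Hypothetical scaled flash cell: ~100 stored electrons, 10-year retention,
# at most ~10% of the charge allowed to leak away over that period.
stored_electrons = 100
allowed_loss_fraction = 0.10
retention_years = 10

budget = stored_electrons * allowed_loss_fraction / retention_years  # electrons/year
print("Allowed loss: %.1f electrons/year" % budget)
print("Equivalent leakage current: %.1e A" % leakage_current(budget))
# ~5e-27 A per cell -- many orders of magnitude below anything directly
# measurable, which is why a per-bit retention budget like this sounds absurd.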

One more thing worth noting: quantum phenomena are observed even at the 130nm node; there, however, one just doesn't care. These phenomena do not magically disappear at different scales. Quantum phenomena are present in daily life, when one boils eggs or tries to walk through walls; in the latter case they simply become vanishingly improbable.
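
For a sense of that scale sensitivity, here is a minimal sketch (my own illustration, using the textbook WKB estimate with assumed barrier numbers, not anything from the comment) showing how exponentially tunneling depends on barrier thickness:

import math

HBAR = 1.0546e-34          # J*s
ELECTRON_MASS = 9.109e-31  # kg
EV = 1.602e-19             # joules per electronvolt

def wkb_transmission(barrier_ev, width_m):
    """Rectangular-barrier WKB estimate: T ~ exp(-2*kappa*d)."""
    kappa = math.sqrt(2 * ELECTRON_MASS * barrier_ev * EV) / HBAR
    return math.exp(-2 * kappa * width_m)

# Assumed ~3 eV barrier (roughly an SiO2-like gate oxide) at a few thicknesses.
for width_nm in (3.0, 2.0, 1.0):
    print("%.0f nm barrier: T ~ %.1e" % (width_nm, wkb_transmission(3.0, width_nm * 1e-9)))
# Thinning the barrier from 3 nm to 1 nm raises T by roughly 15 orders of
# magnitude; for a macroscopic mass or barrier the exponent is so huge that
# the probability is effectively zero -- hence no walking through walls.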

@Neolander: I agree with you. But you have to ask yourself: what is the "speed limit" of processors? I hope you are referring to a current technological "speed limit". Even so, why couldn't the same "lean" engineering be applied to hardware engineering, thus alleviating the "stagnation"?

Thus I arrive at my point (at last). My view is that the bifurcation of computational science demanded by the commodity-driven industry is the problem; sometimes it is less observable, sometimes more. I should note, however, that it is a natural bifurcation, an evolutionary one. It is a necessity stemming from the conceptualization of the creative process: language and grammar conceptualization per unit time for the successful creation of ye working thing that could be purchased. It is far easier to do that on "a sheet of paper" than on a couple of million transistors specially designed for a purpose, hm?

But nonetheless, our technological advance is an evolutionary process. It cannot stagnate. It only stagnates when it is anthropomorphized in the context of the "global economy".

Score: 1