Linked by Hadrien Grasland on Thu 20th Jan 2011 21:21 UTC, submitted by poundsmack
Hardware, Embedded Systems "We've seen IBM and ARM team up before, but this week both companies announced a new joint initiative to develop 14nm chip processing technology. That's significantly smaller than the 20nm SoC technology ARM hopes to create in partnership with TSMC, and makes the company's previous work with IBM on 32nm semiconductors look like a cake walk."
blow against atom?
by bnolsen on Fri 21st Jan 2011 02:27 UTC
bnolsen
Member since:
2006-01-06

Given the huge overhead of the x86 instruction decoder, Intel's only real chance against ARM was its manufacturing lead. I have to hand it to ARM: in the past few years they've made important, decisive business decisions. They shortened their release cycle, enabled today's dominant smartphones, and now have an excellent fab partner.

Reply Score: 5

Yes, but Intel struck back
by pica on Fri 21st Jan 2011 09:43 UTC in reply to "blow against atom?"
pica Member since:
2005-07-10

Look at

http://appdeveloper.intel.com/en-us/

Intel has introduced a developer program especially for its Atom processors. The program has a million-dollar fund to pay^H^H^H persuade developers to develop for Atom.

The consumer side looks like this

http://www.appup.com/applications/index

pica

PS I prefer ARM

Edited 2011-01-21 09:56 UTC

Reply Score: 2

Now, you know why...
by _QJ_ on Fri 21st Jan 2011 11:04 UTC
_QJ_
Member since:
2009-03-12

Remember the IBM logo, about 17 nanometers long?
... 35 xenon atoms on a nickel surface.

Now you know why IBM has invested so much in nanotechnology research.

Less power, more power

Reply Score: 1

And the point is...
by vodoomoth on Fri 21st Jan 2011 12:55 UTC
vodoomoth
Member since:
2010-03-30

... what exactly?
I'm curious to know what applications such small sizes enable that the current technologies don't.

I hope such smaller chips will provide a better experience than the ludicrous two hours of battery life we get out of smartphones these days.

I also wonder why the "limit" on minimum feature size that I heard about almost two decades ago hasn't been reached yet. It's like crude oil: in the '80s, people said there wasn't much left, yet the latest projections I've heard still say there's enough for more than two decades. So, clueless experts or miraculous discoveries?

Reply Score: 2

RE: And the point is...
by Laurence on Fri 21st Jan 2011 13:13 UTC in reply to "And the point is..."
Laurence Member since:
2007-03-26

> ... what exactly?
> I'm curious to know what applications such small sizes enable that the current technologies don't.
>
> I hope such smaller chips will provide a better experience than the ludicrous two-hour battery life of usage we get out of smartphones these days.
>
> I also wonder why the "limit" that I heard about almost two decades ago as to the minimal size of chip technology hasn't been reached yet. It's like crude oil... in the 80's, people said there wasn't much left. Still, the last projections I've heard about still say there's enough for more than two decades. So, clueless experts or miraculous discoveries?


The latter.

Advances in physics and CPU design have made it possible to work around previously perceived theoretical limits.
However, that doesn't mean that there isn't a limit. Eventually we will have to switch to an entirely new type of CPU. But for the immediate future, there's still life left in silicon-based transistor processors.

Reply Score: 3

RE: And the point is...
by acobar on Fri 21st Jan 2011 15:15 UTC in reply to "And the point is..."
acobar Member since:
2005-11-15

If memory serves me well, the main problems people used to point to were:
- leakage;
- heat dissipation;
- quantum effects.

The first two were tackled by new materials with better insulating properties that leak less or produce less heat. If you look at processors today, they also tend to operate at lower switching frequencies.

The latter problem is not solved yet. It is related to statistics and quantum behavior (we want an operation on the same input to produce the same result every time). As it is a "physical barrier" (even though it was greatly exaggerated at the time, I guess), we may need a new approach or computing model, or learn to live with it (as we do with heat engines now).
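The reproducibility problem can be illustrated with a toy simulation (everything below is my own hypothetical sketch, not a model of any real device): a logic gate that occasionally flips its output, and majority voting as a brute-force way to get deterministic answers back.

```python
import random

def noisy_and(a, b, error_rate=0.05):
    """A toy AND gate that flips its output with some probability,
    standing in for a device operating near the statistical limit."""
    result = a & b
    if random.random() < error_rate:
        result ^= 1  # random bit flip
    return result

def voted_and(a, b, votes=101, error_rate=0.05):
    """Recover (near-)deterministic behaviour by majority vote."""
    ones = sum(noisy_and(a, b, error_rate) for _ in range(votes))
    return 1 if ones > votes // 2 else 0
```

With 101 votes and a 5% flip rate, a wrong majority is astronomically unlikely, but the price is two orders of magnitude more gate evaluations per answer, which is why "living with it" is not free.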

Reply Score: 3

RE: And the point is...
by Neolander on Fri 21st Jan 2011 16:44 UTC in reply to "And the point is..."
Neolander Member since:
2010-03-08

> ... what exactly?

Well, it's cool to make transistors that are only a few dozen atoms wide, and I find it fun to watch the last years of Moore's law before it finally hits the wall. Apart from that, same thing as before...

> I'm curious to know what applications such small sizes enable that the current technologies don't.

There will be a slight increase in processing power for some time, which sadly will quickly be offset by a big increase in software bloat.

> I hope such smaller chips will provide a better experience than the ludicrous two-hour battery life of usage we get out of smartphones these days.

Probably not. Hardware evolves, but the biggest killers in terms of power consumption are software and engineering decisions. When phones have 4" screens with a capacitive touch layer, and run Java and .NET software on top of feature-bloated kernels and computationally heavy UI layers, there's not much that power-efficient hardware and better batteries can do for you ;)

> I also wonder why the "limit" that I heard about almost two decades ago as to the minimal size of chip technology hasn't been reached yet. It's like crude oil... in the 80's, people said there wasn't much left. Still, the last projections I've heard about still say there's enough for more than two decades. So, clueless experts or miraculous discoveries?

As someone else said, we're reaching a limit, but slower than we initially thought.

After that, we'll be able to make even tinier transistors through a qualitative change in chip processing technology, for example by building transistors out of carbon nanotubes or even single molecules, but at some point we'll hit the quantum limit, where our transistors won't even be able to compute things with a good probability of being right.

Then there are several paths:
-> Quantum computers (a very, very long path: recently, an optical quantum computer managed to factor 15 = 3x5, and there was much rejoicing).
-> More CPU cores, and bigger chips since we can't make them smaller, to the point where we're forced to ditch the concept of a central RAM altogether and have several independent mini-computers inside our computer, with distributed operating systems taking over the world.
-> Optimizing some operations even further by exploiting physical phenomena other than the ones we usually use. As an example, it's possible to compute large-scale 2D Fourier transforms at (literally) the speed of light using analog optical components.
-> Writing lean and efficient software... Okay, I'm dreaming there.
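On the optical Fourier transform point: a converging lens produces the 2D Fourier transform of the light field in its focal plane essentially instantaneously, while software pays O(N^2 log N) for an N x N image. A minimal NumPy sketch of the digital counterpart (the square aperture below is just an illustrative example):

```python
import numpy as np

# A square "aperture" -- the digital stand-in for light passing
# through a square hole on its way to the lens.
image = np.zeros((256, 256))
image[96:160, 96:160] = 1.0

# What the lens does physically in its focal plane, done numerically here.
spectrum = np.fft.fftshift(np.fft.fft2(image))
print(spectrum.shape)  # (256, 256)
```

The shifted spectrum puts the DC term (the total light through the aperture) at the center of the array, just as the undiffracted beam lands on the optical axis.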

Edited 2011-01-21 17:02 UTC

Reply Score: 2

RE[2]: And the point is...
by acobar on Fri 21st Jan 2011 17:21 UTC in reply to "RE: And the point is..."
acobar Member since:
2005-11-15

We forgot to add that processors also operate at lower voltages now, which helps reduce the heat generated.
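For a rough sense of why voltage matters so much: dynamic (switching) power in CMOS scales roughly as P = alpha * C * V^2 * f, so voltage enters squared. A back-of-the-envelope sketch (all the numbers below are made up purely for illustration):

```python
def dynamic_power(capacitance_f, voltage_v, frequency_hz, activity=0.1):
    # P = alpha * C * V^2 * f -- the classic CMOS dynamic power formula
    return activity * capacitance_f * voltage_v ** 2 * frequency_hz

old = dynamic_power(1e-9, 1.3, 3.0e9)  # hypothetical part at 1.3 V
new = dynamic_power(1e-9, 1.0, 3.0e9)  # same chip dropped to 1.0 V
print(f"power ratio: {new / old:.2f}")  # 0.59 -- ~40% cut from voltage alone
```

Frequency and capacitance enter only linearly, which is why voltage scaling has been the single biggest lever for heat.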

Nice post, I would like to mod you up but, unluckily, the stupid "you can't moderate once you've posted" rule still applies here. Thom, could you please change that? If you want to prevent abuse, use a "one moderation per post per user" rule. I think everyone would be more than happy with that.

Reply Score: 3

RE: And the point is...
by Evan on Sun 23rd Jan 2011 09:29 UTC in reply to "And the point is..."
Evan Member since:
2006-01-18

Cost.

Smaller dies can cut per-chip cost by more than half: you get more dies per wafer, which means more processors per hour off the same line.

In addition to performance, power, etc.
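This can be sanity-checked with the classic dies-per-wafer approximation (the wafer and die sizes below are hypothetical, chosen only to show the effect of a full-node area shrink):

```python
import math

def dies_per_wafer(wafer_diameter_mm, die_area_mm2):
    """Classic estimate: area ratio minus a correction for the
    partial dies lost around the wafer edge."""
    wafer_area = math.pi * (wafer_diameter_mm / 2) ** 2
    edge_loss = math.pi * wafer_diameter_mm / math.sqrt(2 * die_area_mm2)
    return int(wafer_area / die_area_mm2 - edge_loss)

big = dies_per_wafer(300, 100)   # hypothetical 100 mm^2 die, 300 mm wafer
small = dies_per_wafer(300, 50)  # same design after a ~2x area shrink
print(big, small)  # 640 1319 -- the shrink more than doubles dies per wafer
```

Note the yield slightly more than doubles: smaller dies also waste proportionally less area at the wafer edge.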

Reply Score: 1