Linked by Thom Holwerda on Mon 20th Feb 2012 22:53 UTC
Hardware, Embedded Systems "A group of researchers has fabricated a single-atom transistor by introducing one phosphorus atom into a silicon lattice. Through the use of a scanning tunnelling microscope and hydrogen-resist lithography, Martin Fuechsle et al. placed the phosphorus atom precisely between very thin silicon leads, allowing them to measure its electrical behavior. The results show clearly that we can read both the quantum transitions within the phosphorus atom and its transistor behavior. No smaller solid-state devices are possible, so systems of this type reveal the limit of Moore's law - the prediction about the miniaturization of technology - while pointing toward solid-state quantum computing devices."
next electrons?
by fran on Mon 20th Feb 2012 23:05 UTC
fran
Member since:
2010-08-06

Could you ever make a transistor by manipulating the magnetic fields inside atoms?
You can't predict exactly where an electron is (Heisenberg), but you can estimate where it is.
Then the atom is not part of the switch but the transistor itself:
one relative pole is 0; a shift to the other is 1.

Reply Score: 2

RE: next electrons?
by Browser Insider on Tue 21st Feb 2012 17:10 UTC in reply to "next electrons?"
Browser Insider Member since:
2009-06-16

Now we need to figure out how to break apart an atom :-)

Reply Score: 1

RE[2]: next electrons?
by fran on Tue 21st Feb 2012 17:26 UTC in reply to "RE: next electrons?"
fran Member since:
2010-08-06

Let's do the patent first. That is the easiest part.
I saw some drawings of patents lately. It's easy.

We draw some stickmen, a circle and some arrows.
Ought to fly through the patent office.

Reply Score: 5

v more doom and gloom..
by Brunis on Mon 20th Feb 2012 23:15 UTC
Sub-atomic?
by CapEnt on Mon 20th Feb 2012 23:23 UTC
CapEnt
Member since:
2005-12-18

Well... we can improve this with sub-atomic particles!

Speaking seriously, it will be superb when we can mass-manufacture these things. And live long enough to see the security hell this will create too, since almost all of our current most popular cryptography algorithms can be broken (at least in theory) by quantum computers.

Reply Score: 7

RE: Sub-atomic?
by Gullible Jones on Tue 21st Feb 2012 00:02 UTC in reply to "Sub-atomic?"
Gullible Jones Member since:
2006-05-23

I suspect most governments would declare martial law as soon as they got word of quantum computation in the wild, and probably attempt to imprison or execute anyone who they thought was involved.

(Governments do not like it when the citizens spy on them. What we call whistleblowing, they call treason.)

Reply Score: 6

RE[2]: Sub-atomic?
by fran on Tue 21st Feb 2012 00:06 UTC in reply to "RE: Sub-atomic?"
fran Member since:
2010-08-06

Just set your password to a 1000-character letter/number/symbol combination. You'll be fine then.

Reply Score: 2

RE: Sub-atomic?
by satsujinka on Tue 21st Feb 2012 00:07 UTC in reply to "Sub-atomic?"
satsujinka Member since:
2010-03-11

err... Actually no. The same problems that are NP-complete for digital computers are NP-complete for quantum computers too. So aside from maybe being faster, quantum computers don't have any advantage.

Reply Score: 1

RE[2]: Sub-atomic?
by AnyoneEB on Tue 21st Feb 2012 00:27 UTC in reply to "RE: Sub-atomic?"
AnyoneEB Member since:
2008-10-26

Except all of our currently used public key cryptography algorithms can be broken by a quantum computer. See https://en.wikipedia.org/wiki/Post-quantum_cryptography . It turns out that NP-complete problems seem to be only hard in the worst case and easy in the average case, so no one knows how to use them for cryptography. Grover's algorithm does allow for faster solutions to NP-complete problems, but they remain exponential.

With a sufficiently powerful quantum computer, Shor's algorithm defeats RSA in polynomial time and a generalization of it can solve the discrete logarithm problem (DSA, Diffie–Hellman, ElGamal) in polynomial time.
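To make that concrete, here's a toy sketch (deliberately tiny, insecure numbers, for illustration only) of why factoring n is all a quantum computer needs to do - once p and q are known, recovering the RSA private key is purely classical arithmetic:

```python
def egcd(a, b):
    # Extended Euclid: returns (g, x, y) with a*x + b*y == g
    if b == 0:
        return (a, 1, 0)
    g, x, y = egcd(b, a % b)
    return (g, y, x - (a // b) * y)

def modinv(a, m):
    # Modular inverse of a mod m (requires gcd(a, m) == 1)
    g, x, _ = egcd(a, m)
    assert g == 1
    return x % m

# Public key (textbook-tiny, utterly insecure): n = p*q, exponent e
p, q = 61, 53          # exactly what Shor's algorithm would recover from n
n, e = p * q, 17       # n = 3233

# With the factors in hand, derive the private exponent d classically
phi = (p - 1) * (q - 1)
d = modinv(e, phi)     # d = 2753 for these parameters

msg = 42
cipher = pow(msg, e, n)        # "encrypt" with the public key
recovered = pow(cipher, d, n)  # decrypt with the derived private key
assert recovered == msg
```

The hard part is entirely the factoring step; everything after it is a few lines of number theory.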

Reply Score: 3

RE: Sub-atomic?
by abraxas on Tue 21st Feb 2012 02:45 UTC in reply to "Sub-atomic?"
abraxas Member since:
2005-07-07

Well... we can improve this with sub-atomic particles!

Speaking seriously, it will be superb when we can mass-manufacture these things. And live long enough to see the security hell this will create too, since almost all of our current most popular cryptography algorithms can be broken (at least in theory) by quantum computers.


There is a simple solution once you have quantum computers...quantum encryption.

Reply Score: 2

RE[2]: Sub-atomic?
by looncraz on Tue 21st Feb 2012 09:08 UTC in reply to "RE: Sub-atomic?"
looncraz Member since:
2005-07-24

The situation is even easier: quantum tagging / degradation

Once the message is read, it is destroyed. And, given a low probability of natural failure, a single disturbed secure-mode packet would signal a security breach, and a new secure protocol would be enacted.

Snooping would disrupt communications of secure data, but the information itself would be largely secure (save for the odd packet here or there...).
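That disturbance is measurable. A minimal Monte Carlo sketch of BB84-style intercept-resend (an illustrative simulation, not a real protocol implementation): an eavesdropper who measures in a random basis corrupts about a quarter of the sifted bits, while an untouched channel stays error-free.

```python
import random

random.seed(1)

def bb84_qber(n_bits, eavesdrop):
    """Simulate BB84 sifting; return the error rate Bob observes.

    Measuring a qubit in the wrong basis yields a random outcome, so an
    intercept-resend attacker disturbs ~25% of the sifted key.
    """
    errors = sifted = 0
    for _ in range(n_bits):
        bit = random.randint(0, 1)
        a_basis = random.randint(0, 1)        # Alice's preparation basis
        value, basis = bit, a_basis
        if eavesdrop:
            e_basis = random.randint(0, 1)    # Eve guesses a basis
            if e_basis != basis:
                value = random.randint(0, 1)  # wrong basis: random outcome
            basis = e_basis                   # Eve resends in her basis
        b_basis = random.randint(0, 1)        # Bob's measurement basis
        if b_basis != basis:
            value = random.randint(0, 1)
        if b_basis == a_basis:                # sifting: keep matching bases
            sifted += 1
            errors += (value != bit)
    return errors / sifted

# Without Eve the sifted key is error-free; with Eve ~25% of bits flip.
print(bb84_qber(20000, False), bb84_qber(20000, True))
```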

--The loon

Reply Score: 2

RE[3]: Sub-atomic?
by Alfman on Tue 21st Feb 2012 15:40 UTC in reply to "RE[2]: Sub-atomic?"
Alfman Member since:
2011-01-28

looncraz,

It's true: while quantum computing closes the door on conventional encryption, it opens another for quantum encryption. But unfortunately quantum encryption is not a direct substitute for PKI, leaving us to revert to point-to-point security. Without a CA, quantum-entangled material would need to be exchanged and managed between parties beforehand (think every device × website pairing; wholly unrealistic). Or the traffic would have to be routed through a trusted proxy that has a secure quantum-encrypted channel to both parties and is responsible for securing the traffic between them (more likely, but less ideal).

Also, quantum encryption suffers from the same bootstrapping issues as conventional encryption. You may have gotten a secure "quantum encryption card" in the mail, but without an out-of-band mechanism to validate its authenticity (traditionally PKI), it's vulnerable to man-in-the-middle attacks.

User <-> Service // normal secure quantum tunnel

User <-> Attacker <-> Service // the attacker impersonates the user to the service and the service to the user by swapping the quantum material, then mounts a conventional man-in-the-middle attack.

To address this with quantum encryption, one would probably need an already-secure third party to validate that the bits the user is transferring match those seen by the service. This test would need to be done at the beginning of every session. However, all this introduces more complexity and new failure modes, because there's no quantum equivalent to PKI's offline authentication.
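The "compare the bits" step is roughly what QKD deployments actually do: sacrifice a random sample of the shared bits and compare it over a classically authenticated channel. A hedged sketch, assuming a small pre-shared authentication key; the names and parameters are illustrative, not from any real protocol stack:

```python
import hashlib
import hmac
import secrets

# Hypothetical pre-shared secret; this is what bootstraps authenticity
# in place of PKI.
PSK = b"pre-shared-auth-key"

def authenticated(payload: bytes):
    # Attach a MAC so the classical channel can't be silently modified
    return payload, hmac.new(PSK, payload, hashlib.sha256).digest()

def verify(payload: bytes, tag: bytes) -> bool:
    return hmac.compare_digest(tag, hmac.new(PSK, payload, hashlib.sha256).digest())

def disturbance_rate(alice_bits, bob_bits, sample_size=64):
    """Sacrifice a random sample of the shared bits and compare them.

    A mismatch rate above the channel's natural error rate means the
    quantum exchange was observed in transit. (In a real protocol the
    sampled indices travel with the payload, also authenticated.)
    """
    idx = secrets.SystemRandom().sample(range(len(alice_bits)), sample_size)
    payload = bytes(alice_bits[i] for i in idx)
    msg, tag = authenticated(payload)          # Alice -> Bob, authenticated
    if not verify(msg, tag):
        raise RuntimeError("authentication failed: possible MITM")
    mismatches = sum(msg[k] != bob_bits[i] for k, i in enumerate(idx))
    return mismatches / sample_size

# Identical keys on both ends compare clean: disturbance_rate(key, key) == 0.0
key = [secrets.randbelow(2) for _ in range(1024)]
print(disturbance_rate(key, key))
```

Note that the MAC only authenticates the comparison; the pre-shared key problem Alfman describes is still there, just pushed down into PSK distribution.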

I'm just learning about quantum encryption so let me know if there are any errors in my understanding. ;)

Reply Score: 3

RE: Sub-atomic?
by FunkyELF on Tue 21st Feb 2012 15:48 UTC in reply to "Sub-atomic?"
FunkyELF Member since:
2006-07-26

Don't confuse this with a quantum computer; those have yet to be realized (if ever).

All this proves is that we could possibly make traditional computing devices at this scale.

Reply Score: 2

RE[2]: Sub-atomic?
by Alfman on Tue 21st Feb 2012 20:19 UTC in reply to "RE: Sub-atomic?"
Alfman Member since:
2011-01-28

FunkyELF,

"Don't confuse this with a quantum computer those have yet to be realized (if ever)."

You are right, it's possible that they won't pan out. I suppose even if we're thinking of conventional transistors though, this development of single atom transistors might make future computers fast enough to render today's cryptographic systems vulnerable to brute force attacks. Does anyone know just how fast this single atom transistor is compared to those in 32nm CPUs? That would give us a much better idea of just how future-proof current cryptographic schemes are.

Algorithms like RSA and elliptic curves can naturally be extended to any bit length desired (although most implementations have an upper limit of 4096 bits).

Unfortunately, most symmetric ciphers are only defined for limited block sizes as a matter of standardization and offer no standardized way to extend them. For example, AES is hard-coded to 128-bit blocks, with key sizes limited to 256 bits. Most of the time making the key size larger is trivial even if non-standard, and one could easily project the new parameters. For example, the number of rounds is 10 for AES-128, 12 for AES-192, and 14 for AES-256, so we might project a hypothetical AES-384 at 18 rounds and AES-512 at 22, but the fact that it's non-standard is a problem. Unfortunately, increasing the AES block size beyond 128 bits would require a whole new algorithm, since the block size is integral to the AES function.
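For reference, the FIPS-197 round counts follow the simple pattern Nr = Nk + 6, where Nk is the key length in 32-bit words (the block is fixed at Nb = 4 words). The AES-384/AES-512 rows below are pure extrapolation of that pattern, not part of any standard:

```python
def aes_rounds(key_bits: int) -> int:
    # Nr = Nk + 6 per FIPS-197 (with the block size fixed at Nb = 4)
    nk = key_bits // 32          # key length in 32-bit words
    return nk + 6

for bits in (128, 192, 256, 384, 512):
    tag = "" if bits <= 256 else "  (hypothetical, non-standard)"
    print(f"AES-{bits}: {aes_rounds(bits)} rounds{tag}")
```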

Mind you this is all just fun cryptographic theory, I don't see a very compelling need for larger AES key or block lengths today.


Blowfish, by comparison, supports key sizes up to 448 bits but has a smaller block size of 64 bits. In my opinion, that block size is too small for comfort. In theory, even without cracking the 448-bit key (which is unfathomable using conventional means), one might begin to map out 64-bit blocks directly. Given 1 GB of data (2^27 blocks), there's roughly a 0.05% chance that two blocks collide and become cryptographically weak. Each additional GB increases the odds of a collision: at 2, 3, 4, and 5 GB it grows to about 0.2%, 0.44%, 0.78%, and 1.2% respectively.

The information leaked may be rare and of little value, but it could facilitate a meet-in-the-middle attack against the key function, which if successful would decrypt the entire stream. For this reason, I think AES is better despite its smaller key size.
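The birthday-bound arithmetic behind those percentages can be checked in a few lines (an illustrative estimate; it assumes a mode of operation where a repeated 64-bit ciphertext block leaks information):

```python
import math

def collision_probability(data_bytes: int, block_bits: int = 64) -> float:
    """Birthday bound: p ~= 1 - exp(-n*(n-1) / 2^(b+1)) for n b-bit blocks."""
    n = data_bytes // (block_bits // 8)          # number of blocks encrypted
    return 1.0 - math.exp(-n * (n - 1) / 2 ** (block_bits + 1))

GB = 2 ** 30
for gb in (1, 2, 3, 4, 5):
    # 1 GB of 8-byte blocks is n = 2^27, so p grows roughly as gb^2 / 2^11
    print(f"{gb} GB: {collision_probability(gb * GB):.2%}")
```

For comparison, plugging in `block_bits=128` (AES) gives probabilities that are negligible at any practical data volume, which is the point of the larger block.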

Reply Score: 2

Think outside the atomic box
by Luke McCarthy on Tue 21st Feb 2012 00:56 UTC
Luke McCarthy
Member since:
2005-07-06

Quark computing - you heard it here first!

Reply Score: 4

RE: Think outside the atomic box
by lordepox on Tue 21st Feb 2012 04:17 UTC in reply to "Think outside the atomic box"
lordepox Member since:
2010-04-14

I can see the book possibilities now, "Bosons for Bozos."

Reply Score: 2

RE[2]: Think outside the atomic box
by fran on Tue 21st Feb 2012 18:21 UTC in reply to "RE: Think outside the atomic box"
fran Member since:
2010-08-06

or quarks for quacks

Edited 2012-02-21 18:21 UTC

Reply Score: 2

RE: Think outside the atomic box
by Neolander on Tue 21st Feb 2012 14:57 UTC in reply to "Think outside the atomic box"
Neolander Member since:
2010-03-08

If it takes a particle accelerator to build that computer, I guess power efficiency and portability won't be there, though ;)

Reply Score: 2

RE[2]: Think outside the atomic box
by zima on Thu 23rd Feb 2012 08:37 UTC in reply to "RE: Think outside the atomic box"
zima Member since:
2005-07-06

Maybe such a computer would be distantly related to its ancient ancestors, the Jupiter or matrioshka brains, but going all the way to being composed "from" (~"in") a quark star... I imagine power efficiency and portability would have a rather different scope there ;)

Reply Score: 2

disappointment
by AlienSoldier on Tue 21st Feb 2012 04:16 UTC
AlienSoldier
Member since:
2010-02-26

It will be disappointing to learn that this technology only arrives in the iPhone 85 when everybody hoped to see it in the 84S.

Reply Score: 3

Moore's law
by ndrw on Tue 21st Feb 2012 04:36 UTC
ndrw
Member since:
2009-06-30

Moore's law is not about technology; it's about the economics of IC process development. That's why the growth is exponential: better chips produce more demand, more demand brings more money, more money makes better chips - a positive feedback loop. The rate of growth was limited only by the engineering effort needed to solve a large number of non-critical issues.

As with any exponential growth, it eventually ends (it must, because neither physics nor total resources depend on money), and it ends rapidly. We don't even have to go down to the atomic scale - in many applications the exponential growth has already stopped.

Moore's law as originally formulated (transistor density doubling every X months) is still valid in applications like memories or FPGAs (which are constrained by density and are easy to scale). But in CPUs and ASICs there is hardly any performance scaling anymore; people are now focusing on incrementally optimizing performance/power instead. That's because we are no longer able to scale down the supply voltage as we did a decade ago (the transconductance of MOS transistors is fixed and mismatches are growing). For now, we can work around this by putting more CPU cores on the chip (so that we can utilize the larger density of transistors), but that's no longer an _exponential_ growth. Performance doesn't scale proportionally to the number of cores (IO, software parallelism), and complexity and cost grow faster than performance does.
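The "more cores is not exponential growth" point can be made concrete with Amdahl's law, speedup = 1 / ((1 - p) + p/N) for a workload with parallelizable fraction p on N cores: even a 95%-parallel workload saturates near 20x no matter how many cores the transistor budget buys.

```python
def amdahl_speedup(parallel_fraction: float, cores: int) -> float:
    """Amdahl's law: the serial fraction caps the achievable speedup."""
    return 1.0 / ((1.0 - parallel_fraction) + parallel_fraction / cores)

# With p = 0.95, the ceiling is 1 / (1 - 0.95) = 20x, however many cores.
for cores in (2, 4, 16, 256, 4096):
    print(f"{cores:5d} cores: {amdahl_speedup(0.95, cores):6.2f}x")
```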

Reply Score: 6

Density or Size
by stestagg on Tue 21st Feb 2012 14:56 UTC
stestagg
Member since:
2006-06-03

A one-atom transistor surrounded by 3 m^2 of machinery isn't a very high transistor density. I think there's still scope for improvement here ;)

Reply Score: 2

RE: Density or Size
by Neolander on Tue 21st Feb 2012 15:02 UTC in reply to "Density or Size"
Neolander Member since:
2010-03-08

This is not necessarily a problem for some applications, though, as long as they manage to pack a lot of stuff in that big box.

Case in point: the other day I heard about a breakthrough in quantum computers based on trapped ions. By using the near field of regular microwave antennas instead of stabilized lasers for cooling, they managed to pack ten times as many ions on a chip as before.

Now if they manage to get tons of qubits per chip and tons of chips per cryogenic vacuum chamber, they might end up with something worthwhile for HPC...

Edited 2012-02-21 15:03 UTC

Reply Score: 1

Comment by transputer_guy
by transputer_guy on Tue 21st Feb 2012 19:34 UTC
transputer_guy
Member since:
2005-07-08

This experiment was done at 0.020 Kelvin otherwise it wouldn't have worked.

I used to draw transistors for a living back when the smallest feature was 10,000 atoms wide. IBM and others have been pushing atoms around for maybe a decade or two writing their logos.

To be really interesting it would have to be a full blown logic circuit, at least an inverter chain, better still a small adder, or SRAM memory cell with the ability to read and write a few bits.

The interconnects will dominate, though; the only thing that matters is how thin a wire can be drawn that will reliably work for years. In the picture it seems a wire 5 Si atoms wide might work. The channel length is effectively 40 atoms.

I'd suspect logic circuits that look familiar might still work with 10nm or 100-atom features. Current 30nm is only 300 helium atoms across.

Reply Score: 2

We can (well I can't)
by jefro on Tue 21st Feb 2012 20:38 UTC
jefro
Member since:
2007-04-13

There will be some invention that uses strings or smaller particles, or finds some way to use 3D computing, or some other improvement.

Reply Score: 0

RE: We can (well I can't)
by zima on Thu 23rd Feb 2012 08:51 UTC in reply to "We can (well I can't)"
zima Member since:
2005-07-06

Strings are not particles, sort of by definition; and there wouldn't be anything "smaller" beyond elementary particles. 3D chips aren't about miniaturisation (plus I believe they have the usual, if not more severe, power dissipation issues).

But yeah, we can(tm) - look how we're finally, after over 2k years, on the verge of getting around Archimedes' Law!
Hm, or maybe not.

Reply Score: 2

RE[2]: We can (well I can't)
by Alfman on Thu 23rd Feb 2012 15:02 UTC in reply to "RE: We can (well I can't)"
Alfman Member since:
2011-01-28

zima,

"Strings are not particles sort of by definition; and there wouldn't be 'smaller' beyond elementary ones."

The theory, which is a clever interpretation of the statistical data we have available, could nevertheless be wrong. But yes, it seems the OP jumped to the conclusion that strings could be "reprogrammed"; that would invalidate the premise that they are already "programmed" as they are to explain the universal laws of physics. Changing them would in effect create a different universe.

"3D chips aren't about miniaturisation (plus I believe they have the usual, if not more severe, power dissipation issues)"


3D chips will undoubtedly offer tremendous gains; yes, the heat dissipation is a bummer. But what about superconductors?

Or here's another idea for thermal computers...
(no idea about the plausibility of such a thing though)

http://spectrum.ieee.org/biomedical/devices/thermal-transistor-the-...

Reply Score: 2

RE[3]: We can (well I can't)
by zima on Mon 27th Feb 2012 22:17 UTC in reply to "RE[2]: We can (well I can't)"
zima Member since:
2005-07-06

The Standard Model will NOT be "wrong" - it works too well for that, and is too useful (also http://chem.tufts.edu/AnswersInScience/RelativityofWrong.htm ).
OTOH its faults go beyond "could nevertheless be wrong" - we KNOW it's basically interim (I won't use the word "wrong"): the mass of neutrinos, not explaining what forms most of our universe (dark matter and energy), gravity (at odds with general relativity overall), the apparent absence of antimatter, and so on.

The post above was just about mixing concepts: how particles are distinct from the idea of strings, which are a sort of underlying expression of them. And an example of elementary particles (as people often name them in such wishes) being the limit, since that's how they are defined (the goalposts might move, of course - it has happened a few times - but not the definition).
It's mildly frustrating when people find some catchwords and throw them around, or naively extrapolate rates of progress (the scientific method did give us the capability to unravel and exploit the world more swiftly than throughout most of our existence - and hence also made us realize hard limits and tech plateaus; short spurts of progress are actually rather typical), and worse if it leads to cargo cults of sorts.


"3D chips will undoubtedly offer tremendous gains" ...maybe, maybe not - we will see.
Superconductors - what about them? We don't know if high-temperature ones of the types adequate here are feasible (and maybe they're simply not practical: properties of other necessary chip components getting in the way; or, say, power dissipation of interconnects not being that much of a problem; anyway, do we really want terminators walking around? ;) )

Edited 2012-02-27 22:21 UTC

Reply Score: 2