Intel scientists will announce on Thursday that they have built a prototype of a silicon chip that can switch light on and off like electricity, blurring the line between computing and communications and bringing sweeping changes to the way digital information and entertainment are delivered.
Well, it seems kind of hard to overestimate this, though the sweeping statements being made give it their best shot…
This seems to be the first definitive advance in optical computing. With this we’ll see processors with significantly lower power consumption, negligible heat dissipation, and an end to transistor leakage issues.
In this age of continuously ramped-up clock speeds and processors that dissipate more heat than a 100 W light bulb, it’s nice to see advances on fronts other than semiconductors.
The only way to buy processors is to wait until they are dirt cheap. These guys want to sell their vision. Let’s see the chips first.
This has zippo to do with processors. This is about using silicon techniques to cheapen the hideously expensive process of making optical switching equipment. They still haven’t addressed the fundamental problems with optical computing — namely that nobody has come up with good optical logic gates.
I smell efficient processors with a low heat output. *sigh*
I’m unsure, though, whether they intended this as a replacement for CPUs. They talked as if it would be the greatest thing in distributed computing, not processing.
If Intel can prove their case and demonstrate a faster chip that also demands less energy, using a different technique, then I’ll show them my credit card, but until then, I’ll stick with two-year-old processors.
The sooner we get optical processors the better. There’s never enough processing power.
— “They still haven’t addressed the fundamental problems with optical computing — namely that nobody has come up with good optical logic gates.” —
But you don’t think this is a good first step? Optical processors aren’t going to just pop out of a lab one day…
This is a step, certainly, and great for communications stuff, but it doesn’t really bring us appreciably closer to optical processors. Certainly not enough to warrant carrying on like the article does. In the near term, it might be useful for things like really fast, interference-resistant optical buses inside computers, but until the optical logic problem is resolved, we’re still miles away from optical CPUs.
Besides, this has been done before. Intel’s new scheme is just (much) faster and closer to commercialization.
“In the near term, it might be useful for things like really fast, interference-resistant optical buses inside computers”
That in itself makes it worthwhile, considering how densely packed computers are.
“They still haven’t addressed the fundamental problems with optical computing — namely that nobody has come up with good optical logic gates.”
Okay, you can officially tell me “RTFA, genius”
Although news of improved optical signaling speeds is always good to hear…
A company from Israel already announced something BETTER than this a few months back.
I don’t believe in optical microprocessors.
Optical devices can be used for communications, but the use of optical transistors leads to many problems:
– Poor efficiency of the gates.
– Plain silicon cannot be used for generating laser light.
– Propagation along optical waveguides does not allow arbitrary interconnections the way a chip’s metallisation layers do.
– These optical fibers are still quite big compared with current sub-micron technology.
– Optical devices are not that much faster. Conventional transistor speed is related (among other factors) to electron mobility in the material (which is higher in GaAs), and the electrical part of any optical gate is limited by those same electrical factors. For transmission of information, light in an optical fiber is not significantly faster than electricity in a cable (roughly two-thirds of the speed of light in vacuum, which is the same ballpark as an electrical signal in copper; see the rough numbers sketched below). It is all just electromagnetic waves at various frequencies: electrical connections propagate ‘baseband’ signals, whereas light is a carrier at hundreds of terahertz being modulated.
Optical chips would not be faster, denser, or more efficient. Light can practically be used only for interconnections, and that is all Intel is talking about.
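To put rough numbers on that propagation point, here is a minimal back-of-the-envelope sketch in Python, using assumed textbook values (refractive index ~1.5 for silica fiber, velocity factor ~0.66 for a typical copper cable). The delays over a short link come out about the same either way:

    C = 3.0e8  # speed of light in vacuum, m/s

    def delay_ns(length_m, fraction_of_c):
        # one-way propagation delay in nanoseconds
        return length_m / (fraction_of_c * C) * 1e9

    length = 0.3  # 30 cm, roughly a board-level interconnect
    print("fiber  (~0.67c):", round(delay_ns(length, 0.67), 3), "ns")
    print("copper (~0.66c):", round(delay_ns(length, 0.66), 3), "ns")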
among other things… good point
CPU speeds are plenty; they need to come up with a better everything else. How about a zero-wait-state PC? A 3.2 GHz CPU with a 3.2 GHz main memory bus. Faster hard drives? Or a switch to solid-state hard drives. When they announce technology like that, then I will get excited.
Molecular memory might be commercialized…organic computers might be coming. I hope.
http://www.theinquirer.net/?article=13377
I don’t care what comes out. But I want lower power consumption (no fans either, ideally) and far greater performance. The current trends are very disheartening.
It’s good to see AMD are going to release a 35-watt Athlon 64 though:
http://www.theinquirer.net/?article=14102
60+ watts is not nice for a cpu.
60+ watts is not nice for a cpu
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
150 W+ or even 250 W+ is a DISGUSTING waste of energy; electrical losses SHOULD only be 5-10% max (I know that is NEARLY impossible)
GHz should NOT be the deciding factor… blah blah blah… you all have heard it before
ROCKETDRIVES!!
pro: as fast as the PCI bus can handle
con: needs an external power supply / constant power (it IS RAM, after all)
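For context, “as fast as the PCI bus can handle” has a pretty low ceiling; a rough estimate for the common 32-bit / 33 MHz flavor of PCI (assumed here, and ignoring protocol overhead):

    bus_width_bytes = 4     # 32-bit PCI
    clock_hz = 33.33e6      # 33 MHz
    peak_bytes_per_s = bus_width_bytes * clock_hz
    print(round(peak_bytes_per_s / 1e6), "MB/s theoretical peak")  # ~133 MB/s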
“This has zippo to do with processors. This is about using silicon techniques to cheapen the hideously expensive process of making optical switching equipment. They still haven’t addressed the fundamental problems with optical computing — namely that nobody has come up with good optical logic gates.”
Regarding optical switching equipment, I thought Lucent technology had already sorted that issue out over a year ago.
As far as I can tell from almost zero article content, Intel has just built an optical transceiver in Si. This is a very small circuit that can act just like a buffer gate: electrical on one side, optical on the other.
Traditionally, Bell Labs/Lucent/Agere, Conexant, BT, and many others do this all the time, but it always used to be done with gallium arsenide-type circuits, which are much faster but pretty incompatible with silicon processing. GaAs and other III-V circuits have long been doing this at very high rates, at least 10G and so on. The faster the circuit, the more esoteric the technology and the further from Si tech.
A friend of mine developed Si CMOS optical transceivers for low-cost broadband using plastic cable. These parts were made for $1 a connection using standard TSMC 0.18 µm CMOS, completely compatible with any CMOS chip, but were tuned for something like 100-500 Mbit/s rates, which is fine for a plastic last mile to the home but too slow for the backbone stuff. Well, the comms industry doesn’t want slowish $1 transceivers; they want expensive fast transceivers.
Conexant bought out my friend’s company and switched them over to faster, bigger-profit designs; now they are shut down, IIRC.
What Intel needs, though, is cheap slowish transceivers to speed up the peripherals of the PC. Si will always be slower than GaAs or InP but is compatible with standard CMOS, so this is fine.
I wouldn’t get too excited yet till more work is done. I see this as the physical way to get PCI Express (2.5 Gb/s serial) onto cheap plastic optical cable for external PCI devices like RAID or other high-bandwidth devices. An optical transceiver simply replaces an LVDS transceiver and can in theory send data faster down the cable, but the problem is that LVDS circuits are already pushing Si to the limit. Optical transceivers would likely be slower than LVDS circuits even if the optical cable allows faster rates.
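A quick sanity check on that 2.5 Gb/s figure: PCI Express uses 8b/10b line coding, so only eight of every ten bits on the wire are payload. A minimal calculation (assuming a single lane, one direction):

    line_rate_bps = 2.5e9                 # bits per second on the wire, per lane
    payload_bps = line_rate_bps * 8 / 10  # after 8b/10b coding overhead
    print(payload_bps / 8 / 1e6, "MB/s per lane, per direction")  # 250.0 MB/s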
Optical I/O would have some nice side benefits: no RF emissions from leaky cables, cable length not much of a problem, secure from snooping, and it fits in well with the trend toward fully serial point-to-point architecture, so it is just a swap-in technology.
As for optical processing inside the chip, that remains as far out as ever except for special purpose imaging devices.
Intel makes great things now and then; their creations are better implemented by other companies, though. I don’t really like Itanium, sadly. I prefer a more RISC-savvy CPU… but who knows…
Sounds like the first application for this would be faster optical switches, and it may make its way into interconnect in the future, meaning faster chips/buses.
But what I don’t like about these articles is that they say it could resolve bandwidth problems in telecom… it can’t.
There are no bandwidth problems in the optical network in the US or most of Western Europe; there is a glut of capacity. The bandwidth problem is in the local loop, and this does not help with that so much.
For the actual scoop on what’s going on, I refer readers to this EE Times article, which cuts out all the sensational cr@p and sticks with the facts.
http://www.eet.com/semi/news/OEG20040211S0029
Basically, what it comes down to is that Intel has developed a high-speed optical modulator on silicon… these already exist on gallium arsenide. These will be somewhat cheaper now… they will not revolutionize the world… what they will do is make optical communications a tiny bit cheaper, as this is one component among hundreds that make up the system.
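For anyone wondering what an “optical modulator” actually does: one common textbook topology (not necessarily Intel’s exact device) is a Mach-Zehnder interferometer, where a voltage-controlled phase shift in one arm makes the two arms interfere constructively or destructively at the output, switching the light on or off. A toy model:

    import math

    def mzi_output(intensity_in, phase_shift_rad):
        # Ideal Mach-Zehnder interferometer: the relative phase between the
        # two arms sets the output; 0 rad -> fully on, pi rad -> fully off.
        return intensity_in * math.cos(phase_shift_rad / 2) ** 2

    for phi in (0.0, math.pi / 2, math.pi):
        print("phase %.2f rad -> output %.2f of input" % (phi, mzi_output(1.0, phi)))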
There are reasons why we don’t use fiber in homes: it’s fragile. You can beat copper to death. You can drag your sofa over a piece of CAT5. You can tie it in knots.
All data cables on all NASA equipment should be optical, or quantum optical. No EMF crap, and yes, kelvanic temperaturization, multiplexing, colorimetrizing(?) mean this stuff dogs out everything else to date.
I don’t expect anything purely biological will find its way into usable nanotech. Perhaps quasi-bio. Yeah, I’ve read a few books on bacteriophages and stuff. Interesting. I’d relegate it to quasi-bio, unless you’re needing biomed technology for some reason.
You’re going to see a lot of interesting things popping up in the not-so-distant future: optical or quantum-optical processing, frequency-shifting a digital satellite packet header…
The photovoltaic principle will let you plug optics right up to someone’s house, run an optical network, and stay backwards compatible with in-house copper. There are tons, literally tons, of things left unexploited. With optical processing you end up with cool CPUs; not only that, you could stack ten of them on top of each other and not generate half the wattage (or British thermal units, whichever you are most comfortable with) of today’s CPUs. People just have to get over the idea that all light is incandescent, and therefore hot.
And, yes, the industry does need to look into designing a line-repair recoupler that can fix lines without busting the bottom line. That’s necessary for fleet mobilization. I agree with the problems of the local loop; part of the way to hammer it out is in this commentary.
In theory, quantum computers can break almost all encryption algorithms we have today. Can optical processors do that?
“Regarding optical switching equipment, I thought Lucent technology had already sorted that issue out over a year ago.”
Doubtful. Not only did Intel beat the previous switching rate record but this technology allows optical switching equipment to be sold at consumer-level prices.
An optical processor.
On the principle of using lasers, you don’t really have to, in the conventional sense. As for optical gates, that’s sort of addressable several ways if you’re looking at it in a binary sense: since light, like electricity, is additive, you can derive a gate through several methods, figuring AND, OR, XOR, etc. (a rough sketch follows below).
Heck, we can grow crystals. You can do it at home. And getting the fibre junction is no problem, though I don’t think cladding is the way to go; pressure injection of liquid fibre to dry is also satisfactory.
But as for going from power to light, and from light back to power: simple. Photovoltaics is all you need.
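To make the “additive light” idea above concrete, here is a toy threshold-logic sketch, purely illustrative: intensities just add and a detector threshold decides the bit, ignoring loss, noise, and the lack of optical gain, which are exactly the real-world problems.

    def threshold_gate(a, b, threshold):
        # a, b are beam intensities: 0.0 = off, 1.0 = on; combined beams add.
        return 1 if (a + b) >= threshold else 0

    def OR_gate(a, b):  return threshold_gate(a, b, 0.5)
    def AND_gate(a, b): return threshold_gate(a, b, 1.5)

    for a in (0, 1):
        for b in (0, 1):
            print(a, b, "OR:", OR_gate(a, b), "AND:", AND_gate(a, b))

    # XOR is not a single threshold on (a + b); it takes two stages,
    # e.g. XOR(a, b) = AND(OR(a, b), NOT(AND(a, b))).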
Doing optical logic gates is *very* hard. People are having problems getting the gate types they need for CPUs (NOT is doable, NAND is much harder). Even the basic gates that they’ve got working are completely unsuitable for integrating by the millions into CPUs. Light loss is another problem for most of the current designs. A gate that causes a 50% light loss in the incoming signal is completely useless if you want to hook more than one together.
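To see why that 50% figure (3 dB per stage) kills the idea of chaining gates, here is the geometric decay of the signal with logic depth; a minimal sketch, and real CPUs have logic paths dozens of gates deep:

    surviving_fraction = 0.5   # light left after each gate (the 50%-loss case above)

    for depth in (1, 5, 10, 20):
        remaining = surviving_fraction ** depth
        print("after %2d gates: %.6f of the original light remains" % (depth, remaining))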