When National Semiconductor decided to challenge Intel and Advanced Micro Devices in the market for low-end microprocessors in 1997, CEO Brian Halla teased a group of skeptical analysts, saying they probably thought he had been sprinkling testosterone on his corn flakes. Now Halla predicts a technology transformation in which analog chips displace the zeros and ones at the heart of the binary language used in computing.
http://www.theonion.com/onion3311/microsoftpatents.html
As an electronic designer, I have to say it: this is just crap. Analogue computing has been around for a very long time, and it really can't replace digital circuitry – especially not the CPU/ASIC.
This is just talking future shite to get your name in the press – goddamn shameless, really.
Perhaps analog chips will handle part of the work in future PCs (e.g. in graphics boards). The user won't care.
More interesting is the possibility of building quantum computers. These are cool architectures which beat traditional computers in some areas (because they can make use of inherent parallelism).
Their basic information unit is the qubit, which is not 0 or 1, but a value on the (continuous) unit circle – some number from the interval from 0 to 2*pi.
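For what it's worth, the phase-on-a-circle picture is only part of the story – a qubit is a superposition a|0> + b|1> with complex amplitudes, and the continuous phase is one of its parameters. A minimal sketch (function names are mine, purely illustrative):

```python
import cmath
import math

def qubit(theta, phi):
    """Return amplitudes (a, b) of the state a|0> + b|1>.

    theta and phi are the Bloch-sphere angles; phi is the
    continuous phase in [0, 2*pi) the post above alludes to.
    """
    a = math.cos(theta / 2)
    b = cmath.exp(1j * phi) * math.sin(theta / 2)
    return a, b

def prob_zero(a, b):
    """Probability of measuring |0>, i.e. |a|^2."""
    return abs(a) ** 2

a, b = qubit(0.0, 0.0)                  # pure |0> state
a2, b2 = qubit(math.pi / 2, math.pi / 4)  # equal superposition with a phase
```

Measuring collapses the state, so that continuous phase is never read out directly – which is exactly why quantum computing is not just "analog computing, but smaller."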
Regards,
Marc
Sure, analog computers are older than digital ones…
And the binary system is so simple.
There is a reason why we use it.
To the question of whether it matters (to use analog), he answered: my mother cannot open attachments.
Can somebody tell me since when the usability of an email client is solved by switching to analog?
Typical CEO speak; he has no idea what he is talking about. If I ever hear anyone talk about how great analog computers are, I will ask them to design a simple 8-bit adder using just analog design…
I've been in this biz >20 yrs, so I have seen NS change its business model many times. I remember well their NS16000 CPU, a VAX knock-off much more interesting than the 8086 at the time. But they were in so many areas, like Moto and TI, that they had no focus and never had the spectacular growth rates of Intel. So NS dropped almost all the digital product lines, & TI went the other way, to mostly DSP. Moto took way too long to specialise, hence their current situation.
I respect NS as an analog house; many of the best analog chips were designed by NS or ex-NS designers (Fairchild, Linear Technology, etc). Only a few other companies even compete in analog, because it is considered too boring by modern EE graduates, who barely know logic design at gate level, let alone transistor theory. Most of the experts are old-timers and don't move around much.
Now, where does analog stand relative to digital? Well, all the best digital guys who design the critical circuits by hand are technically analog designers too, just using a limited subset of analog skills.
Every PC has quite a bit of analog in it: the phase-locked-loop clock circuits; all the I/O pins, driven by devices that are barely considered digital; the various power-supply regulation circuits; the onboard 300 MHz DACs of the video cards; the AD/DA converters in the 56K modems; the audio chipset. Every memory designer who works on DRAM, SRAM, or EPROM thinks entirely in the analog domain near the memory core; the result may be 1s & 0s, but only after the signals have been boosted far above a few mV. The newer buses – USB2, FireWire, SATA, fast SCSI – are all essentially analog channels.
Hard disks are almost entirely analog at the disk surface, and like all communication devices they rely heavily on error-correcting codes to turn a few "maybe a 1 or a 0" signals into definitive 1s & 0s, with certainty better than 1 part in 10^12.
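As a toy illustration of that last point (real drives use much heavier codes such as Reed-Solomon, so treat this purely as a sketch of the principle), a tiny Hamming(7,4) code turns a channel that flips the occasional bit into a reliable one:

```python
def hamming74_encode(d):
    """Encode 4 data bits [d1, d2, d3, d4] into a 7-bit codeword.

    Codeword layout (1-indexed positions): p1 p2 d1 p3 d2 d3 d4.
    """
    d1, d2, d3, d4 = d
    p1 = d1 ^ d2 ^ d4
    p2 = d1 ^ d3 ^ d4
    p3 = d2 ^ d3 ^ d4
    return [p1, p2, d1, p3, d2, d3, d4]

def hamming74_decode(c):
    """Correct any single flipped bit, then return the 4 data bits."""
    c = list(c)
    s1 = c[0] ^ c[2] ^ c[4] ^ c[6]   # checks positions 1,3,5,7
    s2 = c[1] ^ c[2] ^ c[5] ^ c[6]   # checks positions 2,3,6,7
    s3 = c[3] ^ c[4] ^ c[5] ^ c[6]   # checks positions 4,5,6,7
    err = s1 + 2 * s2 + 4 * s3       # 1-indexed error position; 0 = clean
    if err:
        c[err - 1] ^= 1
    return [c[2], c[4], c[5], c[6]]

data = [1, 0, 1, 1]
word = hamming74_encode(data)
word[3] ^= 1                          # the channel flips one "maybe" bit
recovered = hamming74_decode(word)    # the definite bits come back out
```

The analog front end delivers "probably a 1"; the code is what makes it a certain 1.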
Digital circuit design is just simplified analog circuit design!
<More interesting is the possibility of building quantum computers. These are cool architectures which beat traditional computers in some areas (because they can make use of inherent parallelism). Their basic information unit is the qubit, which is not 0 or 1, but a value on the (continuous) unit circle – some number from the interval from 0 to 2*pi.>
yea right, quantum computing is today's hot darling & much over-hyped, like tunnel diodes, Josephson junctions, and bubble memories before it.
Besides, in spite of all the issues with the power consumption of today's CPUs, CMOS still has a way to go – probably another 1000-fold increase in density over the next 10-20 yrs, but who knows how to get rid of the heat? The most interesting digital devices today, IMNSHO, are the FPGAs, which essentially allow ASICs to be designed at modest cost & effort.
It is an analog world, period.
Who are all the loudmouths here that know so much? Obviously people that know nothing of what they talk about.
If you actually read & understood the article, you would know that analog is well and truly alive, and so well packaged inside digital that you can't see it.
it's all A-to-D and then D-to-A, because humans are analog.
"Typical CEO speak; he has no idea what he is talking about. If I ever hear anyone talk about how great analog computers are, I will ask them to design a simple 8-bit adder using just analog design…"
That would be a slide rule.
"Typical CEO speak; he has no idea what he is talking about. If I ever hear anyone talk about how great analog computers are, I will ask them to design a simple 8-bit adder using just analog design…"
The fastest adder for such a short chain is ANALOG, you idiot. Ever heard of a Manchester carry chain? Know what a transfer gate is, or what a distributed RC network is? If you are one of those logic designers who synthesizes a+b for an adder, then you couldn't possibly know.
<If you actually read & understood the article, you would know that analog is well and truly alive, and so well packaged inside digital that you can't see it.>
The provoking part of the article is that analog chips will take over work which is now done by digital chips. "Goodbye to ones and zeros".
<yea right, quantum computing is today's hot darling & much over-hyped, like tunnel diodes, Josephson junctions, and bubble memories before it.>
It's laughable to compare a revolution like QC to mere technical improvements like tunnel diodes, Josephson junctions, and bubble memory. QC joins the fundamentals of computer science (which is already very mathematical and deeply connected to logic) with the fundamentals of physics.
It is really a new beast, with quite an advanced theory.
Whether it will be practical within the next 20 years – might be, might not be.
<It is an analog world, period.>
It is a quantum world, not a classical one.
Regards,
Marc
you know, early analog monitors were A LOT better at displaying colors; 32-bit color actually came some time later for digital ones.
This is also why early digital cameras sucked: no colors.
“In the process, Halla expects analog chips will displace the zeros and ones that have formed the heart of the binary language used in personal computing for most of the last couple of decades.”
The above paints a very different picture from what the rest of the article seems to be saying.
From the sounds of it, the article is arguing that the analog semiconductor market is growing but the industry is not moving to accommodate the increasing market.
This market is, however, primarily in the chips which provide some sort of network datalink layer. He talks about the move to spread-spectrum wireless network technologies, and about providing chipsets which support multiple wireless networking standards. I think no one will disagree with this; already there exists the 802.11g standard, which supports two different datalink methodologies (albeit on the same frequency).
Unfortunately, I think the article's author misunderstood these points, and instead is insinuating that there will soon be a shift within computing to allow for native processing of continuous datasets, rather than sampling a continuous dataset to produce a discrete one. This is just my interpretation… the wording within the article is so misleading and logically inconsistent that there are certainly a number of others which could be gleaned from it.
Many here seem to be interpreting the article as a case for the increasing complexity of representing a discrete dataset within a continuous medium as clock speeds rise. However, I see no forthcoming shift from this paradigm… computers will continue to process discrete datasets stored in a continuous medium.
Perhaps the article is saying that the “analogueness” of the underlying medium is becoming more pronounced as datalink methodologies grow increasingly more complex.
Regardless, this article seems pointless. Its writing was all over the place, and its non-technical discourse on this highly technical matter made it all the more confusing, and probably completely indigestible to your average Joe. I certainly didn't understand the specifics of what it was getting at, and can only venture guesses, most of which are self-evident anyway.
Man… some people are really morons who don't know what they are talking about. Analog is all about NOT using bits, but variable ranges of infinite precision (and imprecision).
You can build a simple adder using an amplifier circuit with two inputs. The equivalent of your "8-bit" value (0 to 255) could be matched with analog values from, let's say, 0 volts to 2.55* volts.
If you don’t know what you are talking about, better stay out of it.
*the value was selected to make it easier on the clueless reader, but as any analog circuit engineer could tell you, this value/range could be anything the engineer sees fit for the application.
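The mapping described above is easy to simulate. A back-of-the-envelope model (the 10 mV-per-count scale and the 2 mV noise figure are made up for illustration) also shows where analog precision limits would start to bite:

```python
import random

V_PER_COUNT = 0.01   # assumed scale: 10 mV per count, so 0..255 <-> 0.00..2.55 V
NOISE_V = 0.002      # assumed 2 mV of noise on the summing stage

def to_volts(n):
    """Map a digital value (0-255) onto the analog range."""
    return n * V_PER_COUNT

def analog_add(va, vb):
    """Ideal summing amplifier plus a little noise."""
    return va + vb + random.uniform(-NOISE_V, NOISE_V)

def to_digital(v):
    """Quantize the output voltage back to a digital count."""
    return round(v / V_PER_COUNT)

random.seed(1)
result = to_digital(analog_add(to_volts(100), to_volts(27)))
```

With 2 mV of noise against a 10 mV step, the sum always rounds back to the right count; shrink the step (i.e. demand more bits of precision) and it stops working – which is the whole design trade-off in one line.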
Because analog chips are physical items and ones and zeros are a data representation. Perhaps the author (Eugenia quoted the CNet article's author) meant that analog chips would displace digital ones? Seriously, the amount of bad grammar on the net is getting out of hand. I'm hardly picky about proper grammar, but there is a line between making an occasional mistake and just being careless.
Your first post was great, and extremely informative. It likely put more than a few people in their place. Now you're just frustrated. Remember to breathe before you type.
–JM
<The provoking part of the article is that analog chips will take over work which is now done by digital chips. "Goodbye to ones and zeros.">
Ok, so the title suggests that (silly title indeed), but the article and NS's brief is not that. NS exists to supply the analog niche that exists in every nook & cranny of the so-called digital world. It wasn't that long ago that digital $ sales displaced analog $ sales in volume – but I am showing my age; NS WAS once similar in size to Intel etc.
“It is an analog world, period. ”
"It is a quantum world, not a classical one."
Ok, from a physicist's point of view you are absolutely right, it is a quantum world, and I could say that analog is the way we as engineers perceive it. However, I have never seen or met an engineer who designs according to quantum principles; thinking analog is hard enough, and digital is just a simplification necessary to get 1000s of times more work done.
BTW, you don't see or hear the Q word used much at any circuit conferences I used to go to, & I don't think that's changed. Show me a QC computing device I can buy or see – it's 20 yrs away & more. I'm always reminded of perpetual-motion engines when I hear about QC: but actually try to get results out of one, and the thing falls apart.
I don't think anybody believes that analog computation as per the WW2 time frame is practical, but even recently split-gate CCDs (mostly analog) were used for ASP (analog signal processing), ie FFTs, filters, etc. But the world would rather keep the fabs full with n-million-gate DSPs than smaller direct ASPs – but that's another story.
"you know, early analog monitors were A LOT better at displaying colors; 32-bit color actually came some time later for digital ones."
yea right, all tube monitors are analog, period, and they have gotten better, not worse, with better, cheaper, higher-bandwidth analog ICs. What lies between the monitor's RGB video buffers and the video board's DACs is usually just cable. There is quite a bit of digital stuff inside most monitors, though, to help improve the product.
"This is also why early digital cameras sucked: no colors."
yea right, the early digital cameras sucked because they were 1st generation, ie much lower resolution than recent offerings. Also, if you are referring to CMOS cameras, then they double sucked and still do, except for cheapo webcams. The modern CCDs are still basically the same as the old ones, just with much better resolution, ie far more pixel sites, better noise, etc. Again, all digital cameras ARE ANALOG: they just convert varying photon counts to electron charge, then scan & convert the analog signal stream to digital for further processing, enhancement, compression, storage, transmission, etc etc etc.
Regards
JJ
I saw Halla's keynote (http://www.comdex.com/news/fall2002/index.php?d=keynotes&s=common&c…) on this, along with other topics. He's no dummy; he's a techie in a high management position. A lot of what he talked about were visualization systems; it just so happens that feature sizes on chips are getting down to around the wavelength of visible light. So we can do for visual signal processing what occurred for analog music 20 years ago. He has products that use this technology (he gives a good example of a digital camera with 9x the resolution you would normally get).
Anyway just listen to the speech.
<BTW, you don't see or hear the Q word used much at any circuit conferences I used to go to, & I don't think that's changed. Show me a QC computing device I can buy or see – it's 20 yrs away & more. I'm always reminded of perpetual-motion engines when I hear about QC: but actually try to get results out of one, and the thing falls apart.>
What they have in the labs are bizarre, large apparatus (for example NMR spectroscopes) that can realize less than a dozen qubits. If no one figures out how to create systems with large numbers of qubits economically, this will never leave the physics labs.
Regards,
Marc
I think that some are missing the idea of what analog means here.
No one in their right mind would suggest that an adder built with analog techniques would produce a voltage output with 256 discrete levels. The idea of multi-level logic never got much further than a fad, but there are still places where it finds uses.
What I am saying is that the fastest circuits are built as transistor-level circuits that are in essence analog circuits which produce digital results as quickly as possible. Saying that a digital circuit is not in actuality an analog circuit is moronic. When 2 digital designers design a small circuit, the one with more device/analog knowledge is going to produce the faster, smaller, lower-power design. Analog design doesn't just mean op-amp design; it includes all low-level digital design, even if there are only 2 final desired states.
Even the lowly inverter is also a linear amplifier, though it isn't usually used as one. But the same principles still apply when that inverter switches state. As for the fastest adder designs at short length, the Manchester carry chain is still a distributed switched RC network with less total delay than pure simple-gate designs, and it is always treated as analog.
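For the curious, the delay of such a distributed RC chain is usually estimated with the Elmore approximation: sum, over every node, that node's capacitance times the total resistance between it and the driver. A quick sketch with made-up per-stage values (100 ohms and 10 fF per pass stage are illustrative, not real process numbers):

```python
def elmore_delay(stages):
    """Elmore delay of an RC ladder.

    stages is a list of (R, C) pairs from driver to far end; each
    node's C sees the cumulative R between itself and the driver.
    """
    delay = 0.0
    r_cumulative = 0.0
    for r, c in stages:
        r_cumulative += r
        delay += r_cumulative * c
    return delay

# A hypothetical 8-stage carry chain: 100 ohms and 10 fF per stage.
chain = [(100.0, 10e-15)] * 8
t_chain = elmore_delay(chain)      # delay of the full 8-stage chain
t_half = elmore_delay(chain[:4])   # delay of a 4-stage chain
```

Note the roughly quadratic growth with chain length – the full chain is well over twice as slow as the half chain – which is exactly why real Manchester chains get re-buffered every few stages.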
Now in Chill mode
"Saying that a digital circuit is not in actuality an analog circuit is moronic."
There is a nice example of how a digital circuit is of course in reality an analog circuit:
Some crazy guy used evolutionary optimization (like genetic algorithms) on a couple of programmable logic circuits (FPGAs) to evolve a design that should recognize the spoken words "yes" and "no". He ended up with some weird circuits that, for example, had spiral-like structures that would only make sense if some wave effect were at play. Also, the designs worked only on individual FPGAs, not across a whole series – thus they exploited features of the individual chips.
So what seems to have happened is that his optimization run did not restrict itself to his intended search/optimization space of well-defined discrete logic chips, but rather ranged over the full physical (and thus analog) properties of the real FPGA chips.
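For readers unfamiliar with the technique (this sounds like Adrian Thompson's evolvable-hardware work), the search loop itself is dead simple. Here is a toy (1+1)-style sketch on a plain bitstring, with a stand-in fitness function – the real experiment scored the chip's actual output, which is precisely how the analog physics leaked into the result:

```python
import random

random.seed(42)
GENOME_LEN = 32   # stand-in for an FPGA configuration bitstream

def fitness(bits):
    """Toy stand-in for 'how well does this configuration work':
    here we just count 1s; the real experiment scored audio output."""
    return sum(bits)

def mutate(bits, rate=0.05):
    """Flip each bit independently with a small probability."""
    return [b ^ (random.random() < rate) for b in bits]

# Simple (1+1) evolutionary loop: keep the mutant if it scores no worse.
best = [random.randint(0, 1) for _ in range(GENOME_LEN)]
for _ in range(2000):
    child = mutate(best)
    if fitness(child) >= fitness(best):
        best = child
```

The optimizer only ever sees scores, never "logic" – so if some undocumented analog quirk of one particular chip raises the score, evolution happily uses it.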
Regards,
Marc
The article is interesting, but the fact is that the opposite is happening in circuit design. Engineers and OEMs are anxious to get rid of as much analog as they can.
Just look at cell phones and everything radio. Every year designers absorb more formerly-analog functionality into digital land. How far are we from a software-defined radio that keeps analog to a minimum? Probably about 5 years.
Now, that said, the most efficient storage process in the world is not digital, it is chemical, and it is called DNA – so the idea that the means of computing will change and not necessarily be so dependent on 0s and 1s is definitely interesting to think about.
this is worse than the post about why the internet isn't multiuser. osnews editors, please don't sink to the level of posting every little stupid piece of crap like some other IT websites i could mention…
*cough* slashdot *cough*
Theoretically, analog systems have the potential for higher performance, but they probably will not replace any significant portion of the digital space any time soon. The reasons: design and testability. Designing analog systems is tough business; digital systems design is comparatively easier. As for testing: analog systems are much more susceptible to crosstalk, interference, etc. When your design/testing cycle grows, your time to market is delayed.
"you know, early analog monitors were A LOT better at displaying colors; 32-bit color actually came some time later for digital ones."
Any CRT you have that is connected to a DB15 port on your video card is analog. Only recently have there been moves to provide digital output on video cards, and those connectors only work on the newest LCD monitors. So, realistically, digital video output from PCs has only been around for ~18 (24 tops) months.
I'm really confused how you think that either a digital or analog monitor is better at displaying 32-bit color. The last 8 bits of 32-bit color are the alpha channel, and that information is used by software to calculate how transparent a given area is. The monitor couldn't care less.
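To spell out what the software does with those 8 alpha bits before anything reaches the monitor, a standard "over" blend of two 8-bit channel values looks roughly like this (integer form, sketched for illustration):

```python
def blend(src, dst, alpha):
    """'Over' blend of two 8-bit channel values.

    alpha in [0, 255]: 255 = fully opaque source, 0 = fully
    transparent. This happens in software/GPU, never in the monitor.
    """
    return (src * alpha + dst * (255 - alpha)) // 255

# Half-transparent white pixel composited over a black background:
pixel = blend(255, 0, 128)
```

By the time the DAC (or DVI link) sees the pixel, the alpha is gone – only the blended RGB survives.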
Actually, early PC monitors were DIGITAL; the CGA standard used digital signals to encode color information. I think analog signaling was introduced with VGA, due to the larger color space and so on.
To the people who called me a moron earlier: I understand that some of you just had your first analog EE class, and you are all excited and whatnot. Sure, now you have played with op-amps and… oh my god, you actually know that digital circuits are analog devices once you think about it – congratulations! But most of you missed the actual point of my post. I never claimed that digital was not analog; the post simply pointed to the fact that a computer is a PROGRAMMABLE machine, and purely analog designs may be faster and more accurate and whatnot, but they are still inherently harder to both design and test. And sometimes dealing with the programmability of analog components can be a challenge. Thus, producing a programmable machine using purely analog design approaches can be difficult at best… There is an actual reason for digital design; stay in school a couple more years and you may actually find out.
Cheers…
It took years for computers to fit in small boxes instead of big rooms. Geez, quantum computing research is progressing, I don’t think you can dis it so easily.
"Theoretically, analog systems have the potential for higher performance, but they probably will not replace any significant portion of the digital space any time soon. The reasons: design and testability. Designing analog systems is tough business; digital systems design is comparatively easier. As for testing: analog systems are much more susceptible to crosstalk, interference, etc. When your design/testing cycle grows, your time to market is delayed."
True, although I think one can take a few lessons from biological systems, on how to deal with some of those issues.
"Now that said, the most efficient storage process in the world is not digital, it is chemical, and it is called DNA…."
DNA's usable information is most definitely digital, stored in a sequence of four different kinds of nucleotides. Also, it is not the most efficient storage process in the world: scanning tunneling microscopes allow for the placement of single atoms.
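Indeed, four symbols is exactly 2 bits per base, which is easy to demonstrate (the particular A/C/G/T-to-bits mapping below is arbitrary, chosen just for illustration):

```python
# Each nucleotide carries exactly 2 bits of information.
BASE_BITS = {"A": 0b00, "C": 0b01, "G": 0b10, "T": 0b11}

def dna_to_bits(seq):
    """Pack a nucleotide sequence into an integer, 2 bits per base."""
    value = 0
    for base in seq:
        value = (value << 2) | BASE_BITS[base]
    return value

packed = dna_to_bits("GATTACA")   # 7 bases -> 14 bits
```

Continuous chemistry underneath, but the information content is as digital as it gets.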
Talking about an 8-bit analog adder is like doing all your normal, everyday written math in binary. You don't do it like that; you use decimal. Same sort of thing. Analog doesn't *work* in bits; it uses actual quantities. So an actual analog adder would be two potentiometers, or n pots, together in series, with an ohmmeter recording the total resistance. R1 on pot 1 + R2 on pot 2 = Rt on the reading. There.
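That pots-in-series adder is trivial to model; a minimal sketch (the ohms-per-unit scale is arbitrary, as any such scale would be in a real build):

```python
OHMS_PER_UNIT = 100.0   # assumed scale: 100 ohms represents the quantity "1"

def pot(value):
    """Dial a quantity onto a potentiometer as a resistance."""
    return value * OHMS_PER_UNIT

def series_total(*resistances):
    """Resistances in series simply add -- that is the whole adder."""
    return sum(resistances)

def read_out(r_total):
    """The 'ohmmeter reading' converted back to a quantity."""
    return r_total / OHMS_PER_UNIT

# 3.5 + 4.25 on two pots in series, read off the meter:
result = read_out(series_total(pot(3.5), pot(4.25)))
```

No bits anywhere – the sum exists as a physical quantity, exactly as the parent post says.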