Linked by Thom Holwerda on Sun 10th Jun 2018 23:31 UTC

The IRS has a lot of mainframes. And as millions of Americans recently found out, many of them are quite old. So as I wandered about meeting different types of engineers and chatting about their day-to-day blockers I started to learn much more about how these machines worked. It was a fascinating rabbit hole that exposed me to things like "decimal machines" and "2 out of 5 code". It revealed something to me that I had not ever considered:

Computers did not always use binary code.

Computers did not always use base 2. Computers did not always operate on just an on/off value. There were other things, different things that were tried and eventually abandoned. Some of them predating the advent of electronics itself.

I've often wondered why computers are binary to begin with, but it was one of those stray questions that sometimes pops up in your head while you're waiting for your coffee or while driving, only to then rapidly disappear.

I have an answer now, but I don't really understand any of this.

non-binary
by jessesmith on Sun 10th Jun 2018 23:35 UTC

Member since:
2010-03-11

I remember learning about (but never using) non-binary computers in college. The basic idea seemed to be that you could store multiple voltage levels and use those levels to represent base 3, base 8, base 10, or whatever you wanted. The problem tended to be that multi-level voltage systems were not always reliable (tended to burn out) and error prone (you might store a "3" and get back a "4"). The binary on/off approach was, as I understand it, more reliable and won out over the other systems.
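The reliability problem described above is easy to see in a toy model: with a fixed supply voltage, packing more levels into a cell shrinks the gap between adjacent levels, so the same amount of noise causes far more read errors. A purely illustrative Python sketch (the 5 V supply and the noise figure are made-up numbers, not from any real device):

```python
# Toy model: store a digit as one of `base` evenly spaced voltage levels,
# read it back through Gaussian noise, and decode to the nearest level.
# More levels per cell = smaller gaps = more "stored a 3, got back a 4".
import random

VCC = 5.0  # assumed supply voltage, illustrative only

def store_and_read(digit, base, noise):
    """Encode `digit` as a voltage level, add noise, decode to nearest level."""
    step = VCC / (base - 1)
    voltage = digit * step + random.gauss(0, noise)
    return max(0, min(base - 1, round(voltage / step)))

def error_rate(base, noise, trials=10_000):
    """Fraction of store/read cycles where the decoded digit differs."""
    errors = 0
    for _ in range(trials):
        d = random.randrange(base)
        if store_and_read(d, base, noise) != d:
            errors += 1
    return errors / trials

random.seed(1)
for base in (2, 10):
    print(f"base {base}: error rate ~ {error_rate(base, noise=0.8):.1%}")
```

With the same noise, the base-2 cell almost never misreads while the base-10 cell fails constantly, which is the trade-off the comment describes.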

RE: non-binary
by Alfman on Mon 11th Jun 2018 06:43 UTC in reply to "non-binary"
Member since:
2011-01-28

jessesmith,

I remember learning about (but never using) non-binary computers in college. The basic idea seemed to be that you could store multiple voltage levels and use those levels to represent base 3, base 8, base 10, or whatever you wanted. The problem tended to be that multi-level voltage systems were not always reliable (tended to burn out) and error prone (you might store a "3" and get back a "4"). The binary on/off approach was, as I understand it, more reliable and won out over the other systems.

On/off bits are the easiest form of discrete logic to implement because there's only one critical threshold. At the other extreme sit analog computers. Obviously many algorithms need exact discrete values, for which analog is unsuitable, but where an estimate will do, analog adding circuits can be made significantly faster & denser than binary ones.

Many operations can be done directly in analog form. Instead of large bit buses and full adder circuits, which add values bit by bit while cascading carry bits, we might just use a single wire carrying an analog voltage, adding/subtracting signals using opamps.

https://www.electronics-tutorials.ws/opamp/opamp_1.html
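For contrast, the bit-by-bit, carry-cascading addition mentioned above can be sketched in a few lines of Python (a model of the logic only, not of any real chip):

```python
def full_adder(a, b, carry_in):
    """One-bit full adder: returns (sum_bit, carry_out)."""
    sum_bit = a ^ b ^ carry_in
    carry_out = (a & b) | (carry_in & (a ^ b))
    return sum_bit, carry_out

def ripple_carry_add(x, y, width=8):
    """Add two integers bit by bit, LSB first, cascading the carry.
    The serial carry dependency is what makes wide binary adders slow."""
    result, carry = 0, 0
    for i in range(width):
        s, carry = full_adder((x >> i) & 1, (y >> i) & 1, carry)
        result |= s << i
    return result  # final carry is discarded, so results wrap mod 2**width

print(ripple_carry_add(100, 55))  # 155
```

An analog adder produces its sum in one step as a voltage; the digital version above needs `width` sequential carry propagations.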

Division, which is a relatively time consuming operation in binary, can be achieved in analog using voltage divider networks.
https://en.wikipedia.org/wiki/Voltage_divider
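The divider relation is just Vout = Vin * R2 / (R1 + R2), so a ratio of resistances performs the division "for free". A quick sketch (component values are made up for illustration):

```python
def voltage_divider(v_in, r1, r2):
    """Output of a two-resistor divider: Vout = Vin * R2 / (R1 + R2)."""
    return v_in * r2 / (r1 + r2)

# Dividing a 9 V signal by 3 means keeping 1/3 of it: pick R2 = R1 / 2.
print(voltage_divider(9.0, 2000, 1000))  # 3.0
```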

Obviously analog was used for high-speed circuits long before digital circuits became fast enough to handle them with discrete logic. Now virtually all analog computers have been replaced:

https://en.wikipedia.org/wiki/Analog_computer

We've focused heavily on digital algorithms and problem solving. Yet I wonder if there might be some useful applications in the future for analog computation. Theoretically an analog graphics card could be faster and use less power than a digital one. Also, our brain's neurons are not digital but analog. The neural nets that we use in AI applications use floating point, but conceivably could run faster on an analog computer.

Apparently some people in the research community are trying it:
https://insidehpc.com/2018/02/mit-helps-move-neural-nets-back-analog...

Edited 2018-06-11 06:45 UTC

RE[2]: non-binary
by kuiash on Mon 11th Jun 2018 16:29 UTC in reply to "RE: non-binary"
Member since:
2018-05-21

Nicely explained.

Here's Scanimate. An analogue video processor. Awesome!

Any synth from the 70's stuffed full of CEM/SSM chips is pure analogue "computation" heaven - especially if controlled by something like the Korg SQ-10 (analogue step sequencer). Even the SID chip is part analogue with digital control.

Apparently there have been a few attempts at balanced ternary computer systems. Never, ever seen one.

RE[3]: non-binary
by Alfman on Mon 11th Jun 2018 17:34 UTC in reply to "RE[2]: non-binary"
Member since:
2011-01-28

kuiash,

Nicely explained.
Here's Scanimate. An analogue video processor. Awesome!

That's really fascinating. Here's another video with the engineers explaining Scanimate.

RE[3]: non-binary
by zima on Wed 13th Jun 2018 21:37 UTC in reply to "RE[2]: non-binary"
Member since:
2005-07-06

Scanimate reminded me about Atari Video Music which apparently was also analog:
https://en.wikipedia.org/wiki/Atari_Video_Music
Best part of that Wiki article:

According to Atari design engineer, Al Alcorn, when Atari was on tour promoting the device, a Sears representative asked what the developers were smoking when they invented it. With that, a technician stepped forward holding up a lit joint.[8]

:D

Apparently there have been a few attempts at balanced ternary computer systems. Never, ever seen one.

Check out Wiki links I gave in a reply to Alfman just below.

RE[2]: non-binary
by zima on Wed 13th Jun 2018 21:35 UTC in reply to "RE: non-binary"
Member since:
2005-07-06

Balanced ternary, as in https://en.wikipedia.org/wiki/Setun (and from https://en.wikipedia.org/wiki/Ternary_computer "it had notable advantages over the binary computers which eventually replaced it, such as lower electricity consumption and lower production cost.[3]" ...it might have merit), or negabinary (base -2, Polish Elwro computers: https://en.wikipedia.org/wiki/UMC_(computer) ), seem easy enough to have been implemented quite early in the computer revolution, when every electronic component was at a premium; and -1,0,1 seems similarly reliable to binary (what jessesmith was focusing on in his comment).
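Both systems are easy to play with in software. A sketch of integer conversion to balanced ternary (digits -1, 0, +1, written here as '-', '0', '+') and to negabinary (base -2):

```python
def to_balanced_ternary(n):
    """Represent n with digits -1, 0, +1 (printed as '-', '0', '+')."""
    if n == 0:
        return "0"
    digits = []
    while n != 0:
        n, r = divmod(n, 3)
        if r == 2:          # digit 2 becomes -1 with a carry into the next place
            r = -1
            n += 1
        digits.append("+0-"[1 - r])
    return "".join(reversed(digits))

def to_negabinary(n):
    """Represent n in base -2; negative numbers need no sign digit."""
    if n == 0:
        return "0"
    digits = []
    while n != 0:
        n, r = divmod(n, -2)
        if r < 0:           # force the remainder into {0, 1}
            r += 2
            n += 1
        digits.append(str(r))
    return "".join(reversed(digits))

print(to_balanced_ternary(8))   # "+0-"  (9 - 1)
print(to_negabinary(-3))        # "1101" (-8 + 4 + 0 + 1)
```

Note how neither representation needs a separate sign: negation in balanced ternary is just swapping '+' and '-', and negabinary encodes negative values with the same digit set.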

As for analog computers: aren't they harder to "reprogram"? It requires rewiring them; there's nothing like an Arduino, I think...
(BTW, besides Setun, the Soviets were also very much into analog computers IIRC)

RE[3]: non-binary
by Alfman on Thu 14th Jun 2018 01:27 UTC in reply to "RE[2]: non-binary"
Member since:
2011-01-28

zima,

As for analog computers: aren't they harder to "reprogram"? It requires rewiring them; there's nothing like an Arduino, I think...

You can route analog signals similarly to digital ones. They sell analog latches and switches in IC form such as this one:

http://www.datasheetcatalog.com/datasheets_pdf/D/G/5/0/DG5043CJ.sht...

Even plain conventional bipolar junction transistors could work.
https://en.wikipedia.org/wiki/Bipolar_junction_transistor

Arguably some CMOS logic gates are actually better at analog than digital. Take a CMOS hex inverter, which we think of as ON or OFF, but in its transition region the output is roughly VCC - input, with a fair amount of linearity.

MOSFET, which came about later, is better for digital logic and worse for analog.

You mention Arduino. If you look at the block diagrams for the AVR chips used in Arduino boards, you'll see that the analog pins share a single ADC, with the input signals routed through a dynamically programmable channel mux.

With all this in mind, it could be possible to have a kind of analog FPGA, routing analog signals rather than digital ones; the analog paths could be rewired programmatically in the FPGA fabric. You'd lose accuracy, but gain speed. DAC and ADC conversions are relatively slow, however, so it would be most useful in projects where the input and output are both analog. DSP applications come to mind, although it's ironic given what the acronym stands for.

(BTW, besides Setun, the Soviets were also very much into analog computers IIRC)

If you've got an article, submit it to osnews!

RE[4]: non-binary
by zima on Sun 17th Jun 2018 23:50 UTC in reply to "RE[3]: non-binary"
Member since:
2005-07-06

This ~electronic stuff seems way over my head... Maybe one day.

>(BTW, besides Setun, the Soviets were also very much into analog computers IIRC)

If you've got an article, submit it to osnews!

It was just a tidbit I read somewhere once...

I do know that Soviet/Russian rockets were for the longest time analog, even with some curious "limitations" of sorts stemming from simplicity. For example, the R-7 family of rockets (the most successful rocket family ever, with close to 2000 launches - starting with the R-7, the first ICBM, through the Sputnik launcher and the Vostok and Voskhod rockets, to the current Soyuz launchers: https://en.wikipedia.org/wiki/File:Roket_Launcher_R-7.svg ...yes, first launch over 60 years ago; considering a new launch complex was built just a few years ago in French Guiana, a century of service seems well within the R-7's grasp) originally couldn't change inclination in flight, couldn't "turn"/roll... what turned for the desired orbit was the launch platform itself (though the latest-generation digital Soyuz can turn; the Soyuz launch complex in Kourou in French Guiana has a fixed launcher).

But the rockets and their control systems are very robust, able to launch in almost hurricane-level gusts of wind (common on the steppes of central Asia) and in all extremes of temperature and humidity (summer and winter in Kazakhstan, the far north of the Plesetsk cosmodrome, Kourou near the equator).

And the R-7 control system allows for a marvellously simple way of jettisoning the booster rockets: they aren't attached by any hardpoints; the rocket simply "hangs" on them as they push it forward, and at the end the control system cuts the flow of propellant to the booster engines at a precisely synchronised moment, so they all fall away at exactly the same time, without a glitch (forming https://en.wikipedia.org/wiki/R-7_(rocket_family)#Korolev_Cross )

Though all that is not really OSNews material...

Edited 2018-06-17 23:51 UTC

Comment by Delgarde
by Delgarde on Mon 11th Jun 2018 09:40 UTC

Member since:
2008-08-19

One of the more interesting bugs I remember encountering was as a junior developer sitting with an experienced guy... the database had added a bunch of numbers and, instead of summing to 3.0, had somehow produced a total of "2.10"... that is, the second 'digit' was more than 9.

This was my introduction to binary-coded decimal...
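Packed BCD stores one decimal digit per 4-bit nibble, so a nibble can legally hold 0-9 but physically has room for values up to 15; adding BCD values with plain binary arithmetic (skipping the decimal-adjust step) can leave an "impossible" digit like the one above. A Python sketch of the idea (illustrative only, not the database's actual encoding):

```python
def bcd_encode(n):
    """Pack each decimal digit of n into its own 4-bit nibble."""
    out = 0
    for shift, digit in enumerate(reversed(str(n))):
        out |= int(digit) << (4 * shift)
    return out

def bcd_decode(b):
    """Read nibbles back as decimal digits; a nibble > 9 is invalid BCD."""
    digits = []
    while b:
        digits.append(b & 0xF)
        b >>= 4
    return list(reversed(digits)) or [0]

print(hex(bcd_encode(42)))  # 0x42 -- the hex digits spell the decimal number
# Plain binary addition on BCD values, without decimal adjustment, goes wrong:
print(bcd_decode(bcd_encode(15) + bcd_encode(15)))  # [2, 10] -- a "digit" of ten
```

15 + 15 should be 30, but the unadjusted sum decodes as a 2 followed by a digit of ten, much like the "2.10" total above.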

Edited 2018-06-11 09:45 UTC

Yes, I'm old.
by fretinator on Mon 11th Jun 2018 13:10 UTC

Member since:
2005-07-06

I actually used an analog computer in a work setting. I performed pulmonary function tests, and our test for airway resistance used an analog computer to analyze the waveforms produced by panting in a closed box (plethysmography).

RE: Yes, I'm old.
by Alfman on Mon 11th Jun 2018 15:09 UTC in reply to "Yes, I'm old."
Member since:
2011-01-28

fretinator,

I actually used an analog computer in a work setting. I performed pulmonary function tests, and our test for airway resistance used an analog computer to analyze the waveforms produced by panting in a closed box (plethysmography).

That's awesome. I've never seen an analog computer myself; it would be great to play with one to understand it better. I've never even used a slide rule, which is a sort of analog multiplier/divider, but my parents did.

https://en.wikipedia.org/wiki/Slide_rule
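A slide rule multiplies by adding lengths on logarithmic scales, since log(a) + log(b) = log(a*b). A quick sketch of the principle (with limited precision, much like the real thing):

```python
import math

def slide_rule_multiply(a, b):
    """Slide one log scale along the other: adding the two log-lengths
    and reading the result back off the scale multiplies the numbers."""
    return math.exp(math.log(a) + math.log(b))

def slide_rule_divide(a, b):
    """Division is the same motion in reverse: subtract the log-lengths."""
    return math.exp(math.log(a) - math.log(b))

print(round(slide_rule_multiply(2, 3), 6))  # 6.0
print(round(slide_rule_divide(10, 4), 6))   # 2.5
```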

Understanding binary
by Vanders on Mon 11th Jun 2018 17:36 UTC

Member since:
2005-07-06

I've often wondered why computers are binary to begin with, but it was one of those stray questions that sometimes pops up in your head while you're waiting for your coffee or while driving, only to then rapidly disappear.

I have an answer now, but I don't really understand any of this.

I cannot recommend the book Code: The Hidden Language of Computer Hardware and Software by Charles Petzold highly enough. Go, now.