Linked by Thom Holwerda on Fri 27th Dec 2013 20:03 UTC
Hardware, Embedded Systems

The CDC6600 and its family members are part of computer industry history. A decade before the Cray 1, the members of the CDC6000 family were not only among the most expensive and most powerful systems of their time, they were also lean and wonderful architectures! That elegance was conveyed in several publications (many by CDC), where all the necessary knowledge was explained from the ground up. Everything you needed to know was supplied and clearly laid out, not just hints for efficient programming; basically, you could rebuild your own computer by reading these books. 50 years later, they are invaluable reminders and tools: we can see where the computer industry comes from and realise that it is not that hard to do it ourselves, too.

Via HackerNews. This is an amazing resource.

Thread beginning with comment 579550
by transputer_guy on Fri 27th Dec 2013 23:40 UTC

The CDC was a little bit before my time. At Uni I used DEC KL10s fronted by older ICL1904s for I/O support, i.e. card readers and line printers, and programmed in Algol 60, BCPL, Fortran and, especially, lovely Asm.

Not long after graduating I heard rumors of a KL10 in a suitcase: some company other than DEC had reduced a large room of ECL hardware into a custom chip. Today that would be an almost trivial exercise for any computer architecture graduate with FPGAs, but the software is another question: what would be interesting enough to run on it?

The other thing to appreciate is code size. Today I notice that several Mac apps like iTunes and Acrobat Reader run to hundreds of MB of code, while back in the day the entire mainframe memory would have been a MB or so, if that, and possibly not even semiconductor, i.e. ferrite core or wire memory. Even the large platter drives might only have stored a few MB each.

Knowing all this sort of holds me back now; I can't stomach the thought of why code is so bloated. What is going on? Is it all just 512^2 pixel maps to make it look darn pretty?
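To put the 512^2 guess in perspective, here is a quick back-of-the-envelope calculation (assuming 4 bytes per pixel for RGBA, which is an assumption, not a measured figure):

```python
# Memory footprint of a single 512x512 bitmap, assuming 4 bytes
# per pixel (RGBA). The pixel format is an assumption.
width = height = 512
bytes_per_pixel = 4  # RGBA

bitmap_bytes = width * height * bytes_per_pixel
print(bitmap_bytes)                  # 1048576 bytes
print(bitmap_bytes / (1024 * 1024))  # 1.0 -- exactly one MiB
```

One such bitmap is a megabyte, i.e. roughly the entire main memory of the mainframes mentioned above, which is one concrete reason graphics-heavy apps dwarf old code.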

Reply Score: 4

RE: nice
by acobar on Sat 28th Dec 2013 00:48 in reply to "nice"

Perhaps programs were short because they were coded to do one thing and do it well, as computing time was expensive and you had to allocate time to use terminals or card punches. I remember people trying every crazy optimization technique they could devise.

Things have changed a lot: most coders don't try to optimize anything, and frequently just ask for faster computers so that their code, which sometimes should have been improved instead, can run at an acceptable pace.

There is also the code related to GUIs and input validation. Where invalid data used to produce garbage output, we now have all kinds of checks, and of course they have a cost.

And last, but not least, people put all sorts of things, even barely justified ones, into their code for the sake of "completeness", whatever that may mean in their minds.
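The validation cost mentioned above is easy to see in miniature. A hedged sketch (the function names are illustrative, not from any real codebase): the "old" path just converts its input and produces garbage or a crash on bad data, while the "modern" path adds the checks.

```python
def parse_age_unchecked(s):
    # Old-school: assume the input is valid.
    # Garbage in, garbage (or a crash) out.
    return int(s)

def parse_age_checked(s):
    # Modern: validate before use -- extra code, extra cost.
    s = s.strip()
    if not s.lstrip("-").isdigit():
        raise ValueError(f"not an integer: {s!r}")
    age = int(s)
    if not 0 <= age <= 150:
        raise ValueError(f"age out of range: {age}")
    return age
```

The checked version is several times longer for the same happy path, and that ratio repeats across every input a GUI accepts.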

Reply Parent Score: 5

RE: nice
by tylerdurden on Sat 28th Dec 2013 01:04 in reply to "nice"

"I can't stomach to think why code is so bloated"

That's because there is nothing to stomach. Modern software does orders of magnitude more than trivial manipulations of very limited data sets requiring just a little sequential I/O.

Increased functionality leads to increased productivity, with increased complexity as well... since nothing is "free" in this universe ;-)

Yes, plenty of modern software sucks. But so did plenty of binary fodder from "the good ol' days."

Edited 2013-12-28 01:08 UTC

Reply Parent Score: 2

RE: nice
by terra on Sat 28th Dec 2013 02:05 in reply to "nice"

I do not think that in the ol' days anyone could have built so many features into one application like the ones you have mentioned, whatever the resources. You can still program in assembly language and write a program that does only a couple of things in a fraction of the size. But do you think it is worth it?

Reply Parent Score: 4

RE: nice
by pooo on Sat 28th Dec 2013 06:49 in reply to "nice"

Give me a break.

Reply Parent Score: 1

RE: nice
by Drumhellar on Sun 29th Dec 2013 03:11 in reply to "nice"

"Knowing all this sort of now holds me back, I can't stomach to think why code is so bloated, what is going on, is it all just 512^2 pixel maps to make it look darn pretty."

Well, in the case of iTunes for OSX, the actual executable plus bundled libraries is about 64MB - and contains a boatload of features: all the different views, support for every iPod (even going back to the original), support for the store, managing the music database, music encoding and decoding, error handling, parental controls, a dynamic and customizable interface supporting a number of view modes, and more. And it also supports two architectures - i386 and x86_64.

Is it bloated? And, if so, is it bloated from poor code density, or is it bloated because it contains features you don't use?

The bulk of it is localization: 34 languages are supported, totaling 139MB.

There's a ton of graphics - device icons have dark and light versions, and file-type icons go extremely high resolution - about 25MB of graphics in all (maybe more - I didn't look too hard).

The iTunes help file alone is 24MB.

So, better code optimization could bring down the actual executable size, but is that important? For an app like iTunes, where 99% of what the application does is wait for the user to do something, would there be a noticeable benefit?

With the amount of processing power and memory available in modern systems, especially relative to their cost, is absolute code efficiency even that important? It certainly was when a megabyte of RAM cost thousands of dollars, but at pennies per megabyte, especially when the application isn't performance sensitive?

I'd say no.
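The price argument above can be put in rough numbers. The per-megabyte prices below are illustrative assumptions standing in for "thousands of dollars" and "pennies per megabyte" from the comment, not historical data:

```python
# Illustrative only: prices are assumptions, not historical figures.
executable_mb = 64          # iTunes executable + bundled libs, per the comment
old_price_per_mb = 2000.0   # assumed stand-in for "thousands of dollars" per MB
new_price_per_mb = 0.01     # assumed stand-in for "pennies per megabyte"

print(executable_mb * old_price_per_mb)            # 128000.0 -- $128k of RAM
print(round(executable_mb * new_price_per_mb, 2))  # 0.64 -- 64 cents
```

Under those assumptions, the same 64MB that would once have cost more than a house is now less than a dollar's worth of memory.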

Reply Parent Score: 5

RE[2]: nice
by JPollard on Sun 29th Dec 2013 16:44 in reply to "RE: nice"

You still pay for it... more paging activity, more I/O, slower startups...

And a LOT more error prone.

The problem is that mixing user interface and application logic makes for a very bug-hospitable environment.

And that translates into system vulnerabilities.

That is what modularity is supposed to address.

Unfortunately, throwing everything and the kitchen sink into an application makes for a poorly performing program.
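The modularity argued for above can be sketched in a few lines. This is a minimal illustration under assumed names (nothing here comes from a real application): the core logic is a pure function with no knowledge of the UI, so it can be tested and reused in isolation, while all I/O lives in a thin separate layer.

```python
def apply_discount(price, percent):
    # Core "application" logic: pure, UI-free, testable in isolation.
    if not 0 <= percent <= 100:
        raise ValueError("percent must be between 0 and 100")
    return price * (100 - percent) / 100

def main():
    # Thin "UI" layer: all user interaction and formatting lives here,
    # so presentation bugs cannot reach into the core logic.
    raw = input("Price and discount %: ")  # e.g. "80 25"
    price, percent = map(float, raw.split())
    print(f"Discounted price: {apply_discount(price, percent):.2f}")
```

Keeping the boundary this sharp is exactly what makes the core auditable; a vulnerability in the input-handling layer stays in that layer.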

Reply Parent Score: 2