Linked by Thom Holwerda on Mon 27th Sep 2010 16:55 UTC
Geek stuff, sci-fi... "Watch a Nasa shuttle burning a path into space or a video of Saturn's rings taken by the Cassini satellite and it's hard not to marvel at man's technological prowess. But the surprising truth is that space exploration is built on IT which lags many years behind that found in today's consumer gadgets and corporate PCs. To this day, Nasa still uses elements of technology that powered the moon landings of the 1960s and 1970s, while the International Space Station - the manned station circling the Earth 250 miles above our heads - relies on processors dating back more than two decades."
Thread beginning with comment 442831
RE: Radiation...
by braddock on Mon 27th Sep 2010 17:46 UTC in reply to "Radiation..."

Well, half the article was about ground station upgrades. That can really only be blamed on the good old government institutional inertia seen in many other agencies.

Reply Parent Score: 1

RE[2]: Radiation...
by ndrw on Tue 28th Sep 2010 03:18 in reply to "RE: Radiation..."

It probably has more to do with centralized funding.

The agency gets money for building some equipment (usually a lot more than it really needs), builds it, and once it works, its job is finished. No amount of potential improvement will convince the government to fork over additional money. So the agencies focus on developing new techniques, equipment, etc., and neglect maintaining or improving what they already have. After 20-30 years no one knows the design details of that ancient stuff anyway, and even if they did, they would be better off designing it from scratch (which they will do at some point, as part of another brand-new, big-bucks project).

Reply Parent Score: 1

RE[3]: Radiation...
by ndrw on Tue 28th Sep 2010 03:58 in reply to "RE[2]: Radiation..."

As for the technical issues of space applications, there is more to them than just radiation.

The largest one is not really technical but legal: chip qualification. The agency has to verify that a chip is robust enough to be used in a space flight, and the easiest way to do that is to use the same chip that was used before. Manufacturers don't make this job any easier by disclaiming any responsibility for non-standard applications (medical, military, space, etc.).

The technical issues are generally related to reliability over extended periods of time. On one hand, new processes are inherently less reliable (manufacturing defects are more common, doping profiles diffuse away faster, metal interconnects are more prone to electromigration, and transistors to hot-carrier effects); on the other hand, these issues receive a great deal of attention nowadays, with (semi-)automatic verification, testing, etc. applied as part of standard development flows. However, typical modern devices are usually verified (using models extracted from forced aging) for only ~10 years of continuous operation. Automotive, medical and other special applications go a bit further, but they usually lag four or more process nodes behind (that's the time it takes to refine a process, characterize its reliability, and develop the tools and libraries needed to design such chips).

Another issue is complexity. If our application requires a powerful CPU and millions of lines of code, and we have no time to test several candidate chips, we simply shouldn't expect much.

On top of that comes radiation, which can indeed cause spurious computation errors or trigger latch-up. It's not an unsolvable problem, but it is quite hard to test for, and it affects such a small number of applications that it gets very little attention from chip manufacturers. There are circuit- and system-level techniques to beat it, but we still have to solve the other issues as well.

Reply Parent Score: 4