“Watch a Nasa shuttle burning a path into space or a video of Saturn’s rings taken by the Cassini satellite and it’s hard not to marvel at man’s technological prowess. But the surprising truth is that space exploration is built on IT which lags many years behind that found in today’s consumer gadgets and corporate PCs. To this day, Nasa still uses elements of technology that powered the moon landings of the 1960s and 1970s, while the International Space Station – the manned station circling the Earth 250 miles above our heads – relies on processors dating back more than two decades.”
Probably the biggest reason for this older technology is background radiation. Space shuttles don’t have the luxury of two miles of atmospheric radiation shielding (one mile above Antarctica and New York City), and with that much more radiation hitting a chip, modern processors are far more likely to exhibit “schrödinbugs”: random bit flips known as single-event upsets. Not something you want on the chip you absolutely must have working when you land your ship.
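To make the failure mode concrete: a single-event upset is just one toggled bit, but one toggled bit in the wrong place can wreck a value completely. A minimal C sketch, with a made-up sensor reading and an arbitrarily chosen bit:

#include <stdint.h>
#include <stdio.h>
#include <string.h>

int main(void)
{
    double altitude = 1250.0;  /* hypothetical sensor reading, in metres */
    uint64_t bits;
    memcpy(&bits, &altitude, sizeof bits);

    bits ^= 1ull << 62;        /* a "cosmic ray" flips one exponent bit */

    memcpy(&altitude, &bits, sizeof bits);
    printf("altitude after one flipped bit: %g\n", altitude);  /* ~7e-306 */
    return 0;
}

One upset in the exponent field and a 1250-metre reading collapses to effectively zero, with no error signalled anywhere.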
(If I felt like writing some alt-hist, I’d write the story of how cheap space travel made Linux dominant, because Windows and OS X couldn’t be back-ported to the 486s and lower that spaceships run on. I doubt I’m the right person to write the great open-source space opera, however.)
Well, half the article was about ground station upgrades. You can only really blame that on the good old institutional inertia seen in many other government agencies.
It probably has more to do with centralized funding.
The agency gets money to build some equipment, usually a lot more than it really needs, builds it, and once it works the job is finished. No amount of potential improvement will convince the government to fork over additional money. So the agencies focus on developing new techniques, equipment etc. and neglect maintenance or improvement of what they already have. After 20–30 years no one knows the design details of that ancient stuff anyway, and even if they did, they would be better off designing it from scratch (which they will do at some point, as part of another brand-new, big-bucks project).
As for the technical issues of space applications, there is more to them than just radiation.
The largest one is not really technical but legal – chip qualification. The agency has to verify that a chip is robust enough to be used in spaceflight, and the easiest way to do that is to use the same chip that was used before. Manufacturers don’t make this job any easier by disclaiming all responsibility for non-standard applications (medical, military, space etc.).
Technical issues are generally related to reliability over extended periods of time. On one hand, new processes are inherently less reliable: manufacturing defects are more common, doping profiles diffuse away faster, metal connections are more prone to electromigration, and transistors to hot-carrier effects. On the other hand, these issues receive a great deal of attention nowadays, with (semi-)automatic verification, testing etc. applied as part of standard development flows. However, typical modern devices are usually verified (using models extracted from forced aging) for roughly 10 years of continuous operation. Automotive, medical and other special applications go a bit further, but they usually lag four or more process nodes behind – that’s the time it takes to refine a process, characterize it from a reliability point of view, and develop the necessary tools and libraries for designing such chips.
Another issue is complexity. If our application requires a powerful CPU and millions of lines of code, and we have no time for testing several chips, we simply shouldn’t expect much.
On top of that comes radiation, which indeed may cause spurious computation errors or trigger latch-up. It’s not an unsolvable problem, but it is quite hard to test for and affects such a small number of applications that it gets very little attention from chip manufacturers. There are circuit- and system-level techniques to beat it, but we still have to solve the other issues.
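To give a flavour of those system-level techniques, here is a minimal C sketch of triple modular redundancy (TMR): keep three copies of a value and take a bitwise majority vote, so that a single upset in any one copy is masked. This is just an illustration of the general idea, not any particular flight implementation:

#include <stdint.h>
#include <stdio.h>

/* Per-bit majority vote over three redundant copies: each result bit
 * takes the value that at least two of the three copies agree on, so a
 * single radiation-induced flip in any one copy is masked. */
static uint32_t tmr_vote(uint32_t a, uint32_t b, uint32_t c)
{
    return (a & b) | (a & c) | (b & c);
}

int main(void)
{
    uint32_t value = 0xCAFEBABE;
    uint32_t copy1 = value, copy2 = value, copy3 = value;

    copy2 ^= 1u << 7;  /* simulate an SEU flipping bit 7 of one copy */

    printf("voted: 0x%08X (corrupted copy: 0x%08X)\n",
           (unsigned)tmr_vote(copy1, copy2, copy3), (unsigned)copy2);
    return 0;
}

Fault-tolerant parts like the LEON3FT mentioned elsewhere in this thread apply this sort of redundancy in hardware, at the register and memory level.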
Well, I suppose that depends on who you buy from. You can avoid this and get guarantees from the manufacturer. Of course, you pay for it.
LEON3FT 32-bit SPARC processors have redundancy and radiation tolerance built in… open-source hardware design FTW…
Technically the latest LEON4 isn’t open yet; hopefully it will be, though. The LEON4 runs at 200 MHz on FPGA development boards and up to 1.5 GHz if ASIC tech were used. NASA should definitely throw a few million at this to get the newer designs rad-tolerant and implemented in ASIC tech – a much better use of money than the things they usually throw billions at. The ESA has already contracted for the LEON3FT… I don’t see why NASA shouldn’t pitch in a bit here too.
Yeah, we use a lot of the LEONs (Germany). Right now I’m developing on a LEON3FT. OK, the FPGA runs at around 20 MHz, which is very slow compared to a desktop processor, but with the amount of exploitable parallelism we can achieve very high throughput.
NASA chose Linux to run the control computers for the New Horizons probe (Pluto mission) for exactly that reason: given that their control software would have to communicate with a space probe 9+ years after launch, they needed some assurance that the software would still run on the supercomputers of the distant future. With Linux, they can tinker with any and every aspect of the OS to make sure everything still works. When New Horizons was launched in 2006, we were still on GTK+ 2.8, Qt 3.3 and Firefox 1.5; everyone was still using Pentium 4s (or equivalent); Windows Vista hadn’t been released; and the iPhone and Android didn’t exist yet.
On the other hand, NASA is going to have these problems, because they’re still supporting spacecraft at least as old as the Voyager probes (launched in 1977), with computers equivalent to an Atari 2600 and so little memory that they have to be reprogrammed remotely nearly any time they’re asked to do something different, AS WELL AS stuff like New Horizons that can basically fly itself, only phoning home at prearranged check-in times or when it has problems. (There have been large improvements in spacecraft software in the last 30 years, though not as big as in the rest of the technology sector.)
I quit reading the article on page 2, because I suddenly got so tired of web pages being so bloated compared to the actual content I’m reading. Some people measure the content-to-screen-space ratio; I just happened to have activated the “Info” panel in Opera, and I saw these size figures:
page 1: main page: 11056 bytes, inline elements: 690770 bytes
page 2: main page: 11121 bytes, inline elements: 594014 bytes
That’s too much for 30 lines of text. With images that are worthless in regard to the topic at hand.
Second thing: why can’t the whole article be on a single page? I find it impractical that articles are chopped up in that manner.
I know this rant has very little to do with the actual article, but I’ve been noticing that the web is getting handled the way software has been handled in recent years: sloppiness seems to have become the standard. So much faster connections, and yet I don’t see much improvement; so much faster CPUs, and yet no real benefit. I didn’t even bother to launch Dragonfly, but I wouldn’t be surprised if, in addition to weighing too much for the content that’s useful to me, the pages were tagged with no-cache. Can someone more enlightened than me explain what is going on? Has anyone else noticed the same trends?
Yeah, software bloat should be punishable by the death penalty, but well… it’d be hard to apply that sentence in practice in a world where doing things profitably is more important than doing them right.
Seconded. I don’t even bother reading articles that are chopped up that way anymore. What’s the point of it? More clicks, more page loads, and thus more hits or something? There’s no reason for it; this is the web, not a damned print magazine. As for the inline elements, yeah, the situation is bad. How I deal with it: turn on NoScript, Adblock Plus, Ghostery (which can block inline frames) and Flashblock, and turn auto-loading of images off. Then if I need access to any of the above I can have it; otherwise I don’t waste time downloading or rendering it. This is the prime reason that, even though Chrome is based on the faster WebKit engine, I’ve stuck with Firefox myself. Without these addons, the web has become so cluttered that it’s almost physically painful.
An easy way around that is to click the “print article” button most sites have. It formats the entire article on one page, usually with a 90% reduction in advertising (or a complete one).
It’s the only way to go…
Thanks for the tip!
I find the last quote of the article offensive.
“Right now it takes a dozen and a half people to run the building and all of its systems – our vision is that one day we could control the entire building with two people.”
IMO 20 or so people is already understaffed for an organization as large as NASA… 2 people probably isn’t even enough to maintain all the hardware, let alone make sure the system is secure.
They mentioned how hard it is to find Fortran programmers etc… That’s because they aren’t planning ahead and hiring software/computer engineers who at least have an inclination to learn languages that are no longer commonly used commercially or taught in schools… if you don’t train good programmers, you’ll never have them.
Maybe this is what you meant, but I would say that instead of looking for Fortran programmers, they should look for good programmers who are willing and able to learn Fortran. I know that there are unemployed C and COBOL code jockeys who would gladly take the money.
This, and it also demonstrates the generational effects of computing… I’m in the sciences and MANY of our programs are written in FORTRAN-77 (and many of the professors only know FORTRAN-77, and possibly RSI IDL) simply because FORTRAN-77 was far and away the most useful language for computational science back when computers became massively available to scientists (much in the way COBOL attracted the business community).
Times have changed and languages have improved, but FORTRAN-77 continues to be used because that’s what people know, and since we’re in a business where your answers matter more than the programming language you used, there’s not much incentive to retrain everyone in C/C++/Java/Python/IDL/etc. The fact that GCC deprecated its G77 compiler has caused a bit of grief, though… That, and I have a hard time because I don’t know FORTRAN-77.
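For what it’s worth, the G77 grief is usually survivable: gfortran compiles most FORTRAN-77, and mixing legacy Fortran with new C code mostly comes down to two conventions – external names get a trailing underscore, and all arguments are passed by reference. A rough sketch (the ADDVEC routine is made up, and name-mangling details vary by compiler and platform):

/* The legacy routine we want to keep using, in legacy.f (FORTRAN 77):
 *
 *       SUBROUTINE ADDVEC(A, B, C, N)
 *       INTEGER N, I
 *       DOUBLE PRECISION A(N), B(N), C(N)
 *       DO 10 I = 1, N
 *   10  C(I) = A(I) + B(I)
 *       END
 */
#include <stdio.h>

/* gfortran, like g77, appends an underscore to external names and
 * passes every argument by reference, hence the pointer-only signature. */
extern void addvec_(double *a, double *b, double *c, int *n);

int main(void)
{
    double a[3] = {1, 2, 3}, b[3] = {4, 5, 6}, c[3];
    int n = 3;

    addvec_(a, b, c, &n);
    printf("%g %g %g\n", c[0], c[1], c[2]);  /* 5 7 9 */
    return 0;
}

/* Build: gfortran -c legacy.f && cc main.c legacy.o -lgfortran */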
Heck, they could call me! I started with FORTRAN, Pascal, COBOL and BASIC. And some assembler. That would make for a fun job…
http://www.silicon.com/i/s4/illo/photos/2010/August/Nasa+Esa/610_es…
He has a cup full of coffee next to mission-critical computer systems, and it looks like he’s having a nap in his chair while his coworker to the left is frantically typing something on the keyboard (probably trying to land Ariane 5 Flight 501).
Wow, you were able to get all that from a single picture of a situation in a field you have no contact with or knowledge of?
This has to change.
Using outdated technology is simply not going to cut it when we run into the zerg in a few hundred years.
Yes but look how outdated technology saved the Galactica!
Yes, well, having watched various technology forums (especially Android phone forums, as of late), I’m amused at the people griping about software bugs. Yeah, we have our nice shiny fancy equipment, but there are always software flaws. That’s where the “Do not use in mission-critical applications” line in EULAs comes from.
Meanwhile, when Voyager 2 started returning gibberish to Earth, some guy concluded an alien intelligence was responsible (it was probably a stray cosmic ray that flipped a bit in the wrong place).
http://www.digitaljournal.com/article/291951
I won’t claim that was a REASONABLE conclusion, but you rarely see such claims being made about our own personal computers.
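Flipped bits are, incidentally, exactly what spacecraft software guards against with checksums and periodic memory scrubbing: recompute a checksum over a region of memory, compare it with the stored one, and treat any mismatch as corruption. A toy C sketch of the idea (not anyone’s actual flight code; real systems use CRCs or hardware ECC):

#include <stdint.h>
#include <stdio.h>
#include <string.h>

/* Toy 32-bit checksum over a memory region. The principle is what
 * matters: a recomputed checksum that no longer matches the stored one
 * means something in memory changed behind the program's back. */
static uint32_t checksum(const uint8_t *buf, size_t len)
{
    uint32_t sum = 0;
    for (size_t i = 0; i < len; i++)
        sum = sum * 31u + buf[i];
    return sum;
}

int main(void)
{
    uint8_t memory[64];
    memset(memory, 0xAB, sizeof memory);

    uint32_t stored = checksum(memory, sizeof memory);

    memory[17] ^= 1u << 3;  /* a stray cosmic ray flips one bit */

    if (checksum(memory, sizeof memory) != stored)
        puts("scrub pass: corruption detected, repair or retransmit");
    return 0;
}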
I am pretty sure that they are still using Amigas!
See: http://www.upchug.com/HalInterview-eng.html
I thought that my life as a programmer was over, but I can see that I can make a comeback!!! Fortran 77, Pascal, COBOL… ancient languages are still used. My GOD!