Linked by Thom Holwerda on Sat 15th May 2010 08:49 UTC, submitted by kragil
Amiga & AROS A few weeks ago, Novell and Red Hat jointly fended off a patent infringement suit thrown their way by a patent troll. The patent in question more or less came down to the concept of virtual desktops - and thanks to Groklaw, several people helped in finding cases of prior art. The most interesting one of all? A carefully restored and working Amiga 1000 demonstrated to the judge and jury.
Thread beginning with comment 424672
henrikmk
Member since:
2005-07-10

I had to play around with my system files and do hardcore troubleshooting all the time just to accomplish the most trivial things. What you describe is some kind of ideal situation that never held true in practice.


The thing that kind of "ruined" the classic Amiga design was the necessity of slapping a new graphics system on top of the existing graphics libraries, what we called retargetable graphics (RTG). There were several different systems for this, and it was a mess. A similar situation occurred with audio and with the rapid arrival of CD-ROMs and hard disks.

Had Commodore managed to do RTG themselves for the OS3.2/4.0 plans back then, only the CPU transition to PPC would have been problematic. Dealing with RTG could indeed be troublesome, but it was at least fixable by hand. I wouldn't dare to do the same on a Linux system.

As for the remainder of the system: if there were program problems, you fired up SnoopDOS. It was a simple program capable of probing what the system was doing, so you could very quickly determine what the problem was when a program wouldn't start and gave no error feedback.

And yes, you could then indeed copy new libraries or drivers into your library directory by hand, and the program would then work, all without rebooting or requiring special maintenance, sanity-check, or packager tools.

The OS just honestly checked for the existence of the files and didn't have to speak to Mr. Registry or update some system database to reflect the system state using recovery consoles or emergency tools. It didn't need to sync anything or check anything for corruption above the file system level. It just took the files it could see and ran them. If it couldn't load a file, there was a failure, usually detectable by SnoopDOS. Simple.

The third party tools that existed were mostly for disk damage recovery.

You could in fact build your own special AmigaOS disks from scratch for special purposes in a couple of hours, just using the CLI or Workbench or a third party file manager like Directory Opus and then build your own startup sequence to see if you could optimize boot time. I did this a lot.
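For illustration, a minimal startup sequence for such a stripped-down boot floppy might look like the sketch below. The command names are from stock AmigaDOS; the exact set and their options vary by OS version, so treat this as an assumption-laden example rather than a canonical script.

```
; S:Startup-Sequence for a minimal custom boot floppy
C:SetPatch QUIET           ; apply ROM fixes early
FailAt 21                  ; don't abort the script on mere warnings
C:MakeDir RAM:T RAM:ENV    ; scratch directories in RAM
C:Assign >NIL: T: RAM:T
C:Assign >NIL: ENV: RAM:ENV
C:LoadWB                   ; start Workbench
EndCLI >NIL:               ; close the boot shell
```

Trimming lines like these, and reordering what remained, was exactly the kind of boot-time optimization you could experiment with by hand.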

That, I have yet to see in any other modern operating system.

This is because AmigaOS 3.x and below is really simple; from a modern perspective, learning every corner of it takes a few days.

Of course, if you were to build a new OS, you would not model it directly on AmigaOS, because it's an insecure single-user operating system. But some of its principles of simplicity are a fantastic inspiration for systems that are yet to be seen. We really need them to show up.

Reply Parent Score: 5

viton Member since:
2005-08-09

A multi-user OS is a kind of overkill. Big white-box PCs are going out of fashion; people prefer personal (single-user) laptops and netbooks nowadays.

Reply Parent Score: 2

Downix Member since:
2007-08-21


Had Commodore managed to do RTG themselves for the OS3.2/4.0 plans back then, only the CPU transition to PPC would have been problematic. Dealing with RTG could indeed be troublesome, but it was at least fixable by hand. I wouldn't dare to do the same on a Linux system.

Except, of course, Commodore was not going to PowerPC at all. They were going to PA-RISC instead, and it was the collapse of Commodore that led HP to take its next-generation PA-RISC technology to Intel, which turned it into Itanium. Unlike Intel, which viewed the technology as a way into the big-iron market, Commodore saw it as a way to recapture the glory it had with the 6502: a solid, cheap 32/64-bit CPU that could run in anything, much as ARM later became.

While the "going PPC" route did come to be, it was not what Commodore had planned; it wasn't even on their radar. Where they were going was far more interesting than that.

Reply Parent Score: 2

viton Member since:
2005-08-09

it was the collapse of Commodore which led HP to instead take their next-gen PA-RISC technology and bring it to Intel, which turned it into Itanium.


You're going a bit wild here.
I doubt C= had anything to do with PA-RISC's fate.
And Itanium is not next-gen PA-RISC technology. Not even close.

Reply Parent Score: 2