GNU/Hurd is the original Free Software operating system started in the 1980s. Its microkernel design has been evolving over the years and the project has not quite hit mainstream use. I believe this is due to one main reason: the lack of drivers for peripherals and hardware. In this talk, I explain how NetBSD kernel drivers have been reused in a microkernel setting and demonstrate their use to boot up a GNU/Hurd system via a userspace rump disk driver, with a driverless Hurd kernel, gnumach. The ACPI management, PCI management, and actual driver are in separate processes with RPC interfaces between them, which separates their debugging, licensing concerns, and execution.
Hurd is a never-ending story, derailed by the massive popularity and uptake of the Linux kernel as the de facto standard kernel for the GNU project. I’d love for it to become more competitive, but the situation isn’t exactly looking great.
Minix 3 took the same path, though…
“GNU/Hurd is the original Free Software operating system started in the 1980s.”
I suppose we can let that statement stand if we assume that “Free” software started with the GNU Project and the GPL versus either Open Source or the age of software when everything was free by default.
It is worth noting, though, that the GNU Hurd kernel was started in 1990 and is still shooting for its 1.0 release. “Production ready” GNU still requires a non-Hurd kernel ( like Linux ). The first GPL kernel to become production ready was Linux, released in 1991. I will let others decide when Linux was production ready.
BSD Unix was started around 1974 or so and “released” in 1977. It is hard to say where the 1.0 equivalent of BSD appeared as ( like GNU ) it was an add-on for Version 6 Unix. That said, 3BSD, a full OS, was available by 1979 and BSD was largely free of AT&T code by 1991 with the Net/2 release. It was ported to x86 ( as 386BSD ) shortly after, and FreeBSD ( and to some extent Mac OS X ) are direct descendants of 386BSD.
There was never a time when software was “free by default”. In the early days of computers, software was practically non-existent. For the most part, you as an organisation had to write your own software for whatever machine you bought. Later on, there were OSes that were licensed or sold separately, but they were still not bundled by “default” with the machine. It wasn’t until the late ’70s and ’80s that OSes were “bundled” with hardware, such as AppleSoft BASIC, Microsoft BASIC, etc., but despite being free to the user, this was still very much proprietary software that was either developed in house or licensed from a 3rd party for a fee. Software has always cost money; it’s a misconception to say otherwise.
What I am referring to here specifically are the early days of AT&T UNIX.
Some people place the origins of UNIX as far back as the mid-’60s, but it was certainly around by 1970 and was in use around the US and as far away as Australia by 1974. Up until that point, university professors could ask for and receive the UNIX source code for free. According to Wikipedia, the first source code license for UNIX was not sold until 1975.
Also in 1975, Ken Thompson himself installed UNIX at the University of California, Berkeley while on sabbatical from Bell Labs ( AT&T ). Bill Joy began enhancing the system and released a version with his changes in 1977. The “Berkeley Distribution” has been under continuous development since and is best known today as FreeBSD. It is also in Mac OS X ( via NeXTSTEP ), and Bill Joy of course used BSD to create SunOS at Sun Microsystems.
AT&T agreed with your statement that software has always been under copyright and not free. After they started to charge for their own UNIX, they started to claim copyright over Berkeley installations as well. BSD first released a version without any AT&T code ( Net/2 ) in 1991. AT&T launched a lawsuit against BSD in 1992 that was not settled until 1994. This was the SCO Group vs Linux trial of its day. Without going into the details, AT&T lost and BSD continues to be available unencumbered.
I would say that BSD has flourished but the timing of the lawsuit coincided with the appearance of Linux in 1991. By the time the status of BSD was clarified, Linux had established itself as the “free” version of UNIX.
Anyway, I am not debating the legality of your statement. That said, the fact that the legal status of software written in the ’70s was not fully decided until the ’90s tells me that the common understanding back in the day was not so cut and dried. Lots of people involved in the early history of computers who would not have seen themselves as “pirates” according to our current understanding used, passed around, and modified software of all kinds.
A key individual who would agree wholeheartedly with your comment is Bill Gates, who wrote an “Open Letter” to computer hobbyists in 1976 accusing them of stealing and making his case for the harm being done:
http://www.blinkenlights.com/classiccmp/gateswhine.html
One of the people that disagreed with the Bill Gates world view was Jim Warren who said the following:
===============================================
There is a viable alternative to the problems raised by Bill Gates in his irate letter to computer hobbyists concerning “ripping off” software. When software is free, or so inexpensive that it’s easier to pay for it than to duplicate it, then it won’t be “stolen”.
==============================================
Jim Warren championed the Tiny BASIC project, which created a free replacement for the Altair BASIC that Bill Gates was looking to get paid for in his letter. The Tiny BASIC initiative is the origin of the term “copyleft” ( Li-Chen Wang’s Palo Alto Tiny BASIC carried the notice “COPYLEFT; ALL WRONGS RESERVED”, predating Richard Stallman and GNU by quite a bit ), and distribution of Tiny BASIC source code in written form led to the creation of Dr. Dobb’s Journal.
After the events above, the question of how “free” software was became more explicit and today we have the much better defined worlds of proprietary software, Free Software, and Open Source.
In the university space, OSes were free because students and faculty needed to research (“hack”) on them, but businesses had no such need. Much like an art student is more likely to use freely-licensed footage and images so that they can manipulate them as they see fit, while most media assets seen by the general public are proprietary.
In the business space, IBM was as proprietary as it gets, and before that, you had all kinds of proprietary mainframe OSes. Sure, the OS was provided free of charge with the machine, but Apple does the same today and so did SGI.
I agree with The123king, the whole “lost paradise” narrative doesn’t make sense. In fact, businesses have increased their use of open-source OSes since the IBM days and the SGI, Solaris and Windows NT workstation days.
I think my comments have been misunderstood. Even my original “free by default” comment was not meant to imply superiority or a “lost paradise” but rather the simple fact that, for a time, it was common to have source code for your software and essentially to have some of the “freedoms” that the Free Software Foundation espouses. I also believe that “software costs money” was not obvious even to many people that used software.
I think the big difference was that, as others have said, software of the time tended to be for specific hardware and very often for a specific customer. I am not saying that “licenses” did not exist, but licensing was not really the pervasive concept that it is now. Everybody understood that you had to pay teams of people to build big systems, but you were paying for the people. Everybody got that hardware was expensive ( very expensive ), but if it came with software ( even source code ), that could be considered part of the hardware purchase. There was no business model around the software.
There was a period of time ( less than a decade really ) when it first became common to take software written for one machine and move it to another and where software started to be “sold” on its own. UNIX was one of the first operating systems to do this. Microsoft BASIC is another example. That is when paying for a “license” to use the software first started to matter. As evidenced by the first UNIX code license not appearing until 1975 and the Bill Gates “open letter” ( and responses ), this was all kind of new thinking for many people ( even the copyright lawyers and judges ). The term “copyleft” appeared in response to Bill Gates asking for people to pay for his software. The Free Software and Open Source movements appeared when licensing software became a thing.
Well … there was. IBM’s OS/360 and VM families were in this category. The software was just considered something that enabled hardware sales. The OS, utilities, and the whole shebang were provided free and with source, which even generated patches from users / customers, and many of those got incorporated into IBM’s software. This situation continued until around 1979, through early versions of MVS, when IBM went “OCO”, that is, object code only, with copyrights and all else that goes with non-open software. However, the older free versions still remained orderable from IBM and you could get them for the price of a tape and shipping.

Then someone wrote an open source emulator for IBM hardware (called Hercules) around 1999, which generated a flurry of these orders. IBM fulfilled some of those and then put an end to it. The software and source is still available from hobbyist groups with no adverse legal action from IBM. People take that as proof of its being public domain.
Search “MVS TK4-” for one such distribution.
Still, that source code was always licensed by IBM – all rights reserved. In that sense, it was even more restrictive than – say – Microsoft’s shared-source licenses, which are available formally and under clear terms. And in any case, neither was/is open source.
And back then, the kind of large organizations that could buy a mainframe so they can request the source code were the same kind of organizations that can request Microsoft shared source today.
tl;dr: Microsoft’s shared source is what IBM was doing, but formalized.
No, that’s what was different in those days, despite the fact that it’s Big Bad Blue we are talking about. The software and source was not licensed: no copyright notices and no documents declaring it so. Someone buying the machine got binaries and source; there was no separate request for source. Even if you didn’t / couldn’t buy a machine, you could order the tape for a nominal fee and you would get it no questions asked, no documents to sign. This continued till around 2000 when IBM stopped accepting those orders; still, they did not ask anyone to return the material or destroy it, or serve any notice about what they could or could not do with it. Since the hardware emulator became available (~1999), folks have been tweaking, bug fixing, enhancing, and sharing this software.
To further clarify the situation I’ll mention a few things that changed after 1979.
* Firstly, source became unavailable to everyone, and I haven’t heard of ANY company obtaining it even under NDA. Some software vendors do get access to non-public APIs under NDA, but that’s about it.
* Even the binaries have copyright notices.
* You can’t just order it anymore.
* After around 2000 you can’t even run it on just any hardware. Some folks tried to offer a commercially supported emulator (TurboHercules) as an alternate platform to run z/OS; IBM promptly shut them down.
There are, however, some intriguing exceptions in all this – for example, a component of z/OS named JES2 still ships with source, and not because anyone asks for it; it just does. However, it’s plastered with copyright notices.
Moreover, it’s not that IBM suddenly realized all this in 1979, since they had licensed software even before that date. One example is CICS (an application server), which came out around 1971 and was licensed from day one; source was not available except to some third-party vendors who customized and supported it.
I don’t think Hurd would have done any better if Linux had never been started. We would probably be using BSDs instead.
Or Minix ?
Luke McCarthy,
It’s an interesting question. We tend to put so much importance on the winners, but someone would still have won even if today’s winners had lost. Take away linux and it would just be someone else. We can say this about microsoft, linux, apple, google, etc. Whether by luck, connections, good timing, differentiation, etc, they managed to pull ahead and become the dominant forces in their respective industries. I don’t think we can make the case that linux truly differentiated itself, because Linus literally set out to build a unix clone, but his timing was excellent: the lawsuit against BSD ran until 1994, and by then linux was already more popular. GNU hurd came about too late, and timing was everything. Breaking into an existing market requires a ton of resources, and GNU doesn’t have that. Hurd’s introduction 4 years earlier might have made a big difference, but that’s not what happened.
So if it wasn’t linux, my guess is that GNU hurd would still have ended up behind BSD all else being equal. Sometimes I think we would have been better off with BSD over linux, but it didn’t happen.
Sorry, I had not seen your comment when I left mine. Yours is excellent. It was such an interesting time in computers and the timing of the BSD lawsuit may have mattered a lot. In a different world, even WindowsNT ( released in 1993 ) could have been BSD based. There were already rumours Microsoft used the BSD networking code. Who knows how history may have gone.
tanishaj,
OS/2 was actually in the cards at one point. But microsoft’s success with windows led to the association dissolving.
I feel that alternative operating systems become much less competitive once OS bundling comes into play. Microsoft was forcing vendors to pay for windows licenses even for non-windows computers, and it totally gutted the business model for alternatives. Even with free operating systems, consumers were forced to pay full price. Microsoft would later be found guilty of antitrust abuses, but there was little consequence.
I do wonder about what might be different today if microsoft had been prohibited from interfering with markets from the beginning. Unix was more powerful than DOS ever was back in the day and IMHO the main advantage microsoft had in the early years was bundling. Microsoft thrived despite being technologically behind most of its competition.
I don’t know if you’re a developer, but they did a rough job of adding BSD socket support to windows. On unix the idea was that file descriptors should be unified: pipes, sockets, files, consoles, ttys, etc. That’s pretty fundamental, and the result is that you can use the same primitives across everything. Whereas on windows they added sockets as their own entities that are incompatible with everything else in win32 land. For example, if an application needs to wait on a window/console event AND a socket event, well, that’s unfortunate, because the win32 wait functions don’t support sockets and, conversely, the unix select functions don’t support win32 objects. It makes simple tasks like waiting on IO more difficult.
https://docs.microsoft.com/en-us/windows/win32/sync/wait-functions
I think these shortcomings stem from someone at MS making the decision to support sockets as a DLL instead of having a proper kernel API. The result is that the winsock API feels out of place for both unix and win32 developers.
They added more microsoft-isms with winsock2, like overlapped IO, but any code that uses it will be even less portable (maybe this was intentional).
I am glad Microsoft won (no matter what means they used), considering OS/2 was basically a clunkier alternative to Windows (it had a UI that hung if even a single application misbehaved, due to its synchronous input queue, it had installation problems even on some of IBM’s own systems, and it could barely print). It was also horribly expensive, because IBM wanted to impose an “OS/2 tax” on PC clones so that PC clone prices were brought in line with IBM’s own prices, and then there was the whole “Extended Edition” version that was exclusive to IBM computers.
Let’s sigh in relief that OS/2 bit the dust, the same sigh of relief reserved for Windows Phone’s failure (you know, that mobile OS that was butt-ugly even by “flat design” standards and managed to efficiently combine the worst trait of Android, lack of OS upgrades, with the worst trait of iOS, lack of sideloading).
OS/2 and Windows Phone are major examples of the free market taking out the rubbish. Stop reminiscing about them.
kurkosdr,
I disagree, OS/2 and windows were very closely related and OS/2 wasn’t worse than windows. People may not remember how flaky win3.1 and even win95 were.
This clip is funny because it happened to Bill Gates on stage, but the reality was that this wasn’t uncommon with windows – everyone was familiar with windows crashes.
http://www.youtube.com/watch?v=IW7Rqwwth84
IBM might have tried to make ownership more expensive, but then commodore, atari, and others could have filled the gaps, and the industry might have ended up being more competitive. Things can change quickly due to the butterfly effect.
Did you or anyone you know actually use a version/variant of Unix (say 386BSD) on an actual 386 system? I was wondering about the speed compared to the MS-DOS of the day. My only Unix experience back then was on big boxes, so I have nothing to compare. I believe you are correct that Unix had far more features & functionality, but back then, everyone was walking the line between features and speed.
@alfman
OS/2 has that synchronous input queue that froze the entire UI if a single app stopped responding. Sure, the OS underneath was technically still running, but you couldn’t do anything with it. Other problems with OS/2 were difficult installation and printing issues. All non-workstation OSes of the 90s sucked from a stability perspective, no surprise considering the memory they had to run on.
I know OS/2 might be better from a technical standpoint, but for most people it was “that clunkier alternative to Windows that IBM forces you to pay a ton of money for”.
And what makes you think Commodore and Atari wouldn’t become something like Apple eventually, taking advantage of their hardware-OS coupling to command high prices? They were some of the most evil companies out there.
kurkosdr,
What version are you referring to? Remember that win3.1 was non-preemptive and required processes to cooperate in order to not freeze other processes and the GUI.
I don’t think I asserted that. What I really think is that making predictions in a hypothetical past where things are different is just as hard as making predictions about the future…there’s no crystal ball for either. That said though more competition at all market segments is beneficial for consumers assuming it can be sustained.
The microsoft monopoly was very harmful, but in the absence of effective monopoly regulation, one might argue that monopolies are inevitable and that periods of healthy competition don’t last very long before being reduced to a couple of players controlling the entire market. Nevertheless, even if we assume that monopolies are inevitable in the US, I don’t think we can prove that microsoft necessarily became a “better monopoly” than atari, commodore, etc would have been hypothetically… we don’t really know, and microsoft was ruthless.
An interesting alternate history to be sure.
BSD was more advanced than Linux when Linux first appeared in 1991. If not for the AT&T lawsuit against BSD in 1992, it is a real question whether Linux would have become so popular. 386BSD ( now FreeBSD ) could easily have taken over the world much like Linux has done.
Even if BSD had been a runaway success in place of Linux, it is hard to know what would have happened with GNU/Hurd. You can make a good argument that the reason that Hurd has languished so long is that GNU did not really need it. Linux is GPL licensed and so, while “not invented here”, it is perfectly compatible with the Free Software Foundation’s goals ( ok, maybe not “perfectly”, as Linux is ok with binary blobs ). Richard Stallman could have put some of the energy he put into promoting GNU/Linux branding into finishing Hurd instead. It is hard to imagine that the Free Software Foundation would have been so relaxed about letting GNU exist exclusively as a userland on top of a BSD-licensed kernel that evil corporations could use and extend without sharing their changes. The drive to finish GNU would have been much greater given that FreeBSD is a complete OS without need of the GNU userland at all.
So, without Linux, I think what we would have seen was stronger competition between GNU, BSD, and proprietary UNIX. If BSD had won, would we even remember what GNU was? If BSD had won over System V sooner, might the proprietary UNIX vendors have re-based to BSD and offered value-added extensions for various niches? If that had happened, would they still be around instead of companies like Red Hat that formed around Linux? If GNU had delivered a kernel, would the pure GNU ecosystem have dominated like the GNU/Linux system has today and pushed BSD aside? The legal claims against BSD were settled way back in 1994, while the legal status of Linux was still being fought in court until 2010. With a clearer legal status, could BSD have made greater inroads against other operating systems? It is fun to think about.
You are missing the timelines and a lot of other stuff when you blame the failure of BSD to become more popular on that AT&T lawsuit.
The release 0.0 of 386BSD was in 1992 (a year after Linux) and was missing a truckload of features. Barely functional. Even still, release after release, it was a success. The 0.1 release had 250,000 downloads, a staggering number by 1992 standards.
It took another year to reach 1.0 in 1993… and then died.
By the 1.0 release, the x86 BSD community was engaged in a full-scale war against itself, with early versions of 386BSD only usable with “patchkits” whose maintainers didn’t get along with the 386BSD developers at all. It was clear that, despite their technical prowess, the people behind 386BSD lacked project management skills. And the community fragmented.
Even with the new FreeBSD project rushing out a “1.0” by the end of 1993, followed by the new NetBSD community doing the same in early 1994, it took time for these communities to organize themselves and establish clear direction and management beyond mere patchkits for 386BSD. And with that mess, lots of developers just migrated to Linux.
In a 1993 interview with Meta magazine, Linus Torvalds himself said: “If 386BSD had been available when I started on Linux, Linux would probably never have happened.”
The thing is: the GNU and Linux projects were more organized than the early x86 BSD community. A lot more organized.
IMO it was a fascinating evolutionary process that led to Linux in terms of software distribution, development, licensing, and management.
I view Torvalds as the love child of Joy (BSD was focused on technology development over management) and Stallman (GNU started as a management exercise in technology development). Torvalds is both technically brilliant and a successful manager.
BSD shot itself in the foot with RiH (Reinvented Here), coming up with multiple solutions to problems it had already solved. GNU did the same with NiH (Not Invented Here), ignoring other people’s solutions to the problems they were trying to solve.
Had BSD remained well managed and unified, they could have focused on bulletproofing themselves from AT&T. Had GNU just taken what they needed from Mach/BSD, they could have had a working kernel/driver/socket/userland complete OS system before Linux was even started.
Alas…
True, people moved to Linux in the early days of GNU not because it was superior, but because it worked.
The kind of issues the Hurd had were not trivial to solve; the guys behind the GNU project grossly underestimated the difficulty of several problems that pure micro-kernel architectures had, and whose “solutions” still existed purely in the theoretical realm in 1990.
By the early 21st century, while Linux was already burning the proprietary UNIX stronghold down, the Hurd was still struggling to properly boot a PC.
A world without Linux would most likely have resulted in the free software movement waning to a fraction of its current size. I don’t believe that the BSDs would have succeeded, because it’s too easy for corporations to simply freeload their code into proprietary products and not give anything back.
We would instead have a vigorous market of proprietary UNIX systems, a web probably dominated by Windows servers, and… Plan9 may have survived?
CapEnt,
Indeed, that’s the argument for the GPL. But there are many days I kind of feel the corporations that use linux are not giving anything back anyway. Linux succeeds largely because it has a massive community constantly working to support it in spite of the manufacturers not giving back. I think BSD could have succeeded given similar resources. The problem is that developers, including myself, lean towards the better supported platforms, creating a positive feedback loop that keeps shrinking the long tail and further compounding the lack of support for alternatives.
I do wish plan9 had worked out. It was a new take on what unix should have been given the benefit of hindsight. A lot of the shortcomings of POSIX were identified and addressed by the plan9 team. Ultimately they didn’t have any influence though.
In a surprise move NetBSD drivers are added to GNU/Hurd and relicensed under GPLv3. The NetBSD folks were “hurd” saying, “We were wrong in our licensing.”
No way they would do that. The free software crowd has standards.
There is a certain irony here in that the “GNU” Hurd wants to finish its operating system by importing BSD drivers. They can entertain using these drivers because NetBSD is “permissively licensed” ( vs “copyleft licensed” ), as per the presentation in the article.
BSD cannot reuse Hurd code in the same way, as the GPL prevents it ( see Linux and the drama around the CDDL-licensed ZFS file system, which the BSDs imported from OpenSolaris ). Luckily, the NetBSD licensing gives the FSF the “freedom” to use these drivers if they want.
The philosophical optics are a little dubious here. There is already lots of precedent, of course. After all, for many years one of the most critical elements of any “GNU”/Linux system was the MIT-licensed X Window System ( GUI ) and its many drivers. There are many other examples.