This site calls itself ‘the biggest free abandonware downloads collection in the universe’. No idea if that’s true or not, but all I can say is that I spent a lot – a lot – of time today browsing through the incredibly extensive collection of old operating systems. From an alpha release of Windows 1.0 to NEXTSTEP, this site has it all.
Great for emulators.
Gives me an excuse to write a floppy and test this old Kaypro II sitting in the corner.
Yeah, neat stuff.
Before you do, given the conversation below you…
http://youtu.be/up863eQKGUI
I’m surprised to see even software from the GDR, such as DCP (a DOS clone). Even though DCP and SCP (a CP/M clone) were “quite standard” and not very “special” even from today’s standpoint, it’s nice to see them in a non-German collection. Still, I’m missing the more “exotic” operating systems such as MUTOS, WEGA, or OS/ES. It’s probably nearly impossible to get those off the original media and into a form fitting the Internet age… 🙂
For those operating systems that were distributed in source form (or came with sources), such an archive would be even more interesting!
But just imagine what could be possible: running OS/ES with Hercules, or SVP on SimH, in emulated environments. And I assume the 8-bit and 16-bit “PC-suitable” operating systems such as SCP, DCP, versions of MUTOS, and maybe even SIOS can be run on PC virtualisation systems such as VirtualBox, Bochs, or VMware.
Of course I downloaded DCP so I can write it to 5.25″ floppies (yes, I actually can do this) and try it on my Robotron EC1834 and A7150 (CM1910) to see if it still works. If not, I still have “sufficiently slow” Western PC hardware which should also be able to run it. That’s a nice occasion for some time travel. 🙂
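And if the real hardware doesn’t cooperate, emulation is the fallback. Here’s a minimal sketch of booting a floppy image like that under QEMU; the image file name is just an assumption for illustration:

    # Hypothetical sketch: boot a DCP floppy image under QEMU.
    # "dcp.img" is an assumed name for the image downloaded from the site.
    import subprocess

    subprocess.run([
        "qemu-system-i386",   # emulate a generic i386 PC
        "-fda", "dcp.img",    # attach the image as floppy drive A:
        "-boot", "a",         # boot from the floppy
    ])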
Where would you even FIND the vast majority of that stuff to archive it? Just on the first page there’s Seattle System 86-DOS, Janus for the Amiga (Bridgeboard), BeOS R4…
Has anyone even verified that any of this stuff is legit? I have my doubts.
Legal? Sketchy, but for a lot of these OSes the copyright holder is either no longer around (e.g. Be Inc.) or unlikely to care (Win 1.0).
Oh, I’m not even questioning the legality of most of it: they’re hosting stuff from Microsoft, for a start. A lot of it is obvious copyright infringement. I’m just wondering if most of it is actually what they claim it is. When did you last see a copy of Seattle’s 86-DOS, and when did you think, “Hey, I should upload that to a website no one has heard of before!”?
If it turns out to be legit then that’s amazing and awesome, but if you want a copy, I’d mirror it as quickly as possible. I’d still want to know where they sourced a lot of this stuff, though, because that would be even more interesting!
I’ve been using it for ages. My DOS disks came from there
I was happy to see Ultimate DOOM 1.9 on there. My sister bought that for me for my birthday back in 1995. The original floppies are unreadable today (possibly a quirk of my USB floppy drive), and I had a copy of them in .img format but it mysteriously disappeared from my backups long ago. This site just made it possible for me to get them back. Woohoo!!
The copyright holder may not be around, but the patents for a lot of this software were sold. It’s obviously a legal grey area, and the fact that you’re supposed to register to download is also not cool.
I’m glad someone is archiving this stuff, but I think it could be handled a lot better.
Is all this stuff legal to download?
I see MS-DOS 6.22, Breadbox, and some database products that make me think these downloads are not all legal.
I would not use this site without some sort of verification of its legality.
No.
Is anyone likely to care? No.
I don’t know if they’re on here, but there are mouse/network/scsi/video drivers available to run OpenSTEP in VMware.
The drivers also work for early x86 builds of Rhapsody
This page has a link (and instructions):
http://www.zebpedersen.co.uk/?p=1118
I used to pillage this site all the time. My account still works too ^_^
Oh, that brings back memories. At one of the shops I worked at we ALWAYS kept 98Lite and Norton Utilities along with Roxio GoBack as our 3 Win98 “must have” tools… sigh, the days when Norton was good, when hot-rodding the OS was considered normal.
Although I have to say that while I may wax nostalgic about stripping OSes for speed, I don’t think I’d like to go back; having 6 cores and oodles of RAM is just too nice. It’s just amazing to see how far we have come, when my $100 GPU card has more memory and processing speed than my first 7 computers put together!
It just blows my mind that a $20 MP3 player has several times the power that my old VIC-20 had, when my VIC with all the add-ons probably cost me $500… what would that be today, a couple grand?
At the same time, I marvel, sort of in disgust, that we used to be able to run NeXTSTEP on a 25MHz machine with 20MB of RAM and a 200MB disk, and yet we find it difficult to squeeze an OS onto a 700MHz Raspberry Pi with 128MB of RAM.
Yeah, right. Of course NeXTStep and old 25MHz hardware could handle complex 3D rendering, multiple CPU cores, gigabytes of memory, hundreds of system services and thousands of threads running at the same time, multiple 4K displays, terabytes of data, real-time high-resolution multimedia processing, user interfaces with single icons more detailed than the old machine’s entire screen resolution, etc.
For their time, yes they could. John Carmack wrote Doom and Wolfenstein 3D on a NeXT Cube, and they were by far the most advanced games of their time.
Wait, you’re seriously faulting an OS made in the early 90s for not being able to handle tech that has only been around for the past 10 years or so? What are you smoking?
They used NeXT to do some development/level prototyping work. But saying those 2 games were done on NeXT exclusively is pushing it IMO.
But honestly, for most intents and purposes a 25MHz NeXTstation with 20MB of RAM was BARELY usable. People put up with the lack of performance because the development environment NeXT was offering was that good (ironically, that situation reversed with OS X).
IMO when some people look back, they do so with very selective memories. E.g. there is a reason why NeXT had to get out of the HW business: their systems sucked, performance-wise.
“We wrote all of DOOM and Quake’s code on NeXTSTEP. We debugged the code in NeXTSTEP with DOOM and Quake’s 320×200 VGA screen drawing in a little Interceptor window while the rest of the screen was used for debugging code. When all the code ran without bugs we cross-compiled it for the Intel processor on NeXTSTEP then turned over to our Intel DOS computers, copied the EXE and just ran the game. The DOS4GW DOS-Extender loaded up and the game ran. It was that easy.”[1]
It seems you’re the one with the selective memory in this case.
[1] http://goo.gl/TQTwbO
(EDIT: fixed broken long link with shortened URL)
This is one of the most interesting OSNews comments EVER! Cheers!
Absolutely, my memory is indeed selective/incomplete in this case, so I stand corrected. BTW, the sound engine was done on the Amiga, and the actual optimization of the engine was done using Intel/Watcom compilers on DOS. So I’ll have to disagree with the definition of “entirely developed on”; to each their own, I guess.
In any case, the DOOM version for NeXT is feature-incomplete and ran like molasses on said 68K NeXT HW. Which was the point I was trying to make: NeXTStep as a development environment was fantastic (for the time), but the HW it originally ran on was pants, performance-wise.
It also ran like molasses on 68k Macs of that era; my girlfriend in high school had one and watching her play it was like watching a slideshow. My point being, you didn’t spend $11,000 on a NeXT machine to play games, they were called “workstations” for a good reason.
Right, but even as workstations, performing any complex work on them was painful. Stuff that we take for granted now was far from trivial then.
In any case, the point I’m trying to convey is that things have gotten faster, not slower. And although they share a lot of pedigree, comparing an old version of NeXTStep with modern OS X, for example, is not quite proper… because a modern OS/system is doing orders of magnitude more stuff than those old NS/68K systems.
Wait, when did I ever do that? I compared it to a 68k Mac from the same era, not a modern OS X system. Now you’re just putting words in my mouth, which is just about the scummiest way to “win” a discussion.
Calm down, sparkie. Where exactly was I implying you had said that? I was simply referring to the post (which wasn’t yours, I believe) which sparked this subthread. Don’t let projection get the best of you…
Fair enough, but you were actually the only person to mention OS X (twice) in the thread. I think what started it was someone comparing the NeXT Cube to a Raspberry Pi, then someone else waxed satirical about the NeXT Cube being able to do 3D rendering, which is when I poked my nose in to affirm that it could. It was all downhill from there.
Sorry if I stepped on any toes, this just happens to be an area I’ve had a ton of interest in over the years.
I wasn’t complaining that you need gigabytes of hard disk space to fit a whole OS.
Crack, sometimes weed. The best are in Liberty City. Sometimes I’m a gun for hire.
Pffffftt.
Back in my day…
PCs had 4MB of RAM and maybe a whopping 40MB of hard disk. That was good enough for Wing Commander (after you’d spent a day freeing up low memory).
Much better days, I think we can all agree. Am I right?
You were lucky! I had a 1MB, 8MHz machine with no hard drive; I finally got the money for an external 20MB HD, and was later able to upgrade that to 60MB. I miss the days of assembly language on a Motorola 68k.
You were lucky, friend; I was stuck on Commodore cassette drives for the better part of the 80s. Do you have ANY idea how fricking SLOW and unreliable those things were?
The others can talk about how much better things were “back in the day” but give me my hexacore with 3TB of space, 8GB of RAM with another GB on the GPU, thanks ever so.
Oh, I was there for the Commodore tapes too and even worse, the ZX Spectrum tapes (30 minutes to load Valhalla? Sure!).
The Commodore cassette interface was slower. The C64 was so slow that a few titles had boot games – games you could play whilst the real game was loading (pretty neat, really); one I remember seeing was on the budget re-release of Ghostbusters.
I dunno, the Speccy felt a lot slower. I can’t remember anything on the C64 that was as slow to load as Valhalla on the Speccy.
Maybe it’s because most C64 games had turbo loaders.
Dude, the C64 was luxury; where I lived it was VIC all the way down, and if you think the Speccy and C64 were slow you ain’t seen nothing until you’ve loaded a game via cassette on the VIC. I remember I used to start a game loading, go have lunch, and when I came back, if I was REALLY lucky, my game would be ready to play 🙂
Kids today don’t know how good they have it. When I hear somebody whine because their quad core takes a whole 3 minutes to load Windows 7, oh how I want to force them to type in a program on a VIC, or let them experience the “pleasure” of firing up Win95 on a 486 with 4MB of RAM.
Oh, I’ve used the VIC-20 too. Bomber Run FTW.
Of the games I can recall right now (Gridrunner, Jumpman, Bomber Run) none took nearly as long as Valhalla on the Speccy though.
bassbeast,
While I can appreciate the sentiment, I think this timing is exaggerated (where did you get 3 minutes from for a modern computer?). It seems that for all the drastic performance bumps we’ve had over the years, they’ve gone in lockstep with overhead; the end result is surprisingly similar startup times.
Win95
source: https://www.youtube.com/watch?v=Z3ApDgL7DFY
BIOS – 7s
Windows – 41s

Win7
source: my dual-core i3 2.4GHz laptop
BIOS – 7s
Windows – 41s (no, not a mistake)

Win3.1
source: https://www.youtube.com/watch?v=hSJDIGiepgU
BIOS – 28s
Windows – 39s

Win3.1 in Virtual PC
source: https://www.youtube.com/watch?v=CWh3k0ChdEY
BIOS – 3s
Windows – 9s

Win7
source: https://www.youtube.com/watch?v=-sA0zmhW7Fg
BIOS – n/a
Windows + SSD – 15s
Windows + HDD (WD Black) – 29s
Of course bootup is held back by disk seek times, but even with the SSD (the Patriot is spec’d at 275MB/s), it’s had time to load 4GB in those 15s. It seems there are still ghosts in the bootup sequence causing unnecessary delays.
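(For the curious, that 4GB figure is just the spec sheet multiplied out; a trivial sanity check:)

    # Back-of-envelope check: data a 275MB/s SSD could stream
    # sequentially during a 15-second boot.
    seq_read_mb_per_s = 275   # the Patriot SSD's rated sequential read
    boot_seconds = 15
    print(seq_read_mb_per_s * boot_seconds / 1024)  # ~4.0 (GB)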
Dude, you’ve never worked in a PC shop, have you? Trust me, by the time they load the machine up to the gills with startup crap, everything from Java updater to Bing Desktop to Ask toolbars, if it boots in under 5 minutes it’s a fricking miracle.
And I bet your PC is a DIY, yes? Or you at least made sure your PC got a 7200RPM drive, right? Again, sorry you missed the memo, but a LOT of the OEM PCs made in the last… ohh, I’d say 6-8 years? have shipped with 5400RPM drives, which also slow boot pretty considerably. They do this for TWO reasons. One is heat: the slower the drive, the lower the heat, which lets them get away with shittier cooling. The second is noise: many of the OEMs like to brag about how “silent” their PCs are, and with the thin metal and plastic they use a 7200RPM whine is pretty easy to hear, whereas the 5400RPMs? Not so much.
Anyway, just to make sure I wasn’t stuck in a time bubble, I broke out the stopwatch and timed a laptop that just landed in the shop. It’s pretty typical, an HP G62-435DX if you want the specific make and model, and it’s running the usual HP crap along with about 2 years’ worth of user crap. From the time I flip the switch until the drive stops thrashing? It’s 3:48, which is actually pretty average for a PC. Between the trialware, the HP crap, and the user crap? That’s a LOT of crap which has to be loaded at startup. Of course it keeps guys like me in business, so who am I to complain?
bassbeast,
Actually I have, not that it matters.
Sarcasm noted; however, the laptop I benchmarked above is quite low-spec compared to today’s i5 or i7 processors, and it’s only a 5400RPM drive as well.
How long before it gets to an interactive desktop? To the extent that the computer is already usable, I would not include the processes that may result in continued disk activity in the background (i.e. software updates, AV, etc.).
3+ minutes doesn’t seem normal to me. I wouldn’t dismiss your evidence, maybe it is more common than I realize, but in a PC repair shop aren’t you more likely to see the worst computers with lots of problems and not the ones that are running well? What is the time after you remove the junk for your customers?
Anyone else care to measure their bios time and also the time after the bios till an interactive desktop?
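For anyone who wants to try, here’s one rough, cross-platform way to get the second number: a sketch assuming Python with the third-party psutil package, dropped into the startup items so it runs when the desktop comes up.

    # Rough boot-to-desktop timer: run from the startup folder / autostart.
    # Requires psutil (pip install psutil); the number is only as honest as
    # where you place this in the startup sequence.
    import time
    import psutil

    elapsed = time.time() - psutil.boot_time()
    print("Seconds from kernel boot to startup items: %.0f" % elapsed)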
I was stuck on them for the better part of the 90s…
Before the PC
A 1MHz (max), 1MB-RAM, single-core machine could support 64 or even 128 users logged in on dumb terminals doing real work, not posting inanities to Twitter etc.
Progress, what progress?
RSTS/E, if you were interested, running on a DEC PDP-11/70.
shotsman,
It’s not clear from context whether you mean “Personal Computer” or “IBM PC”.
I think software developers across the spectrum will tend to consume whatever resources are available, since there isn’t much motivation to code efficiently when more RAM is on hand. A lot of the skills we used to have in the industry, tweaking low-level algorithms for performance and size out of necessity, are a lost art today. I actually enjoyed those kinds of challenges, but corporations and even users mostly don’t care anymore, and consequently such skills are in severe atrophy. There’s just very little demand.
Lots of people will say it doesn’t matter, but I often reflect on today’s software solutions and find their inefficiency just astounding.
Well, you have to define first what you mean by “efficiency.”
tylerdurden,
I think efficiency on the whole is poor across the board. For instance, just this past week I was trying to figure out why NSD 3 (a DNS server) was using so much RAM after a fresh install to serve just a couple of domains; it’s outrageous that it needs 10-20% of RAM on smaller VMs just to serve DNS. It really shouldn’t take more than a few hundred kilobytes with the zone files fully cached in RAM.
I don’t mean to pick on NSD, it’s just an example, I probably could have used almost anything else.
  PID USER   PR  NI  VIRT  RES  SHR S %CPU %MEM   TIME+  COMMAND
 1791 nsd    20   0 59860  54m  27m S  0.0 11.1  0:00.25 nsd
 2119 mysql  20   0  314m  46m 6656 S  0.0  9.5  1:01.62 mysqld
 1874 nsd    20   0 66520  27m  264 S  0.0  5.6  0:00.00 nsd
 1876 nsd    20   0 59860  27m  188 S  0.0  5.6  0:03.42 nsd
12277 www    20   0  152m 9068 5892 S  0.0  1.8  0:00.04 php5-fpm
 9574 www    20   0  153m 5012 2664 S  0.0  1.0  0:01.38 php5-fpm
10036 root   20   0 10092 3276 2592 S  0.0  0.7  0:01.56 sshd
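(For anyone who wants to take the same kind of snapshot programmatically, a small sketch assuming Python with the third-party psutil package, summing resident memory per process name:)

    # Sum resident set size (RSS) per process name, top-style.
    # Requires psutil (pip install psutil).
    from collections import defaultdict
    import psutil

    rss_by_name = defaultdict(int)
    for proc in psutil.process_iter(["name", "memory_info"]):
        if proc.info["memory_info"] is not None:  # skip inaccessible processes
            rss_by_name[proc.info["name"]] += proc.info["memory_info"].rss

    for name, rss in sorted(rss_by_name.items(), key=lambda kv: -kv[1])[:10]:
        print("%-16s %8.1f MiB" % (name, rss / 2**20))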
I assume you want to distinguish between human efficiency and code efficiency… If so then I’d have to concede the point; if something might take a hundred hours to optimize, then throwing more hardware at it (i.e. another gig of RAM) may be “more efficient” than optimizing the code. However, we also have to look at the asymmetry of scale between users and developers. I’d expect the aggregate cost of thousands or millions of users upgrading their systems to accommodate less efficient code to be much greater than the cost of making said code more efficient. Yet to developers these are “external costs”, which arguably aren’t their problem. Few developers are being paid to fix it (we don’t get a cut of the money that users saved by not having to upgrade hardware).
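To put made-up numbers on that asymmetry (every figure below is invented purely for illustration):

    # Illustrative only: all numbers are invented for the sake of argument.
    users = 1_000_000            # installed base forced to add RAM
    ram_upgrade_cost = 30        # dollars per user for an extra stick
    dev_hours = 200              # hours to optimize the code instead
    hourly_rate = 100            # dollars per developer hour

    external_cost = users * ram_upgrade_cost   # $30,000,000 borne by users
    optimize_cost = dev_hours * hourly_rate    # $20,000 borne by the vendor
    print(external_cost / optimize_cost)       # 1500x asymmetry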
No, I simply wanted you to provide a definition for “efficiency,” because you keep using that term and it’s hard to figure out what you’re referring to.
When one provides statements like “it shouldn’t be…” or “it feels like…” one is presenting highly subjective qualitative opinions. Which is fine, that’s all we do here; present opinions. But we need to understand that ultimately it may all be moot because “efficiency,” when dealing with computing, is a concept quantitative in nature.
Among friends, I wouldn’t have thought a strict definition necessary.
To be honest I’m less interested in definitions and more interested in being amused at the outcomes. Here are some processes on my windows box:
Firefox – 223MB
Thunderbird email client – 130MB
Explorer – 26MB
WinVNC (idle/waiting for connection) – 8MB
spoolsv – 7MB
ppped – 6MB
putty (inactive) – 5MB
notepad – 4MB
winlogon – 3MB
Now take a look at the minimum requirements for Win95: a 386DX with 4MB of RAM (8MB recommended) and roughly 50MB of disk space.
Now, granted, we can say that some of the additional memory goes to real improvements: higher-resolution bitmaps, more features, etc. But I still find it hilarious that even simple software today needs more RAM just to sit in the background than the entire Windows 95 operating system needed two decades ago.
Yeah, good luck getting anything done with a 386 + 4MB of RAM running Win 95 ;-).
In any case, when Windows 95 came out I was relatively young, but I remember some older folks complaining about how wasteful it was to use 8MB of RAM just to “move windows around.” Because back in their day, all you needed was a couple of K of memory to do whatever it is they were doing back when punched cards ruled the world…
tylerdurden,
Haha, I wouldn’t know; we had a Pentium, actually 🙂
You may be right; Win95 had a lot of new multimedia stuff. As I recall, it came on a very large stack of floppies compared to DOS + Windows 3.11, and it was also the first version available on CD-ROM, I think. This makes the situation all the more ironic: as bloated as Win95 software was, it managed to pack a whole lot more functionality into that 4/8MB of RAM than a modern program does.
Again, it depends on what you mean by “functionality.”
And I’m afraid you could be comparing apples to oranges. Following your logic, DOS packed more functionality into 640K than Word for Windows 95 did with 8MB (which was utterly unusable with that amount of memory). A comparison that makes little sense, because one is an operating system and the other is an application.
tylerdurden,
You are making this too difficult! I know that I was comparing apples to oranges. Of course putty and vnc are absolutely not the same thing as an OS.
I would love to get you the RAM usage of similar applications back from the Windows 95 era, but I don’t have a Win95 machine or the Win95 software to measure.
Nevertheless, I think the case for bloat is very well illustrated when such extremely basic applications take up more resources sitting idle than entire computers had back in the past. Software engineers just don’t care how efficient their software is anymore, because hardware makes up for it.
It may not necessarily be a problem, but there should be no doubt that in the past developers did more with less because they had to.
I’m not really making things difficult, as much as I’m just trying to illustrate the fallacy presented. 🙂
tylerdurden,
What I keep trying to illustrate, since the first post, is that a lot of today’s software, had it been written for older systems such as Windows 95, would have had to be written more efficiently out of necessity, due to the significantly limited resources of the time. Even without the actual numbers from a Win95 machine, a process occupying several megabytes of RAM (more than was physically available) just to wait for a socket connection would have been ridiculous. You agree with this, right? Yet today it’s practically the norm for trivial processes to require this much (citing my earlier posts).
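To make the “idle listener” case concrete, here’s about the smallest such process one can write: a sketch assuming Python with the third-party psutil package, and an arbitrary port number.

    # A process that does nothing but wait for one TCP connection,
    # printing its own resident memory first. Port choice is arbitrary;
    # 5900 just echoes the VNC example above. Requires psutil.
    import os
    import socket
    import psutil

    rss = psutil.Process(os.getpid()).memory_info().rss
    print("RSS before accept: %.1f MiB" % (rss / 2**20))

    srv = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
    srv.bind(("127.0.0.1", 5900))
    srv.listen(1)
    conn, addr = srv.accept()   # blocks here; the idle cost is all memory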
We had something like that at school: a Unix-based server with a couple of terminals, and when the class wanted to compile a simple Pascal program it took forever.
Later I got an Amiga 500 with 512KB of RAM and a floppy drive.
This was possible because those 128 users were all running the same (very optimized) program and doing very few operations per minute on average. The terminals weren’t all that dumb either; at least they handled text “rendering” and some special characters by themselves.
Compare that to the Twitter website… written in a trio of poorly designed languages, running inside a behemoth browser, which runs inside an OS which is supposed to do a quadzillion other things at the same time. And a smartphone app is only a little bit better.
Yes, Twitter is ridiculous: at least 15kB of data for 140 bytes (read that somewhere), an overhead of more than 100x.
Always wanted to try Coherent: downloaded it, it booted under QEMU, then it asked for a serial number. Gah!
Also very curious about the AIX on x86. Alas, I haven’t managed to boot it in QEMU yet.
D’oh. For reference, the serial number is in the archive, in a file called coherent/base/SERIAL.NO.
Now if it’d only work with the QEMU IDE device.
Wait, there was a port of AIX to x86??? Kill it with fire!
I’ve been using that site for years.
So noted by your buddies at the N5/\.
Do you know of any such site with abandoned source code?
SourceForge/GitHub? Most of the stuff over there gets abandoned over time, I imagine.