I’ll never grow tired of reading about the crazy tricks the Windows 95 development team employed to make the user experience as seamless as they could, given the constraints they were dealing with. During the 16-bit Windows days, application installers could replace system components with newer versions if necessary. Installers were supposed to do a version check first, but many of them didn’t follow this guidance. On Windows 95, this meant installers ended up replacing Windows 95 system components with Windows 3.x versions, which wasn’t exactly a good thing.
So, they came up with a solution.
Windows 95 worked around this by keeping a backup copy of commonly-overwritten files in a hidden C:\Windows\SYSBCKUP directory. Whenever an installer finished, Windows went and checked whether any of these commonly-overwritten files had indeed been overwritten. If so, and the replacement had a higher version number than the one in the SYSBCKUP directory, then the replacement was copied into the SYSBCKUP directory for safekeeping. Conversely, if the replacement had a lower version number than the one in the SYSBCKUP directory, then the copy from SYSBCKUP was copied on top of the rogue replacement.
↫ Raymond Chen
All of this happened entirely silently, and neither the installers nor the user had any idea this was happening. The Windows 95 team tried other solutions, like just making it impossible to replace system components with older versions entirely, but that caused many installers to break. Some installers apparently even went rogue and would create a batch file that would replace the system components upon a reboot, before Windows 95 could perform its silent fixes. Wild.
I used Windows 95 extensively, and had no idea this was a thing.
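The reconciliation rule Chen describes is simple enough to sketch. In this illustrative Python sketch, a file’s text content stands in for the version resource that Windows actually parsed out of the binary; `reconcile` and `read_version` are hypothetical names, not real APIs:

```python
import shutil
from pathlib import Path

def read_version(path: Path) -> tuple:
    # Stand-in: pretend the file's content is its embedded version resource;
    # real Windows parsed the VERSIONINFO resource inside the binary.
    return tuple(int(part) for part in path.read_text().strip().split("."))

def reconcile(live: Path, backup_dir: Path) -> None:
    """Run after an installer finishes, for each commonly-overwritten file."""
    backup = backup_dir / live.name
    if not backup.exists():
        shutil.copy2(live, backup)      # first sighting: stash a copy
    elif read_version(live) > read_version(backup):
        shutil.copy2(live, backup)      # newer replacement: refresh the backup
    elif read_version(live) < read_version(backup):
        shutil.copy2(backup, live)      # downgrade: silently restore the newer copy
```

Either way the newest version ends up both live and in the backup directory, which is exactly the invariant the SYSBCKUP scheme maintained.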

What Windows 95 managed without any actual kernel protections (if DOS drivers were installed), no locked file system, no user accounts, and no discipline from application developers is nothing short of extraordinary.
They were good engineers.
Dealing with real limitations, right? I sometimes dream of a parallel world where we get all the powerful hardware and the memory we have now, but still write the software with the care we did in the past.
Imagine if a new game were coded with the care of a game shipped on a ROM cartridge, where a failure can’t easily be fixed. Or installing an operating system that has mission-criticality as a key design goal, one you could trust to fly an airplane.
What I am tired of is, in more than a few cases, being a paying beta tester. And my time is worth nothing, apparently. 20+ years of sysadmin life and I am truly tired of having to redo shit just because the vendor changes some very basic feature for no clear benefit.
Shiunbird,
To be fair, there is still software written with extreme care. But we don’t see it often in daily life.
The stuff that goes into cars, traffic control systems, and NASA rovers is still great. At least most of it. Some of the debugging and upgrade stories of remote robotic rovers in deep space, with an extremely limited link, are legendary.
For the desktop?
Quick and easy wins
Why would someone spend months polishing the software, run private beta tests, fix all regressions, and reduce bugs as much as humanly possible?
An Electron app requires too much RAM to run a simple calculator? All laptops should have 16GB anyway!
When they can just rush it to the market and use the end users as beta testers, they stop caring.
Which is not great.
Requiring 16GB of RAM and multiple multi-GHz cores to run an OS and a browser for watching YouTube, checking email, and browsing the web is the efficiency-equivalent of a car so heavy and bloated that it can’t move unless it is powered by a 6L V8 engine. =( Like it was the case in the 1920s.
Shiunbird,
At one point engineers misunderstood the assignment.
The RAM should not stay idle. The user paid for it, so it should be used 100%
However, a “cache” without a proper eviction policy is just another name for a “memory leak”. Of course, if I have 16GB and one tab open, it should keep as many details in memory as possible, in case I click “back”. But it should not hold onto them with a strong reference, only a “weak” one.
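A sketch of what that looks like in practice, using Python’s `weakref` module; the `Page` class and the URL key are hypothetical stand-ins for a browser’s rendered page:

```python
import weakref

class Page:
    """Hypothetical stand-in for a rendered page kept for the "back" button."""
    def __init__(self, content):
        self.content = content

class PageCache:
    """Holds pages by weak reference only: an entry remains retrievable
    while the page is alive elsewhere, but the cache never pins memory."""
    def __init__(self):
        self._pages = weakref.WeakValueDictionary()

    def put(self, url, page):
        self._pages[url] = page

    def get(self, url):
        return self._pages.get(url)  # None once the page has been collected
```

The memory pressure policy then falls out for free: when nothing else needs the page, the garbage collector reclaims it and the cache entry simply vanishes.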
Shiunbird,
It is very wasteful, but “luckily” the outstanding progress in hardware has managed to mostly cancel out the bloat in software.
1980s hardware with a 10MHz 286 and 1-7MB RAM used a 100-watt power supply…
https://www.ardent-tool.com/qtechinfo/GJAN-43VM6G.html
Modern hardware has gotten better by so many orders of magnitude and can typically do it within 300 watts (monster GPUs aside). Mobile processors use even less while still having pretty good specs.
It does bug me when a trivial application requires 100X more resources than it really should need, but everyone ignores it and moves on because the hardware we have makes it possible and people just take it for granted. As a developer I still appreciate the art of software optimization, but it’s quite evident that I am a dinosaur in this regard.
sukru,
I don’t know if any hardware or RAM standard actually supports this, but in principle it could be beneficial to not use 100% of RAM and allow it to be powered down. Although I’m not able to find much about anyone actually doing this…
https://www.embedded.uni-tuebingen.de/assets/publications/Kuhn_et_al-Hardware-assisted_Power_Managment_MCUs.pdf
DRAM apparently has a power down mode for the controller, but it continues to do DRAM refresh and ECC correction…
https://docs.amd.com/r/en-US/pg456-integrated-mc/DRAM-Power-Down-and-Self-Refresh
I don’t know if it’s possible to disable refresh for a single bank, but obviously if a memory bank isn’t in use then there’s no reason to spend power refreshing it. Letting the OS power banks on or off on demand could reduce power consumption under loads where all the RAM isn’t needed. Oftentimes we provision RAM for the worst-case scenarios, even if we don’t need it the majority of the time.
For example…
Alfman,
That is a neat idea. But I’m not sure any modern operating system has support for it.
That is exactly why RAM should be heavily used for caching. An ideal operating system will provide the necessary API and plumbing to make sure no single page goes to waste.
sukru,
Someone’s got to do it 🙂
RAM already is heavily used for caching via the file system. That was the point of my example… the 42GB unused that I cited could be powered down because even with caching my system doesn’t need it all right now. It would be nice to be able to power down that ram until the times when I actually need it.
When you say “just use more RAM, you already paid for it”, I still think it sends the wrong message that software developers should use more RAM just because the hardware is there; this is how we keep getting more bloated software.
Thinking about this some more, it resembles the “memory ballooning” concept wherein a guest OS running inside a VM relinquishes memory back to the host.
https://www.humblec.com/memory-ballooning-and-virtio_balloon-driver-in-qemu-kvm/
Of course, in the case of VMs, ballooning is used to prevent VMs from needlessly locking up unused memory when they don’t need it. But it’s conceptually similar to a host powering down unused memory. The OS could move memory out of banks in order to power them down. Thanks to virtual memory this is already completely transparent, and software doesn’t care where it exists in physical RAM. So if the hardware supports it, I don’t think it would be so hard for Linux to support it on the software side.
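For what it’s worth, Linux already exposes per-block memory onlining/offlining through sysfs for hotplug, which is roughly the plumbing such a power-saving policy would need. A minimal sketch, assuming root privileges and hotplug-capable hardware; the sysfs root is parameterized purely so the logic can be exercised against a fake directory tree:

```python
from pathlib import Path

SYSFS_MEMORY = Path("/sys/devices/system/memory")  # real location on Linux

def removable_blocks(root: Path = SYSFS_MEMORY):
    """Yield block numbers the kernel reports as safe to offline."""
    for entry in sorted(root.glob("memory[0-9]*")):
        flag = entry / "removable"
        if flag.exists() and flag.read_text().strip() == "1":
            yield int(entry.name[len("memory"):])

def set_block_online(block: int, online: bool, root: Path = SYSFS_MEMORY) -> None:
    """Online or offline one memory block (needs root; the kernel may
    refuse if the block still holds unmovable pages)."""
    (root / f"memory{block}" / "online").write_text("1" if online else "0")
```

Today this interface is aimed at physical hotplug rather than power saving, so whether offlining a block actually cuts DRAM refresh power would depend entirely on the platform.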
The atrocious design of Windows 3.x caused problems for Windows95.
I wonder – how did OS/2 deal with all these Windows 3.x apps overwriting system files???
Interesting question. I wonder if it did at all.
I mean… OS/2 Warp 3.0 required installing Windows from Windows installation disks.
So, one could say it was just running Windows on top of OS/2 instead of on top of DOS.
If the Windows part broke, OS/2 would still be running fine.
For Windows 95… overwriting some file in C:\WINDOWS\SYSTEM could bring the whole OS down.
Atrocious design? I used Windows 3.x extensively, and I wrote applications for it. It was amazing for what it was. It ran 16-bit code, it also ran 32-bit code, and the memory management was amazing if you coded correctly. The OS ran in 2MB. It was very stable if you ran stable applications.
I would leave it at “atrocious reality”. Nowadays it’s impossible to fathom a serious operating system that does not implement any kind of protection against system file modification; back in 1990, for a DOS-based system, all you could do was _ask nicely_ for users not to modify your system (by setting files to System and Read-only, even Hidden)… but it was just a matter of passing the right switch to the syscall, and DOS would happily flip the bit or ignore the attribute.
Yes, I find Win3.x amazing compared to what we do today, compared to how large a similar system would be if built with today’s tooling. I drool at the complexity of AmigaOS and Workbench implementing fully preemptive multitasking in 512KB RAM at 7MHz. And don’t get me started on GEM on 8-bit, 64KB RAM machines. But there are advances in current systems that would be just unfathomable with what we had back then.
It was for a time when a floating-point math co-processor was not a standard feature but an expensive add-on, RAM was 1MB, and rendering TrueType fonts meant you’d be watching text drawn in slow real time.
The fact that it even worked is a great feat. Windows 3.1, especially in 386 Enhanced mode, would even have some semblance of virtual DOS sessions, where programs thought they were in full control while multiple DOS programs were running concurrently. No other system at the time could do it. Not even OS/2.
(Another reason OS/2 could not catch up. I don’t remember it being stable running DOS back then)
But of course today it would feel atrocious to use low-quality bitmap fonts, to not have window contents drawn while moving them (only a simple wireframe), or to not have memory fully protected.
Let them replicate this on an STM32, and see how much of an engineering marvel Windows 3.1 was.
sukru,
In those years engineers had to do more with less, we’ve clearly lost that touch.
But I’m honestly not that impressed with microsoft supporting multitasking DOS because other operating systems already had much more sophisticated multiuser multitasking capabilities long before windows. Microsoft always needed to play catch-up on so many technological fronts. If not for IBM handing microsoft all their customers, microsoft would have been much less likely to make it on merits. Windows would be a topic for obscure technology history buffs on osnews 🙂
Alfman,
Why do you think those technically superior operating systems like OS/2, or desktop environments like GEM failed, but Windows stood the test of time?
What would a user who paid for Lotus 1-2-3 for DOS, “Accounting 3.0”, or SimCity do upon realizing it works perfectly in a window (or full screen) in Windows 3.1, but cannot be run on those better operating systems?
Or if they had only 2MB of RAM, and “real” multitasking required 4MB and a 386DX?
sukru,
For most users, x86 was microsoft. But your view is too x86-centric; a lot of innovation was actually copied from elsewhere.
Alfman,
… and yes, innovated elsewhere, and copied back to x86, like Intel having all their CPUs with RISC cores starting with Pentium
The reason is practical, not ideological, though.
None of the competitors offered “reliable” business machines at PC prices. Even IBM itself fell off that race.
(I love what Amiga tried to do, but they were not good outside media production. And even there they fell behind, since they did not have a response to VGA. Sorry, they could not even catch up with EGA).
If you are good enough and cheap enough you dominate the market.
Again, why would someone invest in a 2x expensive machine that cannot run the software they paid for, just to have technical superiority?
sukru,
By the time microsoft had established a monopoly on software, it was game over for everyone else, even IBM. I’m just saying the microsoft monopoly was not a product of their own innovation. The IBM deal enabled microsoft to jump over the part where they would have needed to compete with others. In all likelihood they would have failed on their own. Had IBM gone with someone else, that someone else would have won.
Alfman,
It is a bit of luck, and a bit of talent.
But one creates their own luck with talent. Without being able to write an operating system (DOS*), an Assembler, and multiple compilers (C, FORTRAN, COBOL), and actually deliver them on time, all the luck in the world would not have mattered for Microsoft.
DOS*, yes they adopted QDOS
Microsoft got a PC monopoly before developing any of those. Microsoft compilers were disliked by many. I can acknowledge they hired talented people, but they were still behind the competition technologically. The most significant factor putting microsoft ahead was their IBM bundling arrangement. It was exclusive.
Alfman,
Microsoft got a PC monopoly precisely because they delivered those…
Microsoft was a compiler / interpreter developer. IBM came to them when their CP/M deal fell through, urgently needing an operating system and a compiler suite. For Microsoft, it was the starting point of their journey.
They delivered MS-DOS, which was heavily based on QDOS, as their first OS.
There was literally no PC, nor MS-DOS, before then. They created this monopoly by delivering MS-DOS, BASIC, an Assembler, FORTRAN, C, and COBOL to IBM, and by having a non-exclusive contract.
sukru,
They didn’t even have the right to the product they sold to IBM that became the basis for their monopoly. Of course microsoft would go on to establish more roots but this deal was THE reason microsoft became the industry standard.
I think you may be hinting that microsoft could still have sold the OS to the other PC clones even if IBM had fallen through. But the clones wanted MS DOS specifically because that’s what IBM computers were using. It’s IBM’s promotion of IBM DOS (ie MS DOS) as a computing platform for PCs that gave microsoft a nearly instant monopoly.
I agree the contract terms weren’t exclusive, but what I mean is that the deal to have one’s OS bundled with all IBM PCs was exclusive to microsoft. IBM made MS DOS the standard, a benefit nobody else had.
By most accounts, the consensus is that this is what enabled MS to dominate PC operating systems.
Alfman,
I was not around that time, but from what I collect reading
1 – Micro-Soft was a developer tools company (BASIC mostly, but also FORTRAN and other compilers on various platforms)
2 – IBM reached out to Micro-Soft to get those compilers for their new “PC” pet project. (The PC was never a “thing” yet, so they probably had some low-level people looking into it)
3 – Bill Gates recommended that they use CP/M for their system (which Micro-Soft already supported, and which was the industry standard)
4 – The CP/M people were greedy and inattentive. The deal fell through
5 – IBM went back to Micro-Soft… they were in a hurry, nearing the release of the “PC” toy…
6 – Micro-Soft promised to solve their problem
7 – Micro-Soft got an exclusive license to QDOS (“Quick and Dirty Operating System”), then debugged, fixed, and optimized it for release. (It was *not* ready to take over the world, yet)
8 – Micro-Soft gave these for free to IBM, with the condition of keeping non-exclusivity for any other PC vendors
9 – The PC not being a thing, and their hardware not being easy to clone, IBM did not care and approved
10 – The rest is history.
sukru,
My point was that microsoft would not have been so dominant without IBM. However, I do notice your bullets don’t mention the fact that Bill Gates’s mother was working directly with IBM’s CEO and is the one credited with making the connection.
https://en.wikipedia.org/wiki/Mary_Maxwell_Gates
Bill Gates may have never gotten the job without Mary Gates’s inside connection to IBM’s CEO. You can speculate, but history does not show how things would have turned out for microsoft without such fortunate advantages.
Alfman,
I’m not saying there was no luck involved.
However, IBM did not go to Micro-Soft (back then) for an operating system. They were called for compilers and interpreters. They were the king of BASIC back then, and had a good portfolio of a few others.
No mother would be able to say “hey, hire my son who has no experience”
It was more like “I heard you are making a computer. My son is the best for getting BASIC on those”, or however she sold the idea.
IBM first went to CP/M (Digital?) for the operating system. Again, this was proposed by Microsoft. And then that fell through. Only then did they take chances with Bill Gates, who had zero OS experience.
Yes, they would probably never try writing an operating system if it was not forced upon them. They were happy selling programming tools on all existing machines they could find.
sukru,
Of course she wouldn’t have said that. But you seem to be boiling my point down to microsoft not being skilled, which isn’t my point. I am not saying that microsoft didn’t have any skills, but that they were only mediocre compared to what others were doing. In the early years Microsoft had grown into a recognizable pattern of copying innovation rather than creating it. Hey, it all worked out in the end, however technological exceptionalism was never the reason behind microsoft’s monopoly. If anything, history shows that being business savvy was more important than software skills were. Lying to get a contract… hey, not everyone would do that, bravo. Having family connections isn’t a fair advantage either, but it’s a tradition that goes on to this day.
Well then it’s interesting to note that even microsoft, the monopoly, did not have the most desirable developer tools; that honor went to Borland. If microsoft were to distinguish itself only on developer tools, they would have needed to be better than they were to be seen as the best on merit. But of course we can only guess as to how it would have worked out.
Also, it didn’t come up here, but we might as well get to it now. The Gates family was already relatively wealthy, which meant that microsoft never needed to rely on outside investors or borrowed money. Because of the familial fortune, Gates could afford to give up immediate payback in exchange for long-term market share. This strategy has proven to be extremely advantageous over competitors, opening a clear path to success. Amazon and Walmart are other big examples, but there’s no denying it’s a very unfair advantage, and it only works if you were wealthy to begin with.
Alfman,
Of course. I did not say they were innovative.
I’d say both are important. You always need a Jobs next to your Wozniak.
Borland was not a thing when this deal happened. Borland only became viable thanks to Microsoft building DOS for IBM PC.
Let’s say upper middle class.
Yes, not needing outside money is good. But they are not unique.
If you think about it, the median inheritance people get in the USA is much larger than what most billion-dollar companies started with. However, 99.99% of people do not risk anything for that kind of long-term entrepreneurship.
It is the difference between: “my mom left me a million dollar home, I can now retire”, vs “my dad gave me $200k, I can now start my independent business”
The first person can always sell the home, downsize, and invest in their own company. But very few people actually believe in themselves.
If they did, we’d be in a much better world.
Btw, I found a more detailed account of the events
What was the saying? TIL
https://www.pcmag.com/news/the-rise-of-dos-how-microsoft-got-the-ibm-pc-os-contract
sukru,
Given that you are entertaining a hypothetical where microsoft succeeds without MS-DOS, I find it disingenuous to simultaneously make the case that the competition couldn’t be viable without MS-DOS.
My point isn’t that microsoft wouldn’t have existed at all without the IBM deal, but that it is the IBM deal that gave them such a decisive monopoly. I don’t know why you’re pushing back on this notion, because it isn’t typically controversial.
They weren’t billionaires, but they were multimillionaires. This is probably going to sound offensive, but every time we’ve discussed matters of wealth I always get the impression you are very out of touch with the common man. To be fair, everyone here probably has some degree of privilege, but Gates’s family wealth would still be a lot today, even without 50-some years of inflation.
https://finance.yahoo.com/news/average-american-inheritance-wealth-level-130120356.html
I also feel that more people should be liberated from corporate jobs, however let’s not sugarcoat what this is like. Long days and stress are common, and though everyone hopes for long-term rewards, the statistical reality is that the most common path for small businesses is that they end up struggling and even failing. Sometimes they just aren’t cut out for it, sometimes the banks don’t want to greenlight loans because owners don’t have enough collateral, sometimes giant competitors have squeezed the market and competing is just not viable. Save for a few outliers, those who manage to get jobs working for giant corporations generally have a better economic outcome than those trying to compete in the small-business space. I strongly dislike things being this way, but when the world’s wealth and opportunities are gobbled up by giant corporations, there’s just not much left at the bottom.
Alfman,
I don’t think I said that.
In fact, I said them making a successful operating system was pretty much forced upon them.
That is 25% of the people in the USA at least once in their lifetime (the wealth peaks around 55)
Because I came from actual poverty. “Cannot buy bread today” poverty. So I really appreciate stories of upwards mobility.
I agree. That is why I have tried several times to fund my venture.
… and failed. Finding reliable customers and having a good cashflow is not easy.
That is why it is important not to pull the lower rungs of the ladder. I believe it is still possible to have upwards mobility, and I see it often around myself.
My friend who immigrated here with almost nothing is now running a very successful mobile advertisement company getting contracts from those very large companies today.
Another one is working on a specialized device that is sold to large retailers.
And one is not that big, but has a successful data analytics consultancy.
Things are possible.
(Wow, how did the topic drift here)?
The visual design was amazing.
The Program Manager is basically iOS. Program groups and application launchers, and a fully program-centric approach.
You started being able to toss files and folders around the desktop with Windows 95.
Except that the inspiration everyone was comparing it to, because of the empty “desktop” behind it, was a file-leaning hybrid UI (classic Mac OS) quite close to its fully file-centric ancestor (Lisa OS), and they didn’t let people customize the program group icons.
Cathode Ray Dude’s summing up of the impetus for Norton Desktop lays the problems with that out pretty well.
He also indirectly touches on one design element of the original MacOS that was lost with the switch to hard drives.
Because of how the desktop was implemented, you got “session saving” for free by putting each project/session on a different floppy disk. That’s part of what the “Put Away” option in the Finder menu was for. You moved what you were actively working with onto the desktop; each floppy disk had its own desktop folder and would remember where its contents came from, so you could select “Put Away” when you were done and the files would snap back to where on the disk you intended to permanently store them.
Paired with things like customizable icons for everything, and resource forks turning every binary into its own asset packfile (so skilled users have to ask to see programs subdivided into their component parts), it really was the best effort we’ve ever seen at making the file-centric paradigm solid and novice-friendly.
Heh, I never knew about this, but I did hack quite a bit at win95 “back then”. I was the sysadmin for a large school; we had kids of all sizes, and it was hellish to ensure all systems had sane settings day after day… So I hacked up a script to run at boot that would connect to my network server and fetch the right™ Registry, Start menu hierarchy, desktop background, and some extra bits. It was seen as magic by the kids: they were able to change anything on the computers, but after a reboot things were back to normal!
Of course, it was not absolutely foolproof, and any teacher wanting to install any piece of software needed my help and my blessing… but it was more than convenient, and more than enough given my users
gwolf,
Yes, being a windows sysadmin meant you needed to improvise.
We went one step further. We had NetWare-based netboot from a virtual floppy: wipe the drive, download a clean version of Windows, configure the local IP address and a few other settings, and have a locked-down student account…
(Or rather a partition. C: was read-only, and D:, I don’t remember exactly, was their scratchpad)
And it somehow worked. Windows complained about having legacy drivers (which meant 32-bit I/O was not possible). But it was a compromise for having a good clean setup.
I distinctly recall a pretty bad bug with a similar mechanism. Maybe it was in Windows 98? Not sure anymore.
Basically the OS scanned the disk for DLL files, and if it found a more recent one it would use it instead of its own, with no sanity checks as to whether it was compatible.
Having a dual install of Win XP + Win 98 (IIRC) would make Win 98 self-destruct. At least I think it was a Win 98 thing.
Edit: or maybe it was the case of having Win 95 + Win 98 installed side by side.
Thank you for explaining that OS/2 required you to install Windows 3.x to run Windows software. And that if it broke Windows, OS/2 kept going without any problems.
The fact is, Windows 3.0 was an atrocious design. If Microsoft had cared about security at all, they could have implemented very simple security for win16 native apps, e.g. apps must install to C:\SOFTWARE\xxxxxxxx, shared library files go in C:\LIBFILES, no access to C:\WINDOWS, no access to the C:\ root dir. They could have blocked DOS apps from those folders as well, when run from Windows. The default folder in the open file dialog would be C:\DOCUMNTS.
Now, because of Microsoft’s incompetence, it still impacts Windows today, where application installers notoriously fail to uninstall properly and leave junk behind.
tom9876543,
Do you know that Windows 3.0 (and 3.1) ran on the 80286, which does not have proper hardware support for good security (no real MMU), and supported 1MB RAM, which would not give too much leeway to do those checks anyway?
And there was no “install” back then. There was not even a registry (in today’s terms). It was the user who copied .EXE files manually, or chose destinations in the case of slightly larger programs. There was not even a guarantee that a C: drive existed.
It is a different story when we have the benefit of hindsight, of course.
(To be honest, Windows 3.0 was the worst operating system of its era. It was not even technically an operating system.
Windows 3.1 fixed a lot of things, and probably deserves to be called an OS at that point, but the foundations were already set)
sukru I don’t know why you keep defending Windows 3.0, seems like you are a desperado fanboy.
CPU hardware protection on the 286 is irrelevant. Microsoft’s win16 API documentation could have said that the security would be enforced on 386 and later CPUs, so your software must follow these rules.
Your statement that the 286 CPU would be too slow to do the checks is very dubious. The win16 kernel could refuse to load an EXE/DLL/COM/BAT file if its full path does not start with C:\SOFTWARE or C:\LIBFILES. Let me edit my previous idea and also allow C:\INSTALL as well. Even a 286 CPU could do a string comparison.
Your statement that “a C: drive may not exist” is laughable. Windows 3.0 specifically required a HDD with at least 6MB free space (Wikipedia). Even if Windows was installed on another drive, it’s obvious my folders idea would apply to the installation drive. If Windows is installed on G:, then G:\, G:\WINDOWS, G:\SOFTWARE, G:\LIBFILES and G:\INSTALL are the “special” folders where the win16 kernel does extra security checks.
Microsoft could have made a small amount of effort to think about security when they were creating Windows 3.0. Obviously Microsoft’s priority was $$$$.
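For illustration only: the kind of prefix check being proposed really is just a string comparison. The folder convention below is the commenter’s hypothetical, not anything Windows 3.0 actually did:

```python
# Hypothetical convention from this thread: executables may only load from
# a few sanctioned folders on the installation drive.
ALLOWED_FOLDERS = ("\\SOFTWARE\\", "\\LIBFILES\\", "\\INSTALL\\")

def may_load(full_path: str, install_drive: str = "C:") -> bool:
    """Cheap prefix check a loader could run before loading an EXE/DLL:
    allow only paths under the sanctioned folders on the install drive."""
    path = full_path.upper()       # FAT paths are case-insensitive
    drive = install_drive.upper()
    return path.startswith(tuple(drive + folder for folder in ALLOWED_FOLDERS))
```

Whether a check this simple would have survived contact with real-world installers is, as the rest of the thread argues, a separate question.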
tom9876543,
Fanboy? Really? Why do we usually stoop to personal attacks on technical matters?
I have actually used Windows 3.0, and a lot of Windows 3.1. So, speaking from experience.
Interesting. I must be hallucinating then, since in our lab not only did we not have HDDs (except one machine), we did not have floppy drives either. How? Things booted from a network drive (Novell NetWare 2.0, IPX networking over coax).
I actually wrote some hobby kernels. And I know about 286 hardware.
Nobody* wrote protected mode kernels for the 286. Do you know how to exit protected mode on it to run DOS? Please look that up.
It is not about speed, it is about capabilities.
Perfect. You are inventing a “modern” operating system that would be rejected by 95% of the users, who do want to put files wherever they want.
Remember, there were no proper installers back then. It was actually the user who did the management.
(Or the system admin)
I put them in F:\Games, C:\DOS, or whatever my heart desired. If Windows were to block me, I would switch to something else.
Btw, Setup.exe? It was just an unzipper.
sukru,
Yeah we shouldn’t resort to that.
I didn’t know that was an option. Seems a bit abnormal, I guess you saw this setup at a business? It predates my experience.
I swear I’m not looking it up… but triple-fault CPU reset! Then the BIOS checks a value in memory (near 0040:00XX?) to continue execution rather than completing a full BIOS boot.
… I wonder how many decades this will stay with me, haha.
I do think they could have done more to support file system security capabilities the way other operating systems like VMS and Unix had been doing, but at the time microsoft only saw windows as an extension of single user DOS, so in that context I guess they didn’t care about security, at least not until NT a few years later.
I found it easier to manage files on those older operating systems. Current versions of windows seem to want to hide file locations for everything and it’s such a pain to customize anything anymore. In interactions with normal users I can clearly see that microsoft’s changes to explorer are confusing as heck to normal users. They aren’t using focus groups to design operating systems like they used to.
Literally unzipping software would have been much better than setup.exe IMHO. I hate application installers, and the act of using executables to do installation would forever change the course of windows history for the worse. At least early software didn’t use the registry, but now you need an installer; unzipping is no longer sufficient with most software.
I’m not really a fan of MacOS, but they solved so many problems by not going the windows route for software installation. Even to this day windows installers are a headache and sometimes break under wine, haha.
sukru,
You claimed that basic security checks on folders wouldn’t be possible because 1MB RAM doesn’t “give too much leeway to do those” checks.
I responded that your claim was very dubious. Even a 286 CPU can do a string comparison to check whether a full path begins with C:\SOFTWARE or C:\INSTALL before loading the exe/dll/com/bat.
Your response was “Nobody* wrote protected mode kernels for 286. Do you know how to exit protected mode on it to run DOS? Please look that up”
A stupid response; you deflected to another topic because you know you’re wrong when you claimed the 286 can’t do basic folder security checks.
Congratulations for running Windows off a network drive. My basic point stands: the new Microsoft Windows 3.0, with its new win16 API, could have done rudimentary basic security checks on folders on the installation drive.
As Alfman said, MacOS has a proper design for software installation. I’m not an expert, but I believe Classic Mac OS 9 had an “Applications” folder and software was all installed in that folder. Unlike the total mess that is MS Windows.
Alfman,
Yep, never a good sign
It was the school lab. One morning all our floppy drives were gone, replaced by netboot.
We had 3.5-inch floppies back then, but kids being kids, they started bringing games and other questionable material. The computer teacher took it upon himself to make sure this did not happen again.
And I learned a bit more back then (including how to hack, but that is another story. No, no, we definitely did nothing wrong).
There was a shared drive (say F:) and home drives for each user (say H:)
The Windows install was on F:\WINDOWS and F:\WINDOWS\SYSTEM. Each student who wanted to use it would copy the WINDOWS directory (win.com and all the .INI files), and there was some magic setting I don’t remember that told it to use the main one. Then you’d have write access to the “local install” and the INIs, but the SYSTEM directory was read-only and shared.
It has been decades, so some details are missing.
Let’s not get into how they used the keyboard controller chip for this 🙂
Unfortunately it will stay with all of us. The 80286 is the worst-designed Intel CPU, even counting the buggy Pentiums. It is something that should never have been, and its design is poisoning modern computing to this day. At least nobody actually used the 80186.
Even their 8051 microcontrollers had a much more reasonable design and better memory safety (though no MMU).
Yes, they went from a basic unzip to a directory to entangling their tentacles everywhere in the system like a cancerous organism.
Should have never happened.
tom9876543,
I never claimed that.
Security in operating systems is a fascinating topic. I would recommend reading up on protected memory, multi-ring kernel design, file system security, the FAT file system, and crucially how DOS and the BIOS worked back in the Windows 3.0/3.1 or even Windows 95 days.
The capabilities CPUs offer to operating systems are an entirely different matter from being able to do string comparisons.
Mac OS 9 was released in 1999. Windows 3.0 was released in 1990.
It’s a supported configuration, described in the Windows Resource Kit for Operating System Version 3.1, starting on page 73, the Windows for Workgroups Resource Kit for Operating System Version 3.1, starting on page 3-17, and briefly mentioned on pages 2-6 and 2-7 of the Windows for Workgroups Resource Kit Addendum for Version 3.11. (No doubt inspired by NFS-mounting /usr)
You run `setup /a` (short for Administrative Setup) to unpack the assets onto the server and mark them read-only. Then you connect to it from the client and install the bits that go in the user’s local, non-read-only filesystem, using the copy of `setup` installed on the server and selecting “Network Setup” instead of a normal sneakernet install; you do that either by editing `setup.inf` to set `netsetup=true` or by making sure it always gets run as `setup /n`.
It’s then followed by instructions on how to run an automated setup using a config file to pre-specify the settings it would prompt for.
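Condensed into a rough sketch of the Resource Kit procedure described above (the drive letters are hypothetical, and the comments are mine):

```
REM On the server: Administrative Setup unpacks the files read-only
A:\SETUP /A

REM On each client, after connecting the share: Network Setup from
REM the server's copy (or edit setup.inf to set netsetup=true)
F:\WINDOWS\SETUP /N
```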
Bear in mind that anything they do was also being compared critically against contemporary versions of classic Mac OS (i.e. System 6 from 1988 and System 7 from 1991)… and Classic Mac OS is “so good” that you can boot it off a floppy, make a new boot floppy just by dragging and dropping the system folder, put applications wherever feels good to you and still have them work, move applications around without breaking file associations (most of the time), and, when the file associations do get corrupted, you just hold down a key combo while booting to regenerate them.
ssokolow,
It has been so many years since I last heard of Windows for Workgroups.
WfW was a really weird one in their lineup.
sukru,
What is your source? I only found data for the 50th percentile (the median), and it was not close.
https://www.fidelity.com/learning-center/smart-money/average-net-worth-by-age
Once you factor in inflation, the Gates family was north of $15M. If you want to call that lower upper class, fine, whatever. But we need not pretend that Bill Gates wasn’t extremely privileged with wealth and family connections compared to the middle class.
Stories of mobility can be inspiring. It’s fine if this motivates you, but at the same time such stories can skew reality. Data and statistics that include everyone are better at representing the world. Of course, this data may or may not portray reality in a way that makes us feel good about society. Often it does not, unfortunately.
Well, considering your work history, I would have assumed you were already a millionaire. Did you really spend it already? Haha.
I’m sure you’ve heard this before, but it’s about having good connections. Make sure your connections are able to provide the quality of life you are looking for. You can work hard and do an excellent job, but that does not help you in business nearly as much as having the right connections.
I never had great connections, clients don’t pay well, and my CS skills are underappreciated. I find that pigeonholing is a significant factor. Career paths follow gravity wells in that once you get some work, it’s easy to get more of the same but much harder to get different work.
Yeah, Silicon Valley is where the unicorns live.
Alfman,
I believe this is the wrong thread, and we really took a long detour, so I’ll keep it short.
You need to be “T-shaped”.
There are different names for this, but you need to have deep knowledge of one skill and broad knowledge of everything else.
That is a hard requirement for success these days.
But not enough…
You also need “soft” skills.
It usually is a mix of seeking out opportunities + luck + seeing what is going on around you.
After a certain level, your “technical skills not being appreciated” is a common pattern. Most people don’t care. However, if you are known to solve problems and to be good in one topic (whichever that is), then you can start tapping into those connections.
“Who you know”
For most technical people this is hard, for two reasons:
1 – We are stubborn (see my history here)
2 – We are humble
Both work against us, but as engineers, possibly with some social anxiety and being on the “spectrum” (it comes with the job)… we need to learn how to communicate with people.
This might require being “out there” and even moving to where the jobs are.
The first part is usually more difficult: going up against our nature and talking to people… and also not “correcting” them.
“Hey, I believe you have a problem I can solve. I heard your school is looking to integrate its phone system online. I have prior experience with Asterisk and PBX systems. I can help you get that set up, and then we can see where we can go from there.”
The second part… we work against ourselves.
“Yes, Asterisk is a great solution. But of course it has some issues. You need to choose a PCI card with known reliability. It can also cost money. And for…”
And we already lost the job.