After the release of Windows 95, with its brand new and incredibly influential graphical user interface, it was only a matter of time before the new taskbar, Start menu, and everything else would make their way to Microsoft’s other operating system line, Windows NT. The development of Windows 95 more or less lined up with that of Windows NT 3.5, but it wouldn’t be until Windows NT 4.0, released a little less than a year after Windows 95, that NT, too, would get the brand new user interface.
Raymond Chen has published a blog post detailing the cooperation and interplay between the Windows 95 and Windows NT teams, and, as always with Chen, it’s a joy to read.
Members of the Windows 95 user interface team met regularly with members of the Windows NT user interface team to keep them aware of what was going on and even get their input on some ideas that the Windows 95 team were considering. The Windows NT user interface team were focused on shipping Windows NT, but they appreciated being kept in the loop.
During the late phases of the development of Windows 95, the Windows NT side of the house took a more active role in bringing the Windows 95 user interface to Windows NT.
↫ Raymond Chen at The Old New Thing
Chen details that there was a lot of code-sharing, to the point where the Windows 95 version of the GUI contained NT-specific code, and vice versa. This code-sharing was far less elegant than it would be today with tools like git: Microsoft’s own internal source control system, called SLM (pronounced ‘slime’), did not support branches, so the teams had to regularly perform three-way merges by hand.
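For the curious, a three-way merge combines two sets of edits using their common ancestor as the reference point. A minimal sketch of doing one by hand with the classic diff3 tool (the file names are invented for illustration; this is not SLM’s actual procedure):

# mine.c and theirs.c are two independently edited copies of base.c;
# -m writes a merged result, with conflict markers where the edits collide.
diff3 -m mine.c base.c theirs.c > merged.c

Any region both sides changed differently then has to be resolved by hand, for every shared file, on every sync.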
It was a different time, for sure.
Anyway, it’s amazing how much of this ancient Microsoft lore could’ve been lost to time, or shrouded in mystery, if it weren’t for someone like Raymond Chen regularly sharing the stories from Microsoft’s past.

It’s so amusing (ironic?) that the site hosting this blog post is not responding due to the current major Azure outage.
Drizzt321,
Huh. I hadn’t heard about it. For me the linked site isn’t completely offline, but it’s extremely slow, to the point that resources are timing out.
An AWS outage and an Azure outage in one week… a Google outage would make it a hat trick!
There’s still time!
Coming up quick for me on the MS site now. Might be you didn’t try it until things started getting better.
Microsoft’s internal source code system was pronounced ‘slime’… how appropriate.
This also demonstrates how Linus Torvalds is superior to the Microsoft programmers.
I believe Linus designed and built the git application almost completely by himself.
The best Microsoft’s programmers could do was a ‘slime’ that didn’t even have branches 🙂
Why wasn’t Microsoft with its millions of dollars able to build a “git”???
@tom9876543
Microsoft mostly builds “good enough” versions of whatever they are competing with. Microsoft was competing with centralized version control, so they built Team Foundation Server. It is what you get when you try to build CVS with millions of dollars.
Git was built out of necessity by Linus because he did not want to try to manage Linux kernel development (highly distributed) using centralized version control. Git is a distributed version control system because that is what Linus needed.
It is a great example of what can be achieved when technical people scratch their own itch.
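As a rough sketch of what “distributed” buys you (the repository URL and messages are made up): every clone is a complete repository, and changes can travel as emailed patches instead of through a central server.

git clone git://example.org/kernel.git        # hypothetical URL; the clone is a full repo
# ...edit files locally...
git commit -a -m "fix scheduler accounting"   # committing needs no server round-trip
git format-patch origin/master                # export local commits as mailable patches
git send-email 0001-*.patch                   # the kernel's email-based submission flow

Hundreds of maintainers can work this way in parallel, which is exactly the kernel’s situation.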
LeFantome,
Git was a fantastic evolution.
The Linux kernel used to host its code on a proprietary system (let me look that up: BitKeeper… it has been a while). When that was arranged, the BitKeeper owners cut a deal to provide a free client for Linux development. Basically this would give them more exposure (free advertising).
However, people started trying to clone BitKeeper (why not? this is open source, after all), and the partnership turned sour.
Linus, being the nerdy hacker, locked himself away, built the very early version of git, and the rest is history.
Linus demonstrates why keeping the founders / early engineers around is essential for the longevity of organizations. Similarly, Bill Gates himself famously locked himself away for about two weeks to get the initial MASM (Macro Assembler) out, which is a nice piece of software. Early Google developers Jeff Dean and Sanjay Ghemawat were crucial for fundamental Google internal software. And Wozniak and Jobs were necessary for the initial touch at Apple (remember when they brought in a “Wall Street” CEO?)
That uncompromising, deep understanding of your project is essential.
You need to keep that spark somehow, or you become another generic project / company / organization.
@sukru
I am familiar with the BitKeeper saga. Pulling the “free” (but proprietary) license did not work out very well for them. Ironically, BitKeeper is now exclusively Open Source (no longer sold).
https://www.bitkeeper.org/
I had not heard that Bill Gates wrote MASM in two weeks. That is a JavaScript in 10 days level story. Serious props if true. But I thought MASM was written by Bob O’Rear.
LeFantome,
It’s ironic given MS’s dominance as a software company, but DOS dev tools were never their forte. During the DOS years people widely disliked MS tools compared to Borland’s, which were really the gold standard at the time. No idea who wrote it, but MASM specifically had a reputation for ugly MS-specific hacks and quirks. The 3 companies I did assembly work for all preferred to use TASM for their own code. Now I would use the open source NASM for new assembly work; there hasn’t been that much of it, but NASM would still be my choice because it’s open source and portable.
LeFantome,
It was a famous story back in the day… but I looked it up (there was no Google back then to check).
Turns out it is a mixture of different stories. The original one is that he and Paul Allen spent two months writing the code for Altair BASIC, after selling MITS an interpreter they didn’t actually have.
(They called MITS and claimed to have an interpreter they had yet to write.)
This business tactic comes up often with Microsoft. They also sold IBM an operating system, and only later bought QDOS and adapted it for the purpose. And yes, they sold MASM before writing it as well.
That is why the stories seem to get mixed: they used the same method. They sold IBM a package of BASIC (which they had), an assembler, FORTRAN, COBOL, and Pascal for its new operating system. (Not sure what the fate of the other three was.)
Is it still impressive? Yes. But probably not as much.
Seems to be documented here: https://en.wikipedia.org/wiki/Altair_BASIC
Aha…
https://winworldpc.com/product/ibm-fortran-compiler/100
https://www.pcjs.org/software/pcx86/lang/ibm/cobol/1.00/
https://www.pcjs.org/software/pcx86/lang/ibm/pascal/1.00/
IBM Pascal Compiler (written by Microsoft)! Ouch. They seem to actually exist and work (I had to change the floppy in that emulator, but even then, impressive).
Microsoft (and Bill Gates, and others) were really productive in that 1980s era!
Alfman,
Yes, TASM, and later NASM, were my primary choices once I got access to them. MASM was used only when absolutely necessary.
You’re commenting on what things were like in the early 90s while knowing what happened many years later. Today source control is seen as mandatory even for the smallest software projects, but at that time even huge projects could live without it. You should try what was accessible at the time: RCS, CVS, or, for a huge pile of money, ClearCase… You wouldn’t like any of them, and would possibly just work without them. For many years in the same period, the brilliant Linus managed work on Linux without version control software (he used patches delivered in plain-text emails). The brilliant guys who made Doom also didn’t use version control, other than a shared drive, with synchronization between team members done verbally in the office :-). You could even say Microsoft was better, because they had some form of version control 🙂
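For younger readers: that patches-in-email workflow was just diff and patch. A minimal sketch, with invented file names:

# contributor: produce a unified diff between a pristine tree and an edited one
diff -u linux.orig/kernel/sched.c linux/kernel/sched.c > fix.patch
# ...fix.patch is pasted inline into a plain-text email to the maintainer...
# maintainer: apply it from the top of their own tree, stripping one path level
patch -p1 < fix.patch

It worked, but nothing tracked history, authorship, or whether a patch had already been applied.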
The downside of Git is that now it is very easy to ship software without properly testing features. This “if it builds, we ship” mentality externalizes the development costs onto the customers.
I’d really rather have Windows (or any OS) receive small patches and then a large, reasonably tested release every few years, than have the feeling that we are continuously changing engines mid-flight.
Git allowed C-level suits to push for unfinished releases. Look at the game industry: in the days of ROM cartridges, you hardly ever saw system-breaking bugs. Now you pay for a game and you can only play it properly after a year.
It is highly disrespectful.
Shiunbird,
I’m not sure how git enabled shipping unfinished software any more than the ability to distribute online updates and “day one patches” (for games) did.
Especially as more and more git repositories are adopting automated testing systems like pre-commit hooks, linters, and test coverage checks in their process (a minimal hook sketch is below).
Again, blame the C-suits, but I’m not sure you’re blaming the correct tool.
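For what it’s worth, a pre-commit hook is just an executable script at .git/hooks/pre-commit, and a non-zero exit aborts the commit. A minimal sketch (assuming the project’s tests run via make test):

#!/bin/sh
# .git/hooks/pre-commit: run the test suite before every commit;
# exiting non-zero makes git refuse to record the commit.
make test || {
    echo "tests failed; commit aborted" >&2
    exit 1
}

Of course, a hook like this only helps if a failing run is actually allowed to block the release.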
sukru,
I’ve lost many hours of work because of a git bug with Visual Studio. Somehow stashes in VS can get out of sync with git, and it doesn’t tell you when these errors happen, so there’s no cause to assume anything is broken. But then when you go to apply them, git suddenly starts reporting unrecognized branch errors. These could be Visual Studio’s fault rather than git’s, I don’t know. Restarting VS gets everything back in sync, but the horror is that the stashed changes are gone! I work with different branches of the same project concurrently; maybe it’s related? This has happened to me twice, and now I’m afraid to use git’s stashing feature because of the data loss it has caused. On the one hand I’d praise git for the features it offers, but honestly it stung me quite badly, and once you lose uncommitted data to the git gods, it’s really hard to trust it again.
The advice I wish somebody else had told me: commit often and don’t rely on stashes to hold big changes.
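In practice that can be as simple as parking half-done work on a throwaway branch instead of a stash (branch name invented):

git checkout -b wip/checkpoint     # a scratch branch for work in progress
git commit -am "WIP: checkpoint"   # a real commit is far harder to lose than a stash

A real commit shows up in the reflog and in every branch listing, so it can’t silently vanish the way an out-of-sync stash apparently can.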
Alfman,
Try
git reflog --all
to see all activities in the git logs (stashes are special local commits), and check for entries that look like “WIP” or “stash”.
Once you identify the one you are missing, then you can do:
git stash apply <commit>
But… first check
git stash list
in case it is just lost in the stack.
And… just to make sure, I asked Google’s Gemini to verify what I wrote.
It agrees:
https://rentry.co/dzvbz2vh
sukru,
Thanks. Yep, I searched the web and did those things, but the stashes didn’t exist, and reflog came up with changes before and after the stash, but not the stash itself. I don’t know how it could happen unless it had something to do with running multiple instances of VS. Ironically, I was better about taking manual backups every day with unmanaged projects, but since I had gotten into the habit of relying on git, I wasn’t really prepared for it to lose my code stashes. Nothing beats consistent, proper backups, which I do with my files on my Linux machines, but I wasn’t backing up the Windows laptop provided to me by my job. That’s the way these lessons go though; live and learn.
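One more thing worth trying if it ever happens again: stash commits that no longer appear in any reflog can sometimes still be found among git’s unreachable objects (no guarantees, and <sha> is a placeholder for whatever hash turns up):

git fsck --unreachable | grep commit   # list commit objects nothing points to anymore
git show <sha>                         # inspect a candidate
git stash apply <sha>                  # re-apply it if it turns out to be the lost stash

Until git gc runs, the data is often still in the object store even after every reference to it is gone.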