A number of years ago, the Computer History Museum together with Microsoft released the source code for MS-DOS 1.25 (very close to PC DOS 1.1) and MS-DOS 2.11. I never did anything with it beyond glancing at the code, in no small part because the release was rather poorly organized.
[…] The obvious gaping hole is the lack of any source code for IBMBIO.COM. I do not know exactly what arrangement IBM and Microsoft had at the time, but in the days of DOS 1.x and 2.x OEMs did not get the source code for IBMBIO.COM/IO.SYS suitable for PC compatibles.
I toyed with the idea of writing my own IBMBIO.COM replacement, but eventually gave up because it’s not a totally trivial piece of code and I had no real documentation to work with (until much later). The MSDOS.ASM source code obviously uses the IBMBIO interface, but makes no attempt to document it. The provided IO.ASM source is quite useful, but SCP’s hardware was different enough from the IBM PC that it is of limited utility.
So, disassembler it was, and I produced reconstructed source code for PC DOS 1.1 IBMBIO.COM. Actually assembling it turned out to be a bit of an adventure; more on that below.
More early DOS shenanigans to brighten your day.
I made the same observations for myself (although a few years later). It seemed like MASM was used a lot simply because it had Microsoft's backing, but Borland TASM was cleaner and so much less frustrating to use. The same thing was true of Borland C and its IDE. Microsoft's tools were bad. IMHO if Microsoft didn't have its early IBM monopoly to help them defeat rivals, they would not have taken the top position in the computer market. It was other companies that drove innovation and had better products. Business isn't fair, such is life.
Back in the day memory optimization tricks were absolutely crucial. Today, many software publishers don’t bother optimizing even if it wastes tens/hundreds of megabytes because they don’t think it’s worth a developer’s time to optimize. Just throw a few more gigabytes onto consumer devices, haha.
Alfman,
Yes, the IBM deal helped them rise. But I believe it was their price advantage that made the break for Microsoft. Even today, Delphi's cheapest option, a very limited starter edition, costs I think over a thousand dollars. Visual Studio, on the other hand, has a full-fledged free version. Actually two (Community and Code).
Back in the day, QBASIC came free with DOS. Before that GW-BASIC was free for IBM machines. Before PCs, Microsoft had ROM BASIC for everything on the planet (including TI calculators).
They knew they could win by going cheap at scale. And once people were hooked, Visual Studio costs a lot, along with the Server edition of Windows.
Borland was ahead of its time. So was Novell. And many other software competitors. But they were expensive, and Microsoft always won on price.
[ btw, I agree on optimizations. people don't realize how much of a difference they make. ]
sukru,
I would still say their tools were inferior. Still though you are right, it’s hard to beat the marketing value of free or subsidized bundling.
That makes sense. Nevertheless Microsoft clearly had the luxury of a cash cow, which most of its competitors did not. MS could afford to bundle things at or even below cost to undercut & outlast competitors who could not survive predatory pricing. It is undoubtedly a very effective market-building strategy used by many corporate giants.
Yeah. It pains me when my clients throw away the fast, optimized web sites I've built over the years. They inevitably replace them with slow and bloated commodity platforms like wordpress, woocommerce, magento, etc., taking a huge performance hit in the process even with more powerful hosting. I've accepted the futility of trying to compete with dominant platforms. And once again your argument about costs applies here too. It doesn't matter that we can write better code if we can't compete with their free offerings. Customers have become less and less interested in paying for custom development. So even though I feel I've had to lower my standards, I've reluctantly joined the club in supporting the bigger commodity players because they have more market share.
I’ve not worked with woocommerce, magento, etc. in a while, but with wordpress some good optimizations are possible: Varnish, the right cache plugin, and obviously, don’t install a boatload of plugins. Some hosting providers offer Varnish by default as a selling point.
Lennie,
I wouldn’t say wordpress is bad, not like magento, but without an FPC (full page cache) it’s not the fastest thing, and the database design is not efficient. It’s not hard for an experienced developer to beat.
Obviously I agree a full page cache can help mask some of the backend performance overhead, but it can also introduce new issues. On osnews, for example, I periodically see coherency issues where one page links to a live post but the cache serves up a stale copy that omits said post. Eventually the cache refreshes itself, so the problem is only temporary, but it’s still annoying and somewhat buggy.
On the other side of the problem, caching is a shared resource and there’s only so much of it to go around between pages, or worse, between customers. On websites where only recent posts attract 99% of the traffic it isn’t too big a problem, since everyone is loading the same pages (lots of cache hits). But I’ve seen very poor wordpress caching behavior on websites with on the order of 50k+ posts where each page is just as likely to be hit. On such websites cache misses are the norm and, lo and behold, users are exposed to wordpress’s uncached performance.
Of course you could just buy/rent more RAM and allocate it all to caching until the entire website fits in cache, but there’s still the issue of stale caches. Also it can take a long time to populate the cache from a cold start. It will work one way or another, but the thing is, a well optimized website would have been able to perform extremely well without any cache, and those resources could be put to better use.
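The stale-copy problem described above is easy to reproduce with a toy model: a naive TTL-based full-page cache keeps serving the old rendering until the entry expires, no matter what has changed on the backend in the meantime. A minimal sketch (the class and names here are mine, not any particular plugin's):

```python
import time

class TTLPageCache:
    """Naive full-page cache: an entry is served until its TTL expires,
    even if the backend content changed in the meantime (stale reads)."""
    def __init__(self, ttl_seconds):
        self.ttl = ttl_seconds
        self.store = {}  # url -> (rendered_page, expiry_time)

    def get(self, url, render):
        now = time.monotonic()
        entry = self.store.get(url)
        if entry and entry[1] > now:
            return entry[0]            # cache hit -- possibly stale
        page = render(url)             # cache miss -- hit the backend
        self.store[url] = (page, now + self.ttl)
        return page

# Simulate the scenario: front page gets cached, then a new post appears.
posts = ["old post"]
cache = TTLPageCache(ttl_seconds=60)
render = lambda url: " | ".join(posts)

first = cache.get("/", render)   # renders and caches "old post"
posts.append("new post")         # backend content changes
second = cache.get("/", render)  # still serves the stale copy
assert second == "old post"      # new post invisible until the TTL lapses
```

The same model also shows the cold-start issue: an empty `store` means every distinct URL pays the full render cost once, which is exactly the cache-miss-dominated behavior on large archives.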
I’m of the mindset that developers should initially target lower hardware specs without caching, so that the base system is performant and upgrade paths are easy and affordable. However, I’ve seen a lot of developers build prototypes on a very high end system to begin with. While this saves them the effort of optimizing, I don’t think this approach does users any favors, since once they start to scale the whole thing becomes slow on already fast hardware.
Anyways, none of this really matters now; I’ve come to terms with custom development losing to commodity platforms, including wordpress, regardless of efficiency.
I’m not an MS fan by any measure, but being old enough to remember IT in the ’90s, I recall Visual Studio being anything but cheap. VSCode is a completely new product that only shares a name with the original Visual Studio that defined Windows programming.
Speaking about innovation, I also think we’re being a bit unfair to MS. Visual Studio premiered technologies like IntelliSense, had excellent online in-context documentation, and made C++ template programming useful for anything.
It went head to head with Borland’s suite for years (the two camps were especially strong in enterprise development) and both were equally thrown into irrelevance by the web.
Where MS had a slight advantage was office automation with visual basic providing a “smooth” path from macros to makeshift apps. Borland didn’t have an answer for that.
dsmogor,
I was referring to their early years, but I agree that the VS IDE improved over time. IntelliSense worked really well with VB in particular, however not so much with C/C++ code, where it was much less refined; I felt it took a long time for MS to catch up.
MS had good documentation, although much of it was behind an MSDN subscription. I don’t recall the exact details, but in one of its antitrust suits Microsoft was mandated by court order to provide accurate & accessible developer documentation. Thanks to that, the MS APIs are fairly well documented and accessible to anyone who needs them.
Visual Basic for Applications was never anywhere near as good as Visual Basic proper. Having come from VB4-6, this was always a huge gripe of mine. Still, I found it useful for building office macros, but it was very quirky and I for one don’t miss VBA at all, or ASP for that matter. I also seem to recall Office making macros more difficult to use in later versions, although I’ve stopped using Excel so I can’t speak to the current state of things.
That’s not how I remember this playing out. Throughout the 1980s and early 1990s Borland massively undercut Microsoft on price for developer tools. The prices of Microsoft’s offerings were truly eye-popping at times. Microsoft did introduce some Borland-compete tools like QuickC and later Visual C++ standard edition, which had competitive prices but included no code optimization whatsoever. It’s also worth noting that Borland forced Microsoft into the IDE business with QuickC – I don’t think Microsoft’s core developer products were bad exactly, but it’s obvious that Microsoft developers used their own editors, because an editor wasn’t a core part of the developer tool offerings.
Borland started to hit trouble during the mid 1990s. Microsoft’s developer tools continued to be expensive.
The big drop in Microsoft’s prices happened in 2004 with the release of the free Visual C++ toolkit, later followed by the Express Edition, free compilers in SDKs, and then the Community edition. Another quiet change was that Visual Studio 2005 standard edition (i.e., the cheap product) included an optimizer. By 2004 Borland wasn’t a consideration – these changes were driven by gcc and mingw. People could get an optimizing compiler for free, so the era of charging for compilers ended fairly abruptly.
It may be true that GW-BASIC/QBASIC were free, but did that really bring developers into Microsoft’s ecosystem? It’s not like these were particularly close to the commercial Visual Basic lineup.
Speaking of MASM specifically, it retailed for US$200 in the 1980s (in 1980s dollars). Lack of demand for it as a standalone product led to it being quietly folded into Visual C++ 6.0 as part of the “processor pack” (around 2000), although it was limited to the still expensive professional edition. Later it became bundled into all of the free compiler offerings (I don’t remember exactly when). Now the vast majority of people with MASM installed don’t even know they have it – but that occurred in the mid-to-late 2000s.
malxau, and others,
re: visual studio vs gw-basic.
For my case, GW-BASIC is how I started programming. Almost 30 years of programming thanks to Bill Gates (ironically I dropped their offer to work for a competitor, but that is another story).
For those of us starting out, first GW-BASIC, and then QBASIC were good choices. Yes, MASM was expensive, but debug.com was free.
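For anyone who never saw it, DEBUG’s tiny built-in assembler really could stand in for MASM in a pinch. A session like the following (reconstructed from memory, so treat the exact prompts and segment value as approximate) assembles and saves an 8-byte .COM program that prints “A”:

```
C:\>DEBUG
-a 100
xxxx:0100 mov ah, 2       ; DOS function 02h: write character to stdout
xxxx:0102 mov dl, 41      ; 41h = 'A' (DEBUG takes all numbers in hex)
xxxx:0104 int 21          ; call DOS
xxxx:0106 int 20          ; terminate program
xxxx:0108
-n a.com
-r cx
CX 0000
:8
-w
Writing 0008 bytes
-q
```

No labels, no macros, no listings – but it was free and shipped with DOS, which for a beginner was the whole point.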
I think I forgot Watcom C++. I don’t know what happened to them. By the time I got to try it, it had already failed commercially and been released as open source. And given that DOOM was written with it, and DOOM sold more copies than Windows at the time, that throws a wrench into my equation. 🙂
I think you are right about gcc / mingw. We could also include Linux in this. Microsoft could no longer undercut on price, since the open source offerings were free. They were good enough, and students moved on to Linux as their learning platform.
(I remember replacing all machines and servers in our lab with Linux. Microsoft came to our department and gave us free copies of Windows and a server machine, and later lots of free books, so that we could keep students in their ecosystem).
Free or cheap + good enough has a way of beating expensive and better in the long run.
@Malxau
The Visual Studio 2003 Command Line Compiler was available for free and was the full optimising compiler. It was easy to use it with an editor and the Windows SDK.
MASM was harder to track down but there was a free version kicking about somewhere.
That’s right, although the free version was released in 2004, and as you mention, it required a bit of manual assembly to make it useful. In 2005 the standard edition (and express edition) moved to an optimizing compiler, and the era of non-optimizing compilers ended for good. Note that the 2003 toolkit was very minimal – no IDE means no debugger, no debug CRTs, no import library for a CRT DLL, no MFC, no 64 bit, etc. I wasn’t at MS at the time, but that looked to have been a rushed release, where the 2005 express edition completed the transition to a free product.
As far as I can tell, MASM was a free download as part of the Processor Pack but only licensed for use with Visual C++ Professional, so whether that’s really “free” depends a bit on how much weight to place on licenses. Technically it wouldn’t have been hard to extract and use standalone. MASM was in the DDKs, but I think the DDKs of that era weren’t free (I remember paying for 2003 DDK media kits.) So I think it was included in Visual Studio 2002-2005, but wasn’t completely “free” until the Vista SDK in 2006.
MS tools weren’t cheap. And traditionally Microsoft’s compilers have been awful and well behind standards compared to just about every other alternative. Except for their own proprietary languages, but mainly because there were not that many alternatives to compare against.
The irony is that Microsoft started as a compiler/language tool company. And yet to this day their C compiler is still shit compared to the alternatives. Even gcc produces better code on windows than cl.
The main edge Microsoft had was that they had access to the undocumented parts of their OS APIs.
“Except for their own proprietary languages, but mainly because there were not that many alternatives to compare against.”
Yes, that’s where they’ve excelled. Visual Basic and C#, those are the business, combined with Visual Studio.