Take Apple for example. The company has gone through a number of radical moves, but all of them were actually necessary for the company to advance and remain viable. The classic Mac OS was a mess compared to its competitors, and as such, Apple needed to make a radical move in order to remain relevant: they created Mac OS X.
Moving forward a few years, Apple's lower-end Macintosh offerings were heavily underpowered and overpriced because of IBM's and Freescale's lack of interest in developing desktop PowerPC chips. Apple's laptops lagged behind the competition, and even the company's high-end machines were too expensive for what they offered. Apple had to make a radical move to remain viable, and so it switched to Intel. Apple's machines became not only more powerful, but also a lot cheaper.
Microsoft has also gone through a number of radical moves, but they were generally less obvious, or they took more time. The company moved from Windows 9x to Windows NT, an operating system it had written from scratch. Another, less obvious radical move was the interface overhaul between Office 2003 and Office 2007. Both of these were necessities, as the 'old' products (Windows 9x and the default menu-driven interface, respectively) simply couldn't handle the demands of modern times.
All of the above moves were successful because they were needed. So, does Microsoft really need to open source Windows, as eWeek's Jason Brooks suggests? Or is it more of a wish? What problem would it solve? Is Microsoft's situation dire enough to warrant such a move, quite possibly the most radical move the technology industry has ever seen?
Vista was a marketing and technical disaster, but even so, it didn't hurt the company that much, and with Windows 7 being so well received in the media, I don't think Microsoft should be worried enough to consider open sourcing Windows.
For now, at least.