Linked by Thom Holwerda on Mon 22nd Oct 2007 13:48 UTC
Windows Earlier today, OSNews ran a story on a presentation held by Microsoft's Eric Traut, the man responsible for the 200 or so kernel and virtualisation engineers working at the company. Eric Traut is also the man who wrote the binary translation engine for the earlier PowerPC versions of VirtualPC (interestingly, this engine is now used to run Xbox 1 [x86] games on the Xbox 360 [PowerPC]) - in other words, he knows what he is talking about when it comes to kernel engineering and virtualisation. His presentation was a very interesting thing to watch, and it offered a little more insight into Windows 7, the codename for the successor to Windows Vista, planned for 2010.
Thread beginning with comment 279860
RE[2]: This isnt new
by n4cer on Mon 22nd Oct 2007 16:22 UTC in reply to "RE: This isnt new"
n4cer (Member since: 2005-07-06)

So yes, it's high time Microsoft cut the cruft and started a new code base, and designed the code base to be more modular, maintainable, secure, etc. It's the only way the software will survive another generation (e.g. Windows 7 and Windows 8). Otherwise, it will collapse under its own weight.


In large part, Vista is the beginning of the new code base. Again, MinWin isn't new to Seven. It's there in Vista/Server 2008. A lot of code was rewritten for Vista. They've started to virtualize system resources, they've mapped or eliminated most dependencies and layering violations, and they've turned each feature into manifest-backed components. They are more agile in what they can add/remove without affecting other components because of this work and the processes put in place during Vista's development.
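[For context, the "manifest-backed components" mentioned here are Vista's Side-by-Side (SxS) servicing units, each described by an XML manifest declaring its identity and dependencies. A minimal sketch of what such a manifest looks like; the component name and version here are illustrative, not a real Windows component:]

```xml
<?xml version="1.0" encoding="utf-8"?>
<!-- Hypothetical component manifest, SxS-style -->
<assembly xmlns="urn:schemas-microsoft-com:asm.v3" manifestVersion="1.0">
  <!-- Identity: the servicing stack uses these fields to track,
       install, and remove the component independently -->
  <assemblyIdentity name="Example-Windows-Feature"
                    version="6.0.6000.16386"
                    processorArchitecture="x86"
                    language="neutral" />
</assembly>
```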

They aren't going to throw out all of that work in Seven. They're going to build upon it. I expect there will be a greater shift towards updated versions of the managed code services they've added in Vista as the preferred method for application development. I also believe they'll start to integrate application virtualization for legacy compatibility as well as driver virtualization for reliability, but the end product will be the offspring of Vista/Server 2008, not an all-new code base. I wouldn't expect something that big for another 1 or 2 major releases.

Reply Parent Score: 2

RE[3]: This isnt new
by Weeman on Mon 22nd Oct 2007 19:56 in reply to "RE[2]: This isnt new"
Weeman (Member since: 2006-03-20)

and they've turned each feature into manifest-backed components

About that...

Have you ever taken a look at WindowsPackages or wherever they're stored? All it is, is a manifest of bloat.

Reply Parent Score: 2

RE[4]: This isnt new
by TemporalBeing on Mon 22nd Oct 2007 20:09 in reply to "RE[2]: This isnt new"
TemporalBeing (Member since: 2007-08-22)

In large part, Vista is the beginning of the new code base. Again, MinWin isn't new to Seven. It's there in Vista/Server 2008. A lot of code was rewritten for Vista. They've started to virtualize system resources, they've mapped or eliminated most dependencies and layering violations, and they've turned each feature into manifest-backed components. They are more agile in what they can add/remove without affecting other components because of this work and the processes put in place during Vista's development.

It isn't a matter of how agile the code is. It's a matter of how much change the code itself can take. Windows, for quite a lot of reasons (e.g. backward compatibility, competition stifling, incomplete and undocumented APIs, bugs, etc.), is a monolithic code base that is not very easy to change. Revising and refactoring it are not going to help. The only way you solve that is by starting over.

Starting over is often good for a project, too. You shed a lot of legacy code that is no longer needed, and you get the chance to do things better. You can apply newer design and architectural principles and fix things proactively instead of retroactively. (Sure, you'll still have stuff to fix retroactively, but if you did your job right, they'll be different things than before.)

Every software project will eventually reach a point where its entire code base has to be thrown out and restarted. In many respects, this is really a sign of the program's maturity - you understand the program well enough to know how to do it right, and you need to give yourself the opportunity to do it. A clean cut is often the only way to do so.

Vista is better in some respects, such as the modularity of its parts. However, it is still far from what it needs to be, and it carries a lot of cruft - stuff Microsoft simply can't get rid of unless they start over. Otherwise, they're just continuing in the same paradigm, fixing the same issues over and over.

Reply Parent Score: 1

RE[5]: This isnt new
by joshv on Wed 24th Oct 2007 04:38 in reply to "RE[4]: This isnt new"
joshv (Member since: 2006-03-18)

"It isn't a matter of how agile the code is. It's a matter of how much the code itself can take change. Windows, due to quite a lot of reasons (e.g. backward compatibility, competition stifling, incomplete and undocumented APIs, bugs, etc.), is a monolithic code base that is not very easy to change. Revising it, refactoring it is not going to help. The only way you solve that is by starting over. "

The very fact of Microsoft's existence and spectacular stock valuation proves this point utterly and completely false. They've built an extremely successful business around never starting over from square one.

The past few decades are littered with the carcasses of companies that were stupid enough to think they could start from scratch. In the meantime, Microsoft acquired code they didn't have and incrementally improved the code they did. We've come from DOS all the way to Vista, and at no point along the way did MS ever start from scratch. I don't expect them to any time soon.

Reply Parent Score: 1