"Vista has poor security":
My approach to rebutting the security claims is to explain each of the vulnerabilities in detail. I hope to show that these bugs are complicated issues that are inevitable in any large software project. They have nothing in particular to do with Vista's design and everything to do with its sheer size. Bugs like this exist in any OS, and no doubt many more remain undiscovered.
SJVN seems to misunderstand the ANI vulnerability a bit. He thinks the vulnerable code lives in some "program"; in fact, it's a component of the windowing system (user32.dll) that is loaded into every process. It doesn't make much of a difference, because it is indeed a bug. Reading the real vulnerability report (which is two links away from the article), one sees that it's not some sort of design flaw but merely a small coding mistake: improperly validating a size field when opening a file. It's kind of sad that it wasn't fixed right in the first patch, so I hope the tester responsible for verifying that code fix gets "reassigned." On the other hand, this is not kernel-mode code, and it's not running in a system or trusted context. Sure, it's bad, but that's what patches are for. Unfortunately, this was an unlucky case that slipped past the stack overflow protections, but Vista is less vulnerable to it under IE because any malicious code will have difficulty escaping from the IE protection context.
There is only ONE bug in CSRSS, as far as I can tell from SJVN's link, not three. It's that infamous MessageBox bug: a double-free of allocated memory in NtRaiseHardError or one of the functions it calls. The function has several execution paths for dealing with error strings from the kernel or from the Virtual DOS Machine, complicated by the fact that in the VDM case Windows may have to extract the error string from the process's memory and move it to a different context. There's a good write-up at eEye Security. Once again: a code bug, not a design flaw. It's pretty easy to see how this bug arose, from an unexpected interaction between vastly different levels of abstraction within the Window Manager. The hard-error function is used by deep parts of the OS to surface system errors, and it also happens to be a convenient way to display system-modal error messages from service processes. How often do you actually see a system-modal error box? This is just one of those crusty old areas of Windows that few people bang on, so rare and weird errors don't really get caught. You've gotta have this stuff (I can hardly call an error-reporting mechanism "bloat"), but you really don't want to use it often.
SJVN seems to think that Microsoft should have rewritten every single part of the OS. Not only would that be a colossal waste of time, but it would likely create many more security bugs than we've seen so far. There have been major changes in Vista: huge portions of the kernel have been changed and massive features have been added. The graphics model is in the process of being entirely replaced (Aero and WPF are just halfway to where Windows is eventually going). There's no reason to rewrite already-working code unless its design is bad or outdated, and so far we haven't seen design flaws. Code bugs will always happen, and they'll eventually be caught and fixed. The only clear message I have picked up from reading SJVN's security rants is that he does not have a technical understanding of the things he complains about.
Security is a very difficult problem, however, and I'd like to explain why we cannot expect absolute freedom from exploits. Even in extremely life-critical engineering endeavors, like airplane design or the space program, the cost of eliminating all defects is too high. In these fields, every effort is made to get rid of problems through simulation, ground testing, flight testing, and general design checking. Even so, bugs slip through the cracks, and aeronautical engineers have a process for managing faults. The Space Shuttle has auxiliary systems; airplanes have large margins of safety so that they can be landed even when missing half their engines or when some of their avionics fail. The products of these engineering efforts are also constantly maintained and checked for damage and wear. Any flaws that are later discovered are patched, and if they are severe enough, the subsystem is redesigned and replaced on every instance of the product.
No one is willing to pay enough for software to justify the costs of extremely stringent engineering. We could not have had the Computing Revolution if all personal computers had to be reliable enough to be used in a life support context. Moreover, the pluggable and configurable nature of computers makes it impossible to test every complete system. And in a real engineering discipline, if something is not tested it should be assumed to be broken.
So we've now established that bugs will happen. Most bugs in an OS don't have any really nasty effect, since OSes are designed with redundancy in mind (processes crash and get restarted, errors occur and are silently ignored, etc.). A single security bug, on the other hand, has huge effects. You can't really build redundancy around the issue... the best you can do is try to isolate the flawed code and prevent a compromise from spreading to the rest of the system. This is also something other engineering disciplines don't have to deal with nearly as much. Returning to aeronautical engineering: no one even attempts to build planes that are safe both under normal conditions and under a bombardment of rockets.