Linked by Thom Holwerda on Thu 4th Aug 2005 14:07 UTC, submitted by sonic1001
Between December 2004 and July 2005, the "defect density" in the Linux kernel fell from 0.17 to 0.16 and all serious defects were corrected, a new report from code analysis firm Coverity asserts. Defect density declined by 2.2 percent.
checkers and verifiers
by butters on Thu 4th Aug 2005 19:24 UTC

Coverity Prevent is a static checker. It is programmed to look for patterns and make sure that they comply with particular rules. For example, a rule could be: "before accessing kernel data structure x, acquire lock v." This makes it better than more simplistic tools such as Lint, which primarily detect the big three (uninitialized variables, null pointer dereferences, and out-of-bounds array accesses) plus a slew of stylistic complaints.
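To make that lock rule concrete, here is a minimal sketch of the kind of pattern such a checker matches. This is plain C with pthreads rather than actual kernel code, and the structure and lock names are made up:

#include <pthread.h>

/* Made-up shared structure and lock, purely for illustration. */
struct counter {
    long value;
};

static struct counter shared_counter;
static pthread_mutex_t counter_lock = PTHREAD_MUTEX_INITIALIZER;

/* A rule-based checker flags this: shared_counter is touched without
 * counter_lock held, violating "acquire the lock before access". */
long read_counter_buggy(void)
{
    return shared_counter.value;
}

/* This version follows the rule, so the checker stays quiet. */
long read_counter_ok(void)
{
    long v;

    pthread_mutex_lock(&counter_lock);
    v = shared_counter.value;
    pthread_mutex_unlock(&counter_lock);
    return v;
}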

The inherent limitation of a checker is that it can only look for what it has been programmed to look for. It does not in any way provide assurance that the "defects" it finds are the only problems. That can only be done with a verifier, which uses a system of code annotation to explain to the tool what the code is supposed to be doing, so that the tool can verify that the code really does what is intended. The problem with verification is that annotating the code takes a lot of work, and it can only be done well by a developer who really knows the code.
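As a rough sketch of what annotation-driven verification looks like (the annotation macro below is invented for illustration, in the same spirit as the Sparse annotations the kernel uses, and is not any particular tool's syntax):

#include <pthread.h>

/* Invented annotation macro: it expands to nothing for the compiler,
 * but tells a verifier that the caller must already hold 'lock'. */
#define MUST_HOLD(lock)

extern pthread_mutex_t stats_lock;

struct inode_stats {
    unsigned long reads;
    unsigned long writes;
};

/* The annotation states the contract; a verifier can then examine every
 * call site and prove the lock is held, rather than just pattern-matching
 * for known-bad sequences the way a checker does. */
void bump_reads(struct inode_stats *s) MUST_HOLD(stats_lock);

void bump_reads(struct inode_stats *s)
{
    s->reads++;
}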

In the end there is no substitute for human code review, but static analysis can filter out quite a few problems beforehand, and the use of such tools as part of the development process has also been shown to improve coding style. If the checker complains about a missing null check where you are pretty sure the pointer in question can never be null, it is easy to just put the check in and make the complaint go away. In the long run this is good style, because someone may later change other code in a way that allows that pointer to become null.
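A tiny, made-up example of that "just put the check in" habit:

#include <stddef.h>

struct request {
    int id;
};

/* Even if today's callers never pass NULL, the check costs one branch
 * and keeps this function safe when someone later adds a call path
 * where the pointer can be NULL. */
int request_id(const struct request *req)
{
    if (req == NULL)
        return -1;
    return req->id;
}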

As far as human code review goes, there are two forms: static code review and runtime functional verification. The BSDs use a more thorough static code review process than Linux does. However, Linux not only has "more eyes on the code," it also gets far more real-world testing. People are running Linux in all sorts of weird configurations in large numbers, so problems are usually discovered and reported quickly. Notably, it is the kinds of bugs rooted out by static checkers (particularly uninitialized variables) that are more likely to crop up later in a product's lifetime.

A new movement to bring BSD to the desktop seems to be starting, given the relatively recent releases of PC-BSD and others. This is good, because it opens up a wider audience for BSD. I hope the hardware support can catch up to Linux, or else the desktop BSD movement will stay limited. BSD also has to catch up to Linux on low-latency process scheduling and other features that make Linux appealing on the desktop.

Finally, Windows sucks for games. It just happens that many games are written only for Windows. DirectX sucks for developing games that perform well or have innovative features, and most of the "big titles" in PC gaming are written in OpenGL for that reason. DirectX is really only "good enough" for console gaming. With Linux, game developers can produce Live Game CDs/DVDs that boot into an environment (possibly including a tweaked kernel) specifically tuned for that game. Windows can't really do that (are there any Windows LiveCDs that even compare to those widely available for Linux?).
