Craig Barth is the CTO of Devil Mountain Software, a company that makes performance monitoring software for Windows. Users can opt to share performance data with the company, giving it insight into what's going on with users' Windows machines. A few days ago, Barth published a blog post claiming that "8 in 10 Windows 7 systems monitored by the exo.performance.network are running alarmingly low on physical memory".
When I stumbled upon this news via another site (I can't recall which; it was two days ago), I thought to myself that either these guys had uncovered a massive bug in Windows 7 that no one else had yet encountered, or they were simply reading the "free" memory statistic in Windows 7 and calling it a day.
Both options seemed highly unlikely to me. Windows 7 has been out for ages, and if some random company could uncover such a massive problem through 3rd party software, then Microsoft most certainly would have uncovered it using the far more sophisticated and detailed analytics data users can opt to send to Redmond. Option 2 seemed even more unlikely; you'd have to be hilariously ignorant to base memory consumption figures in Windows on the "free memory" statistic.
Just in case we have newcomers here at OSNews, or people unfamiliar with Windows: Windows has something called SuperFetch, a technology that learns which programs you use and when you use them, and then uses that information to pre-load them into memory so they launch faster. The more you use your computer, the faster your programs will load - that is, if you don't load your machine with other crap over time, like spyware or whatever.
SuperFetch can indeed speed up booting and application launches, as several benchmarks have shown, but a consequence of this is that you simply cannot rely on the "free" memory statistic - in fact, you should add up "cached" and "free", because there is no penalty for using cached memory; it is freed instantly when called upon, as if it were free memory. That's why, in Windows 7, Microsoft changed the label from "cached" to "available". It's technically less precise than "cached", but I can understand why they went for "available".
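To make the arithmetic concrete, here is a small sketch (not anyone's actual monitoring code; all figures are invented example values in MB) of how counting only "free" memory as headroom produces scary numbers, while counting cache as available gives the real picture:

```python
# Illustrative sketch: why "free" alone misrepresents memory pressure.
# All numbers below are made-up example values in MB, not real measurements.

def naive_used_pct(total, free):
    """The flawed view: everything that isn't 'free' counts as used."""
    return round(100 * (total - free) / total)

def actual_used_pct(total, free, cached):
    """The correct view: cached memory is instantly reclaimable,
    so it counts as available alongside free memory."""
    available = free + cached
    return round(100 * (total - available) / total)

# A hypothetical 4 GB Windows 7 machine with SuperFetch doing its job:
total, free, cached = 4096, 410, 2458

print(naive_used_pct(total, free))           # 90 - looks alarmingly high
print(actual_used_pct(total, free, cached))  # 30 - the real picture
```

Same machine, same moment in time; the only difference is whether you treat cache as occupied or as available.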
It's mostly a philosophical debate, though: should a computer cache as much as possible, or leave your RAM sticks entirely devoid of data?
In any case, SuperFetch is pretty basic Windows knowledge, and you'd expect anyone with the title of CTO to know this - especially the CTO of a company making performance monitoring software. To my sincere astonishment, he apparently didn't, causing the confusion cited earlier.
And now the fun begins: Ars Technica's Peter Bright decided to test the software in question, and he confirmed what many already suspected: it simply doesn't account for caching, which leads to the crazy figures. In response to Bright's thorough debunking, the company decided to - get this - publish the usage information of Peter Bright's own computer. Yes, you're reading that right.
"I presume the company reserves the right to do whatever it likes with the data it collects. I didn't bother reading the EULA," Bright writes in the Ars comment section, "I do agree that it was a little surprising, but I can't say I care."
Barth's "rebuttal" further confirms that he thoroughly misunderstands the issue at hand. He claims the company's software looks at a variety of metrics to reach its conclusions, but none of these metrics take caching into account. The Committed Bytes metric, for instance, counts memory that has been committed - it says nothing about how much of the remaining RAM is merely cache (and thus, available) that can be reclaimed instantly.
So, let me give all of you a piece of advice. If you're running Windows, please check your installed programs list for the XPnet performance monitoring tool. If you have it, you run the risk of having your usage data publicised on the web, linked to your personal data such as your name. As such, it is strongly advisable to uninstall this tool, and stay the heck away from any software related to it.