Linked by Thom Holwerda on Thu 29th Dec 2005 23:07 UTC, submitted by Valour
Hardware, Embedded Systems "Over the past several years of computer hardware engineering, raw speed has been the primary goal of hardware manufacturers. This has traditionally come at the expense of power consumption, which has skyrocketed since the first days of the x86-compatible home PC. Just how much electricity does a computer and its related devices use? Are there disadvantages to turning everything off when you're done? This article will give you an insight into computer power usage."
Permalink for comment 79745

Since my kWh rate jumped from 10c to 13c, I've been paying much more attention to my bills and my PC network's consumption.

If your winter heating is set by thermostat, then you're likely trading some heat between your home heating system and your unintended heating appliances, especially PCs, CRTs, TVs, and incandescent lights. Your monthly heating bill just shifts a few dollars from oil/gas/wood etc. to electric, so it's probably moot in winter.

However, if you use A/C extensively in summer, it's a double whammy, since your A/C has to work extra hard to remove that extra waste heat.

Before doing anything with your PCs, simply replacing multiple incandescent bulbs (~20% eff) with fluorescent substitutes (~90% eff) probably gives the quickest, easiest savings year round. It may be more significant than switching CRTs to LCDs, but your math will vary.
For CRTs, I use screen blankers instead of savers and use power standby more aggressively than before.
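To get a feel for the bulb-swap savings, here is a rough back-of-the-envelope sketch. The 13c/kWh rate comes from the comment above; the wattages, hours of use, and bulb count are hypothetical assumptions for illustration only:

```python
# Back-of-the-envelope savings from swapping incandescent bulbs for CFLs.
# All figures except the electricity rate are assumed, not from the post.
RATE = 0.13           # $/kWh (the rate mentioned in the comment)
INCANDESCENT_W = 60   # assumed typical incandescent bulb wattage
CFL_W = 13            # assumed CFL of roughly equivalent light output
HOURS_PER_DAY = 4     # assumed daily usage per bulb
BULBS = 5             # assumed number of bulbs replaced

# Energy saved per year across all bulbs, in kWh
saved_kwh_per_year = (INCANDESCENT_W - CFL_W) / 1000 * HOURS_PER_DAY * 365 * BULBS
saved_dollars = saved_kwh_per_year * RATE

print(f"{saved_kwh_per_year:.0f} kWh/yr saved, about ${saved_dollars:.2f}/yr")
# → 343 kWh/yr saved, about $44.60/yr
```

Under those assumptions the swap pays for itself quickly, but as the post says, your math will vary with your rate, wattages, and hours of use.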


Reply Score: 0