Great article, but there are more severe problems you can find with a wattmeter (not a voltmeter, as stated in the article),
and sometimes there are simple means to solve them.
Adding extra insulation to a boiler can reduce power consumption significantly.
So these rates actually work? I see why electricity companies make electricity cheaper at night, but I've always wondered if it was an effective way to make people turn power-hungry devices on at night.
Anyway, if I may add my own computer-unrelated advice... During the winter, make sure that heaters only heat when you are at home, and heat less during the night. One actually sleeps better if the house is at 15-17°C, although you get cold mornings in return. Edited 2011-08-03 08:49 UTC
Yup, as said previously, I know that. I was only wondering if it was effective, that is, if many people actually modified their habits to use electricity during the night. Especially when some companies don't make having two different electricity rates the default, and charge a little extra per month for it.
Besides, isn't the situation reversed in winter, when people turn on home electric heating at night ? (Whereas large offices tend to prefer fossil fuels for heating)
We have a washing machine with a timer function, so it starts washing at around 3:00 am.
Used to have our water heated at night (3 to 6 kWh). However, we now use a combi boiler, which overall costs less to run (we switched due to tank and boiler failure).
Great article as usual.
Sorry if I repeat anything you already said below.
I sometimes build custom gaming PCs to order and can share a few pointers.
1. Check if your computer's PSU (power supply unit) is 80 Plus certified.
Many PC manufacturers use generic PSUs that don't qualify for 80 Plus certification.
Check, or let someone qualified check it for you, and if it's not 80 Plus certified it's a good idea to replace it. 80 Plus certified PSUs aren't even that expensive. It might also make your PC run quieter.
There are three 80 Plus certifications: Bronze, Silver and Gold.
2. LED-backlit screens use less energy than conventional CCFL-backlit LCDs.
Rather use an LED-backlit screen.
3. When adding a second 3.5" SATA HDD, go for the "green" ones.
For instance the Western Digital Caviar Green
or the Seagate Barracuda Green.
These drives spin down when inactive.
4. The motherboard firmware sometimes comes with a control centre offering tweaking options like an "eco mode", among others.
5. AMD Cool'n'Quiet, activated via your BIOS and then from the desktop, can save a lot of power. See below.
6. Hardware aids to power usage: with the right add-on you can conveniently switch off discs in a multi-disc configuration.
7. Even some RAM comes in eco form nowadays, e.g. the
Kingston HyperX LoVo (low voltage, "green").
Our PCs are often overkill for what we use them for.
If you are only going to use spreadsheets and read email, you won't need to buy a high-spec PC.
Some new combined motherboard/CPU units have come out with lower power consumption but still respectable speed and other bells and whistles.
The AMD E-350 (Brazos), for instance, still has high-def output, HDMI out, SATA 6G, USB 3 and many other features while using much less energy.
SAPPHIRE E350M1 PURE FUSION
http://www.sapphiretech.com/presentation/product/product_index.aspx...
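To put tip 1 in perspective, here's a rough sketch of the yearly savings from swapping a generic PSU for an 80 Plus unit. Every number (load, hours, efficiencies, electricity price) is an assumption for illustration, not a measurement:

```python
# Rough payback sketch for an 80 Plus PSU upgrade. All figures are assumed.

def wall_power(load_w, efficiency):
    """Power drawn from the wall for a given DC load at a given PSU efficiency."""
    return load_w / efficiency

load_w = 120          # assumed average DC load of the PC
hours_per_day = 8
rate_per_kwh = 0.15   # assumed electricity price, $/kWh

old_draw = wall_power(load_w, 0.70)  # generic PSU, ~70% efficient
new_draw = wall_power(load_w, 0.85)  # 80 Plus Bronze class, ~85% at typical load

kwh_saved = (old_draw - new_draw) * hours_per_day * 365 / 1000
print(f"{kwh_saved:.0f} kWh/year, ${kwh_saved * rate_per_kwh:.2f}/year")
```

With these assumptions the saving is on the order of $13 a year, so a cheap 80 Plus unit can pay for itself within its lifetime, with the quieter fan as a bonus.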
Replacing an ordinary PSU with an energy-efficient one dropped the noise level and almost halved the power usage
(now less than 40 W idle) on one machine. The largest power saver is a Mac mini (usually below 20 W, 13 W when idle), but it takes a long time to recoup the Apple tax.
Excellent points. Thanks for adding this useful information.
My apologies to everyone for inexplicably renaming the wattmeter/multimeter a voltmeter. Glad to see most folks overlooked this goofy error and focused on the useful points in the article.
-- Howard Fosdick
In printer manuals I've seen, they recommend switching off the wireless capability when not in use. Apparently this saves a bit of power.
Ink cartridges don't last nearly as long as toner cartridges (though, admittedly, they contain less plastic). In low-volume printing you'll end up replacing ink far more often (in terms of page count) because it dries up, and you'll probably end up replacing printers far more often too, since replacement ink often costs more than the printer.
So, overall cradle-to-grave energy consumption of a laser may be better.
Laser printers use more power, though. Still, I'm not sure which has the lowest cost of operation once you factor in the cartridges.
In a 'low usage' scenario, what matters most is the lifetime of the printer itself: I've had my laser printer for 10 years and I haven't had to replace the cartridge yet.
In the same time, my brother had to buy two inkjet printers, because the first one failed after a few years, plus several ink cartridges, because they failed too (not enough usage).
Plus my year-old laser actually shuts itself off after 5 minutes of inactivity. It doesn't "sleep", it shuts off. It can't be woken up by software; a hardware button must be pressed to get it to print. I usually leave it unplugged just in case. I understand the energy implications, but inkjet is crap.
I don't often print, but when I do print, I want it to look good.
I recently replaced a 140W 20" CRT with a 14W 20" Acer LED ($100), and since that monitor is used 16 hrs a day at a 15¢/kWh rate, the LED pays for itself after a year or so. The CRT just won't go and die yet, though. The CRT still looks way better for square TV/video, but far worse for everything else.
My oldish 24" LCD, though, is closer to 80W and is toasty to sit in front of. Most plain LCDs are not as frugal as they could be, typically 60W and up for 24" and bigger screens. Some of those 27" and bigger Hanspree and HP models are well over 100W.
My next 24" will also be a 1920x1200 LCD, from Lenovo, that uses a half-power CCFL tube at 35W. I wonder why more CCFL designs don't do that, since it compares well with LED on power.
Still, LED backlighting really seems to be beating CCFL in LCDs now on power, and the price difference is becoming minimal and could easily cover itself in short order. LED also makes for much slimmer, lighter panels (and wobblier too).
Also, on ATX power supplies: most PCs use <80% efficient PSUs, and the power factor is usually bad too. Compare the VA value against the wattage value; VA is often 20% higher. Since most budget PCs can use a motherboard with built-in graphics, they should be able to run on 60W or so, especially if a 2.5" HD is used. That means the 300W PSU could easily be replaced by a micro PSU that fits entirely in the 24-pin ATX connector. These fanless Mini-box supplies can deliver from 60W to 150W, saving space, noise and power, and are >95% efficient. They do use an external 12V DC adapter, though, and using one will limit expansion options.
For most of my PCs, I also switched to refurbished 2.5" HDs, leaving the 3.5" HDs for mass storage used only when needed. Saves power, space and noise. Microcenter often has these for $15-$20 at 40-60GB sizes.
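The CRT-to-LED payback claim above checks out; here's a quick sketch using the poster's own numbers (140W CRT, 14W LED, $100 monitor, 16 hours a day, 15¢/kWh):

```python
# Verify the monitor payback estimate from the figures quoted above.
crt_w, led_w = 140, 14
hours_per_day = 16
rate = 0.15            # $/kWh
monitor_price = 100.0  # $

kwh_saved_per_day = (crt_w - led_w) * hours_per_day / 1000
payback_days = monitor_price / (kwh_saved_per_day * rate)
print(f"payback in about {payback_days:.0f} days")
```

That comes out to roughly 330 days, i.e. "after a year or so" as stated.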
"LED-backlit LCDs were probably less expensive to make from the very start of their mass-production"
That is probably overstating the situation, but today the crossover point has more or less arrived for the sweet spot around 20" to 23". CCFL tubes finally seem to be going away in the commodity market.
For the higher end, 24" to 30" panels, the price difference is much more marked; perhaps the engineering is still tougher for large-area edge lighting, and you also get into TN vs IPS issues, color gamut and professional-use concerns.
Maybe, maybe not. The screens are generally almost identical except for the type of edge-placed backlights*, which should be at least very comparable in cost (especially since CCFL needs a relatively fancy power source); LED probably has lower future costs of disposal (which nowadays often need to be factored in) and greater reliability (fewer warranty returns).
Additionally, you don't need to search very far to find plenty of examples of human irrationality, especially when it comes to purchasing dynamics. I see no particular reason why it wouldn't be the case in this field. Or at least consider what was in the interest of the non-consumers in the equation: clearing their warehouses and supply chains.
* And even for notably different screens, those with an LED array behind the panel, something happened around half a year ago (in my part of the woods) which made them well entrenched in the segment of... perhaps not the cheapest possible choices, but very much "a fairly typical, average-priced TV" (with large, ~40" LCD panels; yes, a bit of a different league, always topping out at 1080p for one; but I see no clear reason for backlight engineering to be much different in smaller, higher-DPI panels).
I just very recently bought myself a new laptop and noticed that it actually has two graphics cards: a low-power, low-performance one and a high-power, high-performance one. I can switch between them manually, or the system can do it based on my power profiles, though it doesn't switch the high-power one on when I, e.g., start a game. I suppose it's a shortcoming of Windows and the drivers.
Anyway, it made me wonder when PC manufacturers will actually start doing the same thing on desktop PCs. It would make sense: even if it added $15 to the cost up front, it would pay for itself pretty quickly for most people.
EDIT: As a side note, I've gotten the impression that switching graphics cards on the fly is STILL not possible under Linux. I would think that such an ability would be very high on the to-do list, after all it does save quite a bit of power, but I can't recall having seen anyone even thinking of working on that. Has this changed yet, does anyone know?
Have switchable graphics on my PC too, and I can attest that support in the Linux world is still in its early stages. Last time I tried, a few months ago, the computer locked up.
All I want myself is something that switches off the NVidia GPU and never turns it on again, since I don't use GPU-intensive software on Linux. Wonder if that's possible already...
My PC consumes 20W when used normally.
AMD Sempron 3400+, RAM: 1.5G, 2 hard drives, GeForce 7300 SE.
It runs GNOME with Metacity (no OpenGL effects). The screensaver is a black screen (no OpenGL).
It tops at 80W when playing 3D games.
The screen consumes 25W: a 19" iiyama.
It beats your MacBook. My guess is that Macs are bad at energy saving because they use hardware-accelerated effects to display the desktop. Is that right?
One thing I have noticed is that the computer consumes energy when doing computing. These days the graphics card is more powerful than the processor; it's the unit that consumes the most power and dwarfs all the rest. Getting a green hard drive is useless when you waste 80% of your energy in the graphics card. The hard drive's consumption is not significant.
My computer consumes 1 to 2W when switched off!
I bought a plug switch and turn it off so it now consumes 0W when switched off.
Some people think there is a power spike when the computer is turned on. There is none. If it takes 30 sec to shut down and 2 minutes to boot and you leave for 5 minutes, you will save energy by switching it off. It's nothing like a diesel motor; not to mention that what people think about modern diesel motors is also false. Switching your motor off is also worth it.
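The switch-off argument above can be put into numbers. This is a back-of-envelope sketch; the wattages are assumptions (and, as noted, there is no meaningful power-on spike to model):

```python
# Energy over a 5-minute absence: leave the PC idling vs. shut it down.
idle_w = 60       # assumed draw if left on and idle
shutdown_w = 80   # assumed draw during the 30 s shutdown
boot_w = 90       # assumed draw during the 120 s boot
off_w = 2         # assumed soft-off standby draw
away_s = 300      # 5 minutes away

left_on_j = idle_w * away_s
cycled_j = shutdown_w * 30 + off_w * (away_s - 30 - 120) + boot_w * 120
print(left_on_j, cycled_j)  # joules; cycling wins even for this short break
```

With these numbers, leaving the machine on costs 18,000 J against 13,500 J for the shutdown/boot cycle, so even a 5-minute break favors switching off.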
If you are using a laptop as a desktop replacement, as mentioned in the article, and it is plugged into the wall, please, pretty please, remove the battery! You don't use it, yet you are charging it. There is an insane loss of energy in storing it, and then your battery will lose that energy over time as heat when the computer is turned off. Moreover, your battery will live longer if you don't waste your limited charge cycles.
Most desktop PSUs have hard switches at the back. That should take care of the stand-by power consumption.
I would hope for average battery-charging software/logic to be better: to not recharge obtrusively when not really needed... (though this might involve a setting or two; still, probably less "taxing" than something as scary as manipulating a battery, for most users)
Overall, having a built-in UPS is too nice to abandon (and again, in a "desktop replacement" normally working off the mains, the battery doesn't need to be obtrusively topped up at every opportunity).
PS. Funny thing about diesels: they actually scale their fuel usage down exceptionally well when only a fraction of their total power output is required, or when idle. Yeah, people could stop being almost prejudiced about them.
Well, it does not matter how clever the software is. Charging the battery costs energy, always. If you use it, when travelling for instance, you need it. But if your laptop is always plugged in, just remove it. The ratio of energy stored vs energy consumed is insanely low. And you store it for nothing; it will dissipate as heat, and not so slowly.
But that's the thing: "smart" software can be set, in such a scenario, to essentially not use the battery.
Yes, when kept in the warmer environment of a laptop it will dissipate slightly faster, and overall be recharged more, but nothing too dramatic. Especially versus the perks of a built-in UPS (and I write this especially from the point of view of the "average user", who seems to have a hard time protecting themselves (saving data, for example) against sudden power loss, and whose data we are supposed to "get back" after the fact...)
The problem is that it is a trade-off between saving electricity and having equipment that lasts longer. Laptops tend to have a relatively short lifetime, and when something (screen, keyboard...) stops functioning, it is not easy to replace that part, especially after the end of the warranty. So people tend to buy a whole new laptop. Building and disposing of electronic devices has great environmental costs (consider how much pure water it takes to make a processor).
A CRT screen is by far less convenient than a flat panel, but the latter is not nice at all when it comes to dismantling.
It's not much of a trade-off in practice; at worst, you just use it until it fails. I think most units don't fail throughout their life, and people still get new ones anyway (too often because the "old computer started to be broken & too slow", aka just filled with crapware).
So? "I have" / "I know someone" is not only poor input vs. overall trends (people don't seem to hold on to working machines often enough); you also essentially reaffirm my own assertion that, yes, it's not something you can plan around much (please, please drop the confirmation bias of "they don't build them like they used to"; we just remember positive specimens better, the few still working doubly so), so at worst you just use it until it fails.
No, he's right. 1 out of 15 laptops fails in the first year, 1 out of 5 fails before its 2nd birthday, and 1 out of 3 fails within 3 years. And if you have an HP laptop, there's a 1-in-4 chance it will fail within 2 years. It's not a myth: laptops have a very poor life expectancy. I can confirm this from my own experience as well.
Again, it doesn't change how they are too often replaced before hitting their limits (likewise for desktops of course; NVM how comparative data between the two classes are still missing, or how laptops are also a bit more likely to endure physical hardships, that's not the point)
It's not much of a trade-off when, too often, people aren't very willing to hold on even to working equipment.
When a laptop fails, it means it was being used. It does not fail in the trashcan or in the recycling factory.
Which, again, doesn't exclude other (and IMHO way too widespread of course) scenarios, practices.
Here is one citation: www.squaretrade.com/htm/pdf/SquareTrade_laptop_reliability_1109.pdf
There have been several studies of laptop failure rates. It is confirmed by my experience, but I based the numbers I cited on real studies covering hundreds of thousands of laptops. Google "laptop failure rate" for other citations if you need them. They all end up with pretty similar numbers.
Thanks for the citation. It makes everything easier to examine.
For instance: the three-year failure rate is not 1/3, it's 1/5. And 1 in 10 people accidentally break their laptop.
Also, there is a bit of potential selection bias in the study. It looks at laptops covered by a third-party warranty. So, more accurately, it shows that 1/5 of laptops owned by people who think their laptop may fail do fail within three years. Those who don't pay for the warranty may take better care of their laptops (i.e. not storing them in a hot car, not leaving them on 24/7, not using them to level out the kitchen table, not running the battery to death for the fun of it, using an additional cooling pad, etc.).
Again, I really don't have a dog in this fight; were we saying that laptops break more often, or something? I'm just complaining about people throwing around anecdotes and not understanding statistics. It's of minor importance in this case, but we suck just as much when it is important... like economics, politics, and health issues.
For LCD screens, you can make a screensaver that shows white instead of black. LCDs work in such a way that power is needed to make black; white is the natural state of the screen, so it only uses power for the backlight. And if you turn the backlight off, then it saves power completely. Try a white screen with a wattmeter.
With plasma and CRT screens, black is their natural state, so showing black should save power. But those screens naturally eat more of it anyway.
What about LED-backlit LCD ? Aren't they smart enough to selectively turn off some of the white LEDs when displaying black ?
Ah, can't wait until we have OLED and transflective screens everywhere
I believe that most LED backlights don't actually have a lot of LEDs behind the screen (known as "full array") that can be turned on and off, but rather have LED emitters at the edges of the screen that uniformly illuminate the area behind the LCD panel.
There is a technology known as "local dimming" that works like you describe, but it's only in high end TVs at this point. I don't know if anyone is making computer monitors with local dimming.
I think you are quite wrong there.
It is a fact that the LCD pixel structure is only about 5% efficient at letting light through when fully on; that means 95% of the light source, whether CCFL tube or LED, is blocked and therefore turned into black-body heat. With a black screen it would just be 99.9% instead. Saving power means controlling the light source's power.
There was/is a company called Unipixel that had an LCD alternative claiming a 60%-efficient light cell based on a neat MEMS opto-capacitive structure. They even licensed it to Samsung in 2009; not a squeak since, though. Driven by side-lit LEDs, its power use would be far below today's LED LCD panels, which are mostly already good enough.
A voltmeter would tell you that your computer is fed with 230V AC all the time, unless you have some really bad power grid around. I suggest using a wattmeter instead, although it's open to discussion.
EDIT : Ooops, someone said it before me.
EDIT 2 : Excellent research, by the way. You've found quite a lot of references to prove your point!
By the looks of the power socket, I guess that would be 110V AC ...and I didn't even need a voltmeter to assess it, remotely!
Too bad; it makes the article a bit of a mixed bag. On one hand, as you say, nicely sourced. But the watts/volts mix-up (especially since the latter are essentially constant in such scenarios) casts a shadow... Really, with such a basic mistake repeated throughout, it unnecessarily makes the whole article suspect from the start: the writing process, the author behind it, how much of it can be depended upon, etc.
(don't get me wrong, obviously quite a lot, it has useful practical guidelines; but BTW, I can't help but notice how immense portion of it is what I would hope to be common sense knowledge :/ - even bordering on "perpetuum mobile doesn't exist" - which, sadly, probably isn't common sense knowledge, hence the usefulness... though I'm not sure if OSNews is the best channel)
PS. And getting another wattmeter is not strictly required when... virtually all dwellings already have a very precise central one. Considering how rarely we would fiddle with this, disconnecting every other electrical load for the duration of the PC experiments is fairly trivial; you can temporarily move the PC closer to the meter, too. Absolutely basic arithmetic will take us most of the way to a per-socket reading, especially since the calculations involve averaging and "assuming" anyway (of course, that assumes somebody can notice and connect the dots between the energy usage of each device and the overall bill; but that's also required with a portable wattmeter).
Heck, promoting portable wattmeters when there's already a very good one (and not really a very inconvenient one) in every house is itself a bit of a waste of resources and energy.
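The central-meter approach described above amounts to two readings and a division. A sketch, with hypothetical meter readings:

```python
# Estimate one device's average draw from the household kWh meter:
# switch everything else off, note the reading, wait, read again.

def device_watts(kwh_start, kwh_end, hours):
    """Average power in watts from two meter readings taken `hours` apart."""
    return (kwh_end - kwh_start) * 1000 / hours

# Hypothetical readings: 0.12 kWh consumed over 2 hours, i.e. 60 W average.
print(device_watts(12345.60, 12345.72, 2))
```

The longer the interval, the more the figure averages out fluctuating loads, which is arguably what matters for the bill anyway.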
Ah, you're right; I'd totally forgotten that not every part of the world uses my fellow voltage standards.
Though, about wattmeters, it's my turn to accuse you of being region-specific ^^ In France at least (I should check in Sweden while I'm here), old-fashioned mechanical watt-hour counters with a spinning disc (1 turn = X kWh) are still common, and using them for wattage calculation is quite cumbersome and imprecise.
Such meters are also still standard at my place; I don't really see them as "quite cumbersome and imprecise" (vs. a cheap portable meter), especially since the calculations involve averaging and "assuming" anyway (overall, total usage over time is what we need to keep in mind if energy conservation is to become routine).
And might I add that, in your unsound accusations, you actually gave me an opportunity to berate you so much more strongly!
I've never even been anywhere close to a 110V place, I just remember they exist.
Are there any 110-volt zealots or 230-volt fanboys? I want to see their flamewars.
Anyway, anybody dumb enough to live in a 110-volt area deserves to have their equipment break earlier and their computers run slower.
230 volts is the true path to enlightenment.
On topic: I always thought it stupid that I have to cool down my PC with a cooler while at the same time heating my room. Maybe there is a way to connect a CPU to a radiator.
I dream of a building where there is a huge central heat pump. It cools down all fridges, computers, and freezers and on the other end distributes heat to whatever needs it. In every room, you find faucets for hot and cold fluid that you can use to heat and cool down things as needed.
(Flamewar contribution: 110V AC and 230V AC are both lame. DC makes the chances of surviving an electric shock much higher* and is the One True Way.
* An irrelevant side effect of DC being slightly reduced power grid, generator, and transformer efficiency.)
Wow. OSNews readers can come up with a flamewar about anything! :-)
Back in the very earliest days of electricity production, I'm sure there must have been heated debates over AC vs DC and 110-220-240 Volts and so on. The players would have included Edison, Tesla and other notable figures.
"Ontopic: I always thought it stupid that I had to cool down my pc with a cooler while at the same time heating my room. Maybe there is a way to connect a cpu with a radiator."
I had been thinking the same thing, at least when I was using a D805 + 2 large CRTs that typically drew 400W continuous: how to get the heat away from me and toward a more useful purpose. Anyway, it's all moot now.
Power saving is all right but I am really not enjoying this cold summer ... hence I leave my computer ON to contribute to global warming. It's a small contribution but it's all I can afford.
So where in the world is summer too cold?
In the US we have probably had enough of the heat dome effect, although in Mass we are not so affected.
You want to start a flame war on global warming, seriously?!
I often joke about it... getting warmer does not motivate me to save power, quite the opposite. I do not believe global warming is mainly caused by man, and whatever it is mainly caused by (the sun, for example), I would seriously enjoy more heat from it. Writing from Central Europe.
It also depends where the power comes from ... cutting on other harmful gases is OK, but CO2 is not the one we should avoid!
CO2 is healthy for vegetation; all plants rely on it, so what the hell? I am all for saving the planet, not polluting the environment, not using rivers as waste dumps, saving endangered species, etc... but CO2 is just silly. Even if global warming were harmful (I don't know), CO2 is a very minor greenhouse gas, and human-produced CO2 is even less significant.
Thanks for bringing up other effects of a CO2 increase. I do not believe it contributes to the global temperature in a serious manner (quite the opposite: a temperature increase increases natural CO2 production), but it might affect life in other ways, possibly harmful, as you have pointed out.
http://dangerousintersection.org/wp-content/uploads/2006/09/CO2-Tem... This is often pointed to as proof of a CO2/temperature relationship. Careful examination shows that it's the CO2 that lags behind temperature.
You are spewing out the usual climate-denial junk we expect to see from the wattsupwiththat crowd, as well as from the various conservative think tanks funded by ExxonMobil, the Koch brothers, or the tobacco industry at the Heartland Institute. Are you a stooge for those interests, or can you study and think for yourself?
If what you say is true, write a paper on all your beliefs and get it published in a peer-reviewed journal. You will find your points have all been debunked.
It is one thing to be a conservative/republican, it is quite another to buy into the idea that conservatives automatically have to work for free for Koch industries to promote coal and fossil fuel burning.
It is a simple fact that humans have increased CO2 levels from around 280 to 390 ppm since the industrial revolution started; nature cannot do that so fast. Since China and India have joined the energy party, we are likely headed to 700 ppm in the next century with no stopping. The last time CO2 levels were this high due to nature was eons ago, when life and flora were very different.
And yes, CO2 is good for vegetation, up to a point. It is also a trace gas, and physics says CO2 is a warming gas, although water vapor and methane are much stronger. The oxygen, nitrogen and argon that make up the bulk of the atmosphere are not warming gases, so indeed the remaining gases can and do make a huge difference.
The CO2 normally cycles through the system over very long periods of time; about the same amount goes into the atmosphere as is taken out by natural processes. The human load is just a small push of a few percent, but it is always adding, so the CO2 level drifts upwards. That is simple integration math.
I could go on but you could learn more from your own research.
Personally, I think the only way to a future of guilt-free, plentiful energy for all nations is nuclear power from thorium LFTRs, despite all the green loons' anger at nuclear. Nuclear energy is millions of times more energy-dense than solar in any form, and there is enough Th for the entire planet to live well for thousands of years until fusion works. It can even help rid the world of the nuclear waste from weapons and regular nuclear plants.
google kirk sorenson thorium energy
You obviously don't know any chemistry, because the threat of acidification is exactly zero. It is literally impossible:
- pH is logarithmic. In other words, a pH of 6 is 10x as acidic as a pH of 7.
- Carbonic acid (dissolved CO2) is a weak acid. A 30% increase in dissolved CO2 has negligible impact on the pH of seawater.
- pH is highly temperature-sensitive. The pH of seawater changes far more due to temperature changes than due to atmospheric CO2 levels.
- Seawater cannot become acidified (or significantly less basic) by CO2, because it is heavily buffered by dissolved salts.
- Ocean acidification has never occurred, even with CO2 concentrations 20x as high as at present.
On the desktop, linux consumes 100 times less energy than windows and 5 times less than MacOS.
Gentoo consumes 1000 times less energy than Ubuntu.
Sarcasm failure detected.
Well, the math is really simple.
A linux desktop consumes roughly the same amount of energy as a windows desktop. There are 100 times more windows desktops than linux desktops. Therefore windows consumes 100 times more power than linux on the desktop.
The same math applies when comparing gentoo to ubuntu.
I thought you would get it.
I guess 650 kWh for 2 months would be more accurate for me... and that includes heating and cooking.
For anyone getting new electrical wiring in the house, I can give one piece of advice: make sure you can switch off most power groups/outlets from the main board. In my case it will take less than 10 years to recoup the initial cost, even when I only take the kitchen into account.
I do plan to build a new desktop, but that one has to be power-efficient and should last at least 6-7 years. Its main purpose will be more demanding programs like editing photos and running virtual machines. When I want to surf, chat or mail, I will still use a cheap laptop.
Like others have said, what you want is a wattmeter, not a voltmeter; no question about that.
The device you bought, judging by the name, is a wattmeter, so you got the device right.
Another thing is that any device will use power when turned off _unless_ there is a mechanical switch to completely cut it off; this also applies to hibernation mode.
When it comes to real devices, the DOE definition doesn't mean much; how much each device uses when soft-off cannot be predicted. It can be less than 1 watt or reach several watts; you need to measure to find out how much each device uses.
Wow. I nearly cried. It's difficult enough to explain the concept of energy and power to people without confusing them with unit abuse.
Energy is measured in Joules. The rate you use energy is Power and is measured in Joules per second, or Watts. People don't like multiplying the wattage of their equipment by the number of seconds it's on (to give Joules) as the numbers are big, so they multiply the wattage by the number of hours, to give kilowatt-hours. That's not kilowatts per hour, that's kilowatt hours (like pound-feet, or Newton-metres). One kilowatt-hour is 3.6 Megajoules.
Your energy company is only concerned with how much energy you have used (in joules, or kilowatt-hours), not the rate at which you use it (in watts).
Don't feel too bad. Even E.ON (the large UK utility company) makes mistakes. See the Scroby Sands wind farm display in Great Yarmouth, England, talking about the number of homes per year that 60 megawatts can supply.
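The unit relationships in the comment above, restated as code (energy = power × time):

```python
# 1 kW running for 1 hour is 1 kWh, which is 3.6 MJ.
watts = 1000
hours = 1
joules = watts * hours * 3600
print(joules, joules / 3.6e6)  # 3600000 J, 1.0 kWh

# A 60 W device left on all day:
print(60 * 24 / 1000, "kWh")   # 1.44 kWh
```

Note that the meter bills kilowatt-hours (energy), never "kilowatts per hour".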
That meter is a multimeter. It shows more than volts.
LCDs can use as much as a CRT!!! Some LCDs use almost 200 watts. Look for LED LCDs that use no more than 30 watts when active.
I monitor my energy use. Most of the older wall warts use 3 watts all the time. If you have ten in your home, that is 30 watts all the time: like putting a 700-watt bulb on for an hour each day. Who would do that?
Put every item in your system on a power strip. When you are not at the computer, turn it all off with the power strip switch. Power strips also help protect against voltage spikes and, in some cases, lightning.
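The wall-wart arithmetic above holds up; a quick check:

```python
# Ten wall warts at 3 W each, drawing power 24 hours a day.
warts, watts_each = 10, 3
wh_per_day = warts * watts_each * 24
print(wh_per_day, "Wh/day")               # 720 Wh: about a 700 W bulb for an hour
print(wh_per_day * 365 / 1000, "kWh/yr")  # 262.8 kWh per year
```

At a typical residential rate, that idle standby load alone runs to a few tens of dollars a year.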
"LCD's can use as much as a CRT!!! Some LCD's use almost 200watts"
While shopping for a new large panel, I couldn't help notice that too. The Hanspree 28" panel that gives us the 16:10 1920x1200 for $250 was pretty appealing except that it sucks like 110W or so. Two of those would kill desk space as well. The Apple 30" IPS panel also use 180W IIRC but price is beyond my budget and the resolution is too high for older eyes. I think the IPS panels also need more source light since the IPS switch is less efficient. LG make some 22"-23" models with the option of IPS vs TN for very small difference in $ and Watts.
Having said that, you have to use a watt meter to see whether the specs are true or just overstated. When I bought my Panasonic plasma TV, the specs suggested upwards of 300 W in use. I checked it in the store with a watt meter (the first time anyone there had ever done that) and it read half that; at home it was around 100 W.
Also, something most people don't consider is the VA rating rather than the watt figure. Most appliances draw more VA than watts; we pay for watts in kWh charges, but the utility must supply the VA, roughly 20% more. That means they have to balance the phases by over-producing power.
LED night lights and a lot of DC-powered devices use really crappy AC-DC circuitry that draws far more power than the DC rating suggests. Set your meter to VA to see the difference. The industry really needs to push harder for a power factor of 1, so that VA equals watts. That requires better-quality switchers; some PC PSUs do have power factor correction in them, but most don't.
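The VA-vs-watts point boils down to power factor, which you can compute from the two readouts most plug-in meters give you; the meter readings below are hypothetical numbers, not measured data:

```python
# Power factor sketch: real power (watts) versus apparent power (VA).
# The example readings below are hypothetical, not actual measurements.

def power_factor(real_watts: float, apparent_va: float) -> float:
    """PF = W / VA. A PF of 1.0 means the load looks purely resistive,
    so the utility supplies no more VA than it bills for in watts."""
    return real_watts / apparent_va

# e.g. a cheap wall wart that a meter shows drawing 3 W but 5 VA
pf = power_factor(3.0, 5.0)
print(pf)  # 0.6 - the utility carries 2 VA it never bills as energy
```

A PSU with power factor correction pushes that ratio close to 1.0, which is exactly the "VA equals watts" goal mentioned above.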
When I search "voltmeter" and "wattmeter" on Amazon I get about the same list of devices in the results. Most of them are multimeters that can provide several different kinds of readout. You saved yourself from confusing everyone by including the picture. BTW, the Kill-A-Watt you show provides several kinds of readout (as is typical) and will calculate the kilowatt-hours for you without any need to crank through your formula manually.
How to Save Energy When Using Your Computer
1. Lie down and type with one hand.
2. Never, ever read the linked articles.
3. Use a Tandy Model 100 laptop, dialed in with a 300 baud modem to your ISP, surfing the web with Lynx.
4. Do not turn your computer on, just stare at the pretty reflection in the monitor.
5. Use binoculars to watch your neighbor surf the web (or so you say).
6. Go wireless - just imagine what the web pages would look like in your mind.
7. Pick up a land-line phone and imitate a modem. If you're good, you can connect.
8. e = mc^2 - not sure how this helps, but thought I would throw it out there.
9. Stop frowning when Thom posts another Software Patent article. EVERYONE knows you use fewer muscles when you smile :}
10. Don't think too hard about what you post!
Somewhat off topic: I have recently gotten really sick of seeing the usual black PC cases all around my house, when in reality they were all mostly empty. They were all built from cheap upgrade parts, so the cases just got recycled as the insides got ever sparser. Mostly Sempron systems needing 60 W or so. They all had noisy fixed-speed PSU fans and stock CPU coolers, and were dust bunny collectors.
At about 16" x 18" x 8" the volume is unsightly in regular rooms so I built my own wooden cases using spare floor laminate and some skilled use of the table saw. These are about 10" by 9.5" by 6" and are exactly 1/4 of the volume of the metal beasts. They are just big enough to throw in most cheap mobos with a stripped down PSU and a 2.5" HD. The stock coolers got replaced by $10 heap pipe tall stack coolers which allows for very slow quiet fans. A second fan cools the PSU, both fans are on a regulator set near minimum. They still need more detail work and I doubt the FCC would be pleased.
This still isn't satisfactory, because even these boxes are mostly empty, although stuffed with excess PSU wiring. What I have in mind for the next phase is to mount the mobo directly on the back of the LCD panel with VESA mount bolts and use a miniBox-type PSU switcher set into the 24-pin connector. The tall heat pipe cooler that now sits on the CPU would stick out, making the LCD/PC look more like an old TV with a backwards pyramid.
What I wish for now is a way to retain the tall heat pipe stack technology but flatten it over the mobo surface, so the whole thing can be packaged in a slim, book-like package about 10" square and maybe 2" thick. I would like to make or buy a heat pipe cooler integrated with a flat, ribbed heat sink plate. There is still the issue of video cables: can you even get a 1-foot VGA/DVI cable? All of this is an effort to make the PCs disappear behind the display and also get rid of the wiring tangle.
Of course I could just buy an all-in-one PC, a laptop, or an iPad, but that wouldn't be any fun, and those have other serious issues. All of them have displays that are way too small, and they use laptop technology.
It's just a hobby, though, looking for more with less.
If you have a website, link to this article.
Every computer user should read it.
If everyone followed its recommendations we would save a lot of energy (at little cost).
I watched a show about CO2.
They said the average over the last few hundred thousand years was a number like 225, and the highest recorded (if you believe their assumptions) was like 380.
Today's readings are like 590 and climbing. I can't imagine how that can be good.
buy a Mac
Your articles--while based on interesting concepts (reusing old machines, extracting every ounce of usefulness out of computer hardware, electronics disposal, energy saving, etc.)--always seem to have "WTF?!" moments that make me question what the hell you're talking about.
I could probably question other things in this article (I stopped reading them in their entirety a while ago), but I'll just say this: I would use a pen and notebook paper before I'd ever use, or recommend anyone else use, an inkjet printer again. They are garbage; laser is the way to go.
Toner doesn't need to be "cleaned" after a week of no printing, unlike inkjet, where almost all of the ink is lost to cleaning cycles and you need a new set of cartridges every month. Laser just works with minimal problems, and if you rarely print, you're hardly using loads of energy anyway. And if you're afraid of using too much energy, there's always the "On/Off" switch.
Simply put, inkjet is a horrible recommendation, even for those people who for whatever reason are paranoid about their energy use; what they "save" in electricity will be eclipsed several times over by the cost of regularly buying replacement cartridges. It's 2011, laser printer prices have come way down--how could anyone even consider recommending inkjet?
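For what it's worth, the electricity-versus-cartridges trade-off can be put into a back-of-envelope sketch. Every figure below is an assumption invented for illustration; plug in your own printer's numbers:

```python
# Rough running-cost comparison for the inkjet-vs-laser argument above.
# ALL prices and consumption figures here are assumptions, not measurements.

PRICE_PER_KWH = 0.15               # assumed electricity tariff

laser_kwh_per_year = 20.0          # assumed: laser draws more per page
toner_sets_per_year = 0.5          # assumed: one toner set every two years
toner_price = 60.0                 # assumed

inkjet_kwh_per_year = 5.0          # assumed: inkjet uses little electricity
inkjet_cartridges_per_year = 6     # assumed: ink lost largely to head cleaning
cartridge_price = 25.0             # assumed

laser_cost = laser_kwh_per_year * PRICE_PER_KWH + toner_sets_per_year * toner_price
inkjet_cost = inkjet_kwh_per_year * PRICE_PER_KWH + inkjet_cartridges_per_year * cartridge_price

print(round(laser_cost, 2))   # 33.0
print(round(inkjet_cost, 2))  # 150.75
```

Under these assumed numbers, the inkjet's cartridge spend dwarfs the few dollars of electricity it saves, which is the commenter's point; with different assumptions the gap narrows, but the electricity term stays small either way.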