Linked by Howard Fosdick on Tue 2nd Aug 2011 22:18 UTC
Editorial Your computer is an important energy consumer in your home. Can you save energy when using it? This article offers a few tips.
Comment by smashIt
by smashIt on Tue 2nd Aug 2011 23:33 UTC
smashIt
Member since:
2005-07-06

Great article, but there are more severe problems you can find with a wattmeter (not a voltmeter, as stated in the article),

and sometimes there are simple means of solving them.

For instance:
adding extra insulation to a boiler can reduce power consumption significantly.

Reply Score: 3

RE: Comment by smashIt
by kaiwai on Wed 3rd Aug 2011 00:28 UTC in reply to "Comment by smashIt"
kaiwai Member since:
2005-07-06

Great article, but there are more severe problems you can find with a wattmeter (not a voltmeter, as stated in the article),

and sometimes there are simple means of solving them.

For instance:
adding extra insulation to a boiler can reduce power consumption significantly.


True, and use night-store rates so that you're not heating water during the day when peak power prices are high; that, combined with good insulation around the boiler, has helped me save a few dollars each month.

Reply Score: 2

RE[2]: Comment by smashIt
by Neolander on Wed 3rd Aug 2011 08:49 UTC in reply to "RE: Comment by smashIt"
Neolander Member since:
2010-03-08

So these rates actually work? ;) I see why electricity companies make electricity cheaper at night, but I've always wondered whether it is an effective way to make people turn power-hungry devices on at night.

Anyway, if I may add my own computer-unrelated advice... During the winter, make sure that heaters only heat when you are at home, and heat less during the night. One actually sleeps better if the house is at 15-17°C, although you get cold mornings as a trade-off.

Edited 2011-08-03 08:49 UTC

Reply Score: 1

RE[3]: Comment by smashIt
by kaiwai on Wed 3rd Aug 2011 14:27 UTC in reply to "RE[2]: Comment by smashIt"
kaiwai Member since:
2005-07-06

So these rates actually work? ;) I see why electricity companies make electricity cheaper at night, but I've always wondered whether it is an effective way to make people turn power-hungry devices on at night.

Anyway, if I may add my own computer-unrelated advice... During the winter, make sure that heaters only heat when you are at home, and heat less during the night. One actually sleeps better if the house is at 15-17°C, although you get cold mornings as a trade-off.


Because peak power usage is during the day, in much the same way as with public transportation: during peak time all the generation capacity is in use, but at night the generators stand idle, which is money wasted, so to encourage power usage at night the price is cheaper. As for night-store rates, most of the power companies in NZ give you two options: a flat rate that is the same price regardless of the time, or night store.

Interestingly enough, quite a number of companies operate their power-hungry equipment at night to save money; IIRC Comalco in NZ (an aluminium smelter) does a lot of its work at the ungodly hours of the night.
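The flat-rate vs. night-store arithmetic is easy to sketch. All tariff and consumption figures below are invented for illustration (the comment doesn't give actual NZ rates):

```python
# Rough comparison of a flat tariff vs. a day/night tariff for a
# water heater whose load can be shifted to off-peak hours.
# All prices and consumption figures are illustrative assumptions.

def monthly_cost_flat(kwh_per_day: float, rate: float, days: int = 30) -> float:
    """Cost when every kWh is billed at one flat rate."""
    return kwh_per_day * rate * days

def monthly_cost_night(kwh_per_day: float, day_rate: float,
                       night_rate: float, night_fraction: float,
                       days: int = 30) -> float:
    """Cost when a fraction of the load runs on the cheaper night rate."""
    day_kwh = kwh_per_day * (1 - night_fraction)
    night_kwh = kwh_per_day * night_fraction
    return (day_kwh * day_rate + night_kwh * night_rate) * days

# Assumed figures: 8 kWh/day of water heating, 15c/kWh flat tariff,
# 17c day / 9c night tariff, with 90% of heating shifted to night.
flat = monthly_cost_flat(8, 0.15)
night = monthly_cost_night(8, 0.17, 0.09, 0.90)
print(f"flat: ${flat:.2f}/month, night-store: ${night:.2f}/month")
```

With these made-up rates the night-store plan comes out around ten dollars a month cheaper, consistent in scale with the "few dollars each month" reported above.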

Edited 2011-08-03 14:28 UTC

Reply Score: 2

RE[4]: Comment by smashIt
by Neolander on Wed 3rd Aug 2011 14:40 UTC in reply to "RE[3]: Comment by smashIt"
Neolander Member since:
2010-03-08

Yup, as said previously, I know that. I was only wondering whether it is effective, that is, whether many people actually modify their habits to use electricity during the night. Especially when some companies don't offer two different electricity rates by default, and charge a little extra per month for them.

Besides, isn't the situation reversed in winter, when people turn on home electric heating at night? (Whereas large offices tend to prefer fossil fuels for heating.)

Reply Score: 1

RE[5]: Comment by smashIt
by aaronb on Thu 4th Aug 2011 20:36 UTC in reply to "RE[4]: Comment by smashIt"
aaronb Member since:
2005-07-06

We have a washing machine with a timer function, so it starts washing at around 03:00.

We used to have our water heated at night (3 to 6 kWh). However, we now use a combi-boiler, which overall costs less to run (we switched due to tank and boiler failure).

Reply Score: 2

Comment by fran
by fran on Tue 2nd Aug 2011 23:51 UTC
fran
Member since:
2010-08-06

Howard,
Great article as usual.
Sorry if I repeat anything you already said below.

I sometimes build custom gaming PCs to order and can share a few pointers.

1. Check if your computer's PSU (power supply unit) is 80 Plus certified.
Many PC manufacturers use generic PSUs that don't qualify for 80 Plus certification.
Check, or let someone qualified check for you, and if it's not 80 Plus certified it's a good idea to replace it. 80 Plus certified PSUs are not even that expensive, and one might also make your PC run quieter.
There are three 80 Plus certifications: Bronze, Silver and Gold.

2. LED-backlit screens use less energy than CCFL-backlit LCDs.
Rather use an LED screen.

3. When adding a second 3.5" SATA HDD, go for the "green" ones.
For instance Western Digital Caviar Green or
Seagate Barracuda Green.
These drives spin down when inactive.

4. The motherboard sometimes comes with a control-centre utility offering tweaking options like "eco mode", among others.

5. AMD Cool'n'Quiet, activated via your BIOS and then from the desktop, can save a lot of power. See below.

http://www.amd.com/us/products/technologies/cool-n-quiet/Pages/cool...

6. Hardware aids for power usage.

ZALMAN ZM-PCM1
http://www.zalman.com/ENG/product/Product_Read.asp?Idx=417

ZALMAN ZM-MFC3
http://www.zalman.com/ENG/product/Product_Read.asp?Idx=341

With this product you can conveniently switch off disks in a multi-disk configuration.
LIAN-LI BZ-H06
http://www.lian-li.com.tw/v2/en/product/product06.php?pr_index=487&...

7. Even some RAM comes in eco form nowadays.
Kingston HyperX LoVo (Green)
http://www.ec.kingston.com/ecom/configurator_new/PartsInfo.asp?root...


And lastly:
Our PCs are often overkill for what we use them for.
If you are only going to use spreadsheets and read email, you won't need to buy a high-spec PC.

Some new combined motherboard/CPU units have come out with lower power consumption but still respectable speed, and other bells and whistles.
For instance the AMD E-350 "Brazos", which uses a lot less energy but still has high-def HDMI out, SATA 6G, USB3 and many other features.

SAPPHIRE E350M1 PURE FUSION
http://www.sapphiretech.com/presentation/product/product_index.aspx...
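On point 1, here is a rough sketch of what a PSU upgrade can be worth. The efficiency and load figures are assumptions, not measurements (roughly 72% for a generic unit vs. 85% for an 80 Plus unit near half load):

```python
# Back-of-the-envelope estimate of what an 80 Plus PSU saves.
# Efficiency figures are assumptions: ~72% for a generic unit,
# ~85% for an 80 Plus unit around half load.

def wall_draw(dc_load_watts: float, efficiency: float) -> float:
    """Watts drawn from the wall to deliver a given DC load."""
    return dc_load_watts / efficiency

LOAD_W = 150.0  # assumed average DC load of a midrange desktop

generic = wall_draw(LOAD_W, 0.72)
certified = wall_draw(LOAD_W, 0.85)
print(f"generic PSU: {generic:.0f} W at the wall")
print(f"80 Plus PSU: {certified:.0f} W at the wall")
print(f"difference:  {generic - certified:.0f} W while under load")
```

Under those assumptions the certified unit draws about 30 W less from the wall for the same load, which also means less heat and slower fans.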

Edited 2011-08-03 00:03 UTC

Reply Score: 8

RE: Comment by fran
by gfx1 on Wed 3rd Aug 2011 13:13 UTC in reply to "Comment by fran"
gfx1 Member since:
2006-01-20

Replacing an ordinary PSU with an energy-efficient one dropped the noise level and almost halved the power usage
(now less than 40 watts idle) on one machine. The largest power saving is a Mac mini (usually below 20 watts, 13 watts when idle), but it takes a long time to recoup the Apple tax ;)

Reply Score: 1

RE: Comment by fran
by howardfci on Wed 3rd Aug 2011 22:25 UTC in reply to "Comment by fran"
howardfci Member since:
2011-06-04

Excellent points. Thanks for adding this useful information.

My apologies to everyone for inexplicably renaming the wattmeter/multimeter a voltmeter. Glad to see most folks overlooked this goofy error and focused on the useful points in the article.

-- Howard Fosdick

Reply Score: 1

wireless printer setting
by fran on Wed 3rd Aug 2011 00:24 UTC
fran
Member since:
2010-08-06

In printer manuals I've seen, they recommend switching off the wireless capability when not in use. Apparently this saves a bit of power.

Reply Score: 4

bhtooefr
Member since:
2009-02-19

Ink cartridges don't last nearly as long as toner cartridges (though, admittedly, they contain less plastic). In low-volume printing you'll end up replacing ink far more often (in terms of page count) because it dries up, and you'll probably end up replacing printers far more often too, because replacement ink can cost more than the printer.

So, overall cradle-to-grave energy consumption of a laser may be better.

Reply Score: 7

Johann Chua Member since:
2005-07-22

Laser printers use more power, though. Still, I'm not sure which has the lowest cost of operation once you factor in the cartridges.

Reply Score: 3

renox Member since:
2005-07-06

In a 'low usage' scenario, what matters most is the lifetime of the printer itself: I have had my laser printer for 10 years and I haven't had to replace the cartridge yet.

In the same time, my brother had to buy two inkjet printers because the first one failed after a few years, plus several ink cartridges because they failed too (not enough usage).

Reply Score: 2

Bill Shooter of Bul Member since:
2006-07-14

Plus, my year-old laser actually shuts itself off after 5 minutes of inactivity. It doesn't "sleep", it shuts off. It can't be woken by software; a hardware button must be pressed to get it to print. I usually leave it unplugged just in case. I understand the energy implications, but inkjet is crap.

I don't often print, but when I do print, I want it to look good.

Reply Score: 2

Morgan Member since:
2005-06-29

Plus, my year-old laser actually shuts itself off after 5 minutes of inactivity. It doesn't "sleep", it shuts off. It can't be woken by software; a hardware button must be pressed to get it to print. I usually leave it unplugged just in case. I understand the energy implications, but inkjet is crap.


I had an old laser printer that went into a deep-sleep mode that also required a hardware button to be pressed to wake it up. It was faster than waking up from a power cycle, and it only consumed about half a watt while asleep if I remember correctly.

These days -- especially for us alternative OS users -- I would suggest a Brother laser printer for several reasons. They tend to be energy efficient, and they have the cheapest toner cartridges by far (they don't chip their cartridges so you can get generics and refill kits super cheap). They pretty much all use Postscript and PCL, and they are among the least expensive laser printers out there. No, I don't work for them, but I've had a lot of experience with them at my part time job and I absolutely love them!

Here's a neat trick with Brother printer cartridges: As I said above they don't use a chip to tell the printer when the cartridge is empty, rather they use a gear on the side of the cartridge. Once the gear has rotated 180 degrees from printing all those pages, the printer senses it and tells you to replace it. You can buy a toner refill kit for less than $10, pop off the plastic cap on the side of the cartridge and refill it, then take a Phillips screwdriver and remove the gear cover. Turn the gear back to the original position and replace the cover. You now have, according to the printer, a new cartridge.

You can even do this with the starter cartridge, the only difference is the starter doesn't have all the gears necessary to reset, so you have to buy a $5 gear kit for it. Once you've installed the kit, it becomes a standard cartridge and holds just as much toner as a retail unit.


I don't often print, but when I do print, I want it to look good.


You're the most interesting printer user in the world! ;)

Reply Score: 2

More
by transputer_guy on Wed 3rd Aug 2011 02:21 UTC
transputer_guy
Member since:
2005-07-08

I recently replaced a 140W 20" CRT with a 14W 20" Acer LED ($100), and since that monitor is used 16 hours a day at a 15c/kWh rate, the LED pays for itself after a year or so. The CRT just won't die yet, though. The CRT still looks way better for square TV/video, but far worse for everything else.
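The payback arithmetic can be sketched directly from the numbers above (140W CRT, 14W LED, 16 hours/day, 15c/kWh, $100 purchase price):

```python
# Payback time for the CRT -> LED monitor swap described above:
# 140 W CRT replaced by a 14 W LED panel, 16 h/day, $0.15/kWh, $100 price.

def payback_days(watts_saved: float, hours_per_day: float,
                 rate_per_kwh: float, price: float) -> float:
    """Days until the energy savings cover the purchase price."""
    kwh_saved_per_day = watts_saved * hours_per_day / 1000
    return price / (kwh_saved_per_day * rate_per_kwh)

days = payback_days(140 - 14, 16, 0.15, 100)
print(f"payback in {days:.0f} days (~{days / 365:.1f} years)")
```

This lands at roughly 331 days, matching the "year or so" estimate in the comment.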

My oldish 24" LCD, though, is closer to 80W and is toasty to sit in front of. Most plain LCDs are not as frugal as they could be, typically 60W and up for 24" and bigger screens. Some of those 27"-and-bigger Hannspree and HP models are well over 100W.

My next 24" will also be a 1920x1200 LCD, from Lenovo, that uses a half-power CCFL tube at 35W. I wonder why more CCFL designs don't do that, since it compares well with LED on power.

Still LED back lighting really seems to be smacking LCDs now on power and the price difference is becoming minimal and could easily cover itself in short order. Also LED makes for much slimmer, lighter panels (and wobblier too).


Also, on ATX power supplies: most PCs use <80% efficient PSUs, and the power factor is usually bad too. Compare the VA value against the wattage value; VA is often 20% higher. Since most budget PCs can use a motherboard with built-in graphics, they should be able to run on 60W or so, especially if a 2.5" HD is used. That means the 300W PSU could easily be replaced by a micro PSU that fits entirely in the 24-pin ATX connector. These fanless Minibox supplies can deliver 60W-150W, saving space, noise and power, and are >95% efficient. They do use an external 12V DC adapter, though, and using one will limit expansion options.

For most of my PCs I also switched to refurbished 2.5" HDs, leaving the 3.5" HDs for mass storage used only when needed. It saves power, space and noise. Microcenter often has these for $15-$20 at 40-60GB sizes.

Reply Score: 2

RE: More
by zima on Wed 3rd Aug 2011 10:52 UTC in reply to "More"
zima Member since:
2005-07-06

Still LED back lighting really seems to be smacking LCDs now on power and the price difference is becoming minimal and could easily cover itself in short order. Also LED makes for much slimmer, lighter panels (and wobblier too).

You know that marketers are taking over our world when even somebody aware of "LED back lighting" (of... LCD panels) contrasts it with... LCDs ;( ;)

Generally, we have had a nice scam going around in retail for some time. LED-backlit LCDs were probably less expensive to make from the very start of their mass-production. And, regarding labels, imagine the mess when actual (O)LED monitors really show up en masse... (kinda like the fad of stereoscopy trying to take over the label "3D", while there are IMHO much nicer approaches)

Oh, and only edge-lit panels seem to be much slimmer. The ones with a "matrix" of a dozen-plus LEDs at the back of the LCD panel don't appear that different (though they do make inexpensive panels much nicer when it comes to contrast, blacks, etc.)

Reply Score: 1

RE[2]: More
by transputer_guy on Wed 3rd Aug 2011 18:49 UTC in reply to "RE: More"
transputer_guy Member since:
2005-07-08

"LED-backlit LCDs were probably less expensive to make from the very start of their mass-production"

That is probably overstating the situation, but today the crossover point has more or less arrived for the sweet spot around 20" to 23". It seems CCFL tubes are finally going away in the commodity market.

For the higher end, the 24" to 30" panels, the price difference is much more marked; perhaps the engineering is still tougher for large-area edge lighting, and you also get into TN vs IPS issues, colour gamut and professional-use concerns.

Reply Score: 2

RE[3]: More
by zima on Sun 7th Aug 2011 12:53 UTC in reply to "RE[2]: More"
zima Member since:
2005-07-06

Maybe, maybe not. The screens are generally almost identical except for the type of edge-placed back-lights*, which should be at least very comparable in cost (especially since CCFL needs a relatively fancy power source); LED probably also has lower future disposal costs (which nowadays often need to be factored in) and greater reliability (fewer warranty returns).

Additionally, you don't need to search very far to find plenty of examples of human irrationality, especially when it comes to purchasing dynamics, and I see no particular reason why it wouldn't be the case in this field. Or at least consider what was in the interest of the non-consumers in the equation: clearing their warehouses and supply chains.

* And even for notably different screens, those with an LED array behind the panel, something happened around half a year ago (in my part of the woods) which made them well entrenched in the segment of... perhaps not the cheapest possible choices, but very much "a fairly typical, average-priced TV" (with large, ~40" LCD panels; yes, a bit of a different league, always topping out at 1080p for one; but I see no clear reason for back-light engineering to be much different in smaller, higher-DPI panels)

Edited 2011-08-07 12:56 UTC

Reply Score: 1

Dual-graphics cards
by WereCatf on Wed 3rd Aug 2011 05:24 UTC
WereCatf
Member since:
2006-02-15

I just very recently bought myself a new laptop and noticed that it actually has two graphics cards: a low-power, low-performance one and a high-power, high-performance one. I can switch between them manually, or the system can do it based on my power profiles, though it doesn't switch the high-power one on when I e.g. start a game. I s'pose it's a shortcoming of Windows and the drivers.

Anyway, it made me wonder when PC manufacturers will actually start doing the same thing on desktop PCs. It would make sense; even if it adds $15 to the cost up front, it'll pay for itself pretty quickly for most people.

EDIT: As a side note, I've gotten the impression that switching graphics cards on the fly is STILL not possible under Linux. I would think that such an ability would be very high on the to-do list, after all it saves quite a bit of power, but I can't recall having seen anyone even thinking of working on it. Has this changed yet, does anyone know?

Edited 2011-08-03 05:34 UTC

Reply Score: 2

RE: Dual-graphics cards
by Neolander on Wed 3rd Aug 2011 08:54 UTC in reply to "Dual-graphics cards"
Neolander Member since:
2010-03-08

I have switchable graphics on my PC too, and I can attest that support in the Linux world is still in its early stages. Last time I tried, a few months ago, the computer locked up ;)

All I want is something that switches off the NVidia GPU and never turns it on again, since I don't use GPU-intensive software on Linux. I wonder if that's possible already...

Reply Score: 1

RE: Dual-graphics cards
by _txf_ on Wed 3rd Aug 2011 10:10 UTC in reply to "Dual-graphics cards"
_txf_ Member since:
2008-03-17

EDIT: As a side note, I've gotten the impression that switching graphics cards on the fly is STILL not possible under Linux. I would think that such an ability would be very high on the to-do list, after all it saves quite a bit of power, but I can't recall having seen anyone even thinking of working on it. Has this changed yet, does anyone know?


There is, with the ATI proprietary driver (PowerXpress). Unfortunately, you need to unlink/relink the ATI OpenGL library every time you switch.

There is also some open-source support that works with the radeon and intel GPUs (I'm not sure it works with nouveau).

None of them is dynamic: you need to explicitly switch and restart X (sometimes even restart the PC). Alas, the rather crufty graphics stack prevents dynamic switching...

Edited 2011-08-03 10:12 UTC

Reply Score: 2

RE[2]: Dual-graphics cards
by WereCatf on Wed 3rd Aug 2011 10:33 UTC in reply to "RE: Dual-graphics cards"
WereCatf Member since:
2006-02-15

You need to specifically switch, restart X (sometimes even restart the pc)


That's what I thought, and that simply is not an acceptable solution. I hope someone gets around to doing it properly, though.

That also means I won't be using Linux on my laptop.

Reply Score: 2

My results
by spiderman on Wed 3rd Aug 2011 06:30 UTC
spiderman
Member since:
2008-10-23

My PC consumes 20W when used normally.
AMD Sempron 3400+, 1.5GB RAM, 2 hard drives, GeForce 7300 SE.
It runs GNOME with Metacity (no OpenGL effects). The screensaver is a black screen (no OpenGL).
It tops out at 80W when playing 3D games.
The screen consumes 25W: a 19" iiyama.
It beats your MacBook. My guess is that Macs are bad at energy saving because they use hardware-accelerated effects to display the desktop. Is that right?

One thing I have noticed is that power consumption tracks the amount of computing being done. These days the graphics card is more powerful than the processor; it's the unit that consumes the most power, and it dwarfs all the rest. Getting a green hard drive is useless when you waste 80% of your energy in the graphics card. The hard drive's consumption is not significant.

Edited 2011-08-03 06:41 UTC

Reply Score: 4

More tips
by spiderman on Wed 3rd Aug 2011 07:03 UTC
spiderman
Member since:
2008-10-23

My computer consumes 1 to 2W when switched off!
I bought a plug switch and turn it off, so it now consumes 0W when switched off.

Some people think there is a power spike when the computer is turned on. There is none. If it takes 30 seconds to shut down and 2 minutes to boot, and you leave for 5 minutes, you will save energy by switching it off. It's nothing like a diesel engine, and even what people believe about modern diesel engines is false: switching your engine off is also worth it.
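The shutdown-vs-idle claim can be sanity-checked with a quick sketch. The power figures are assumptions (40 W idle, 50 W while shutting down or booting, ~0 W while off); only the 30-second shutdown and 2-minute boot come from the comment:

```python
# Sanity check: does shutting down for a 5-minute absence save energy?
# Power figures below are assumptions, not measurements.

IDLE_W = 40.0     # assumed draw while idling at the desktop
BUSY_W = 50.0     # assumed draw during shutdown/boot (disk + CPU activity)
SHUTDOWN_S = 30   # shutdown time, from the comment
BOOT_S = 120      # boot time, from the comment

def joules_stay_on(absence_s: float) -> float:
    """Energy used if the machine idles through the whole absence."""
    return IDLE_W * absence_s

def joules_shut_down(absence_s: float) -> float:
    """Energy used if the machine shuts down and boots again
    (independent of the absence length once it is off)."""
    return BUSY_W * (SHUTDOWN_S + BOOT_S)

absence = 5 * 60  # leave for 5 minutes
print(f"stay on:   {joules_stay_on(absence):.0f} J")
print(f"shut down: {joules_shut_down(absence):.0f} J")
```

With these assumed figures, shutting down wins even for a 5-minute absence; the break-even point is wherever the fixed shutdown/boot energy equals the idle draw times the absence length.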

If you are using a laptop as a desktop replacement, as mentioned in the article, and it is plugged into the wall, please, pretty please, remove the battery! You aren't using it, yet you are charging it. There is a significant energy loss in storing charge, and the battery will then lose that energy over time as heat while the computer is turned off. Moreover, your battery will live longer if you don't waste its limited cycles.

Reply Score: 2

RE: More tips
by matako on Wed 3rd Aug 2011 07:17 UTC in reply to "More tips"
matako Member since:
2009-02-13

Most desktop PSUs have hard switches at the back. That should take care of the stand-by power consumption.

Reply Score: 1

RE: More tips
by zima on Wed 3rd Aug 2011 08:39 UTC in reply to "More tips"
zima Member since:
2005-07-06

I would hope for average battery-charging software/logic to be better: to not recharge obtrusively when not really needed... (though this might involve a setting or two; still, probably less "taxing" than something as scary as removing the battery, for most users)

Overall, having a built-in UPS is too nice to abandon (and again, in a "desktop replacement" normally working off the mains, the battery doesn't need to be obtrusively topped up at every opportunity)

PS. Funny thing about diesels: they actually scale their fuel usage down exceptionally well when only a fraction of their total power output is required, or when idle. Yeah, people could stop being almost prejudiced about them.

Reply Score: 1

RE[2]: More tips
by spiderman on Wed 3rd Aug 2011 09:13 UTC in reply to "RE: More tips"
spiderman Member since:
2008-10-23

Well, it does not matter how clever the software is. Charging the battery always costs energy. If you use it, when travelling for instance, you need it. But if your laptop is always plugged in, just remove it. The ratio of energy stored to energy consumed is insanely low. And you store it for nothing: it will dissipate as heat, and not so slowly.

Reply Score: 2

RE[3]: More tips
by zima on Wed 3rd Aug 2011 09:33 UTC in reply to "RE[2]: More tips"
zima Member since:
2005-07-06

But that's the thing: in such a scenario, "smart" software can be set to essentially not use the battery.

Yes, when kept in the warmer environment of a laptop it will self-discharge slightly faster, and overall be recharged more, but nothing too dramatic. Especially versus the perks of a built-in UPS (and I write this especially from the point of view of the "average user", who seems to have a hard time protecting against sudden power loss (saving data, for example), after which we are supposed to "get their data back"...)

Reply Score: 1

Saving electricity?
by Fabimaru on Wed 3rd Aug 2011 07:44 UTC
Fabimaru
Member since:
2009-01-29

The problem is that it is a trade-off between saving electricity and having equipment that lasts longer. Laptops tend to have a relatively short lifetime, and when something (screen, keyboard...) stops functioning it is not easy to replace that part, especially after the end of the warranty. So people tend to buy a whole new laptop. Building and disposing of electronic devices has great environmental costs (consider how much pure water it takes to make a processor).
A CRT screen is by far less convenient than a flat panel, but the latter is not nice at all when it comes to dismantling.

Reply Score: 2

RE: Saving electricity?
by zima on Wed 3rd Aug 2011 08:12 UTC in reply to "Saving electricity?"
zima Member since:
2005-07-06

It's not much of a trade-off in practice - at worst, you just use it until it fails. I think most units don't fail during their service life, and people still get new ones anyway (too often because the "old computer started to be broken & too slow", aka just filled with crapware*)

Reply Score: 2

RE[2]: Saving electricity?
by renox on Wed 3rd Aug 2011 08:15 UTC in reply to "RE: Saving electricity?"
renox Member since:
2005-07-06

It's not much of a trade-off in practice


I disagree: my 10-year-old desktop PC is still going quite well (except for a blue screen from time to time; I don't know if it's a HW or SW issue), whereas I know someone whose laptop failed just after the end of the warranty.

Reply Score: 2

RE[3]: Saving electricity?
by zima on Wed 3rd Aug 2011 08:26 UTC in reply to "RE[2]: Saving electricity?"
zima Member since:
2005-07-06

So? "I have" / "I know someone" is not only poor input against overall trends (people don't seem to hold on to working machines often enough); you also essentially reaffirm my own assertion that it's not something you can plan around much (please, please, drop the "they don't build them like they used to" confirmation bias - we just remember the positive specimens better, the few still working doubly so), so at worst, you just use it until it fails.

Edited 2011-08-03 08:27 UTC

Reply Score: 0

RE[4]: Saving electricity?
by spiderman on Wed 3rd Aug 2011 09:03 UTC in reply to "RE[3]: Saving electricity?"
spiderman Member since:
2008-10-23

No, he's right. 1 out of 15 laptops fails in the first year, 1 out of 5 fails before its 2nd birthday, and 1 out of 3 fails within 3 years. And if you have an HP laptop, you have a 1-in-4 chance it will fail within 2 years. It's not a myth: laptops have a very poor life expectancy. I can confirm this from my experience as well.

Reply Score: 2

RE[5]: Saving electricity?
by zima on Wed 3rd Aug 2011 09:47 UTC in reply to "RE[4]: Saving electricity?"
zima Member since:
2005-07-06

Again, it doesn't change how they are too often replaced before hitting their limits (likewise for desktops, of course; never mind that comparative data between the two classes is still missing, or that laptops are also a bit more likely to endure physical hardships; that's not the point)

It's not much of a trade-off when, too often, people aren't very willing to hold on even to working equipment.

Reply Score: 1

RE[6]: Saving electricity?
by spiderman on Wed 3rd Aug 2011 10:36 UTC in reply to "RE[5]: Saving electricity?"
spiderman Member since:
2008-10-23

When a laptop fails, it means it is used. It does not fail in the trashcan or in the recycling factory.

Reply Score: 2

RE[7]: Saving electricity?
by zima on Wed 3rd Aug 2011 11:04 UTC in reply to "RE[6]: Saving electricity?"
zima Member since:
2005-07-06

Which, again, doesn't exclude other (and IMHO way too widespread of course) scenarios, practices.

Reply Score: 1

RE[5]: Saving electricity?
by Bill Shooter of Bul on Wed 3rd Aug 2011 15:08 UTC in reply to "RE[4]: Saving electricity?"
Bill Shooter of Bul Member since:
2006-07-14

No, he's right. 1 out of 15 laptops fails in the first year, 1 out of 5 fails before its 2nd birthday, and 1 out of 3 fails within 3 years. And if you have an HP laptop, you have a 1-in-4 chance it will fail within 2 years. It's not a myth: laptops have a very poor life expectancy. I can confirm this from my experience as well.


[citation needed]

Any one person's experience is not statistically relevant. You cannot confirm anything with your experience, even if you were in an industry where you worked with a large number of laptops.

Your experience can lead to a more scientific inquiry to investigate any claim arising from personal experience.

I realise I'm being a bit pedantic, but too many people, day in and day out, spout similar things about how what they experience, like or prefer is somehow universal, because they are special or something. And yes, I'd count myself among them. I try not to draw too many conclusions from my own experience, but occasionally it's interesting to see some raw unscientific observations as well. Just keep in mind that they're not necessarily representative. Computers, people, things, life, existence, the universe: all complex systems with lots of small interacting parts. It's really easy to screw up our understanding of everything. Brilliant people do it every day.

Reply Score: 2

RE[6]: Saving electricity?
by spiderman on Wed 3rd Aug 2011 15:22 UTC in reply to "RE[5]: Saving electricity?"
spiderman Member since:
2008-10-23

Here is one citation: www.squaretrade.com/htm/pdf/SquareTrade_laptop_reliability_1109.pdf
There have been several studies of laptop failure rates. My experience matches, but I based the numbers I cited on real studies covering hundreds of thousands of laptops. Google "laptop failure rate" for other citations if you need them. They all end up with pretty similar numbers.

Reply Score: 2

RE[7]: Saving electricity?
by Bill Shooter of Bul on Wed 3rd Aug 2011 16:01 UTC in reply to "RE[6]: Saving electricity?"
Bill Shooter of Bul Member since:
2006-07-14

Thanks for the citation. It makes everything easier to examine.

For instance: the three-year failure rate is not 1/3, it's 1/5, plus 1/10 of owners accidentally break their laptop.

Also, there is some potential selection bias in the study. It looks at laptops covered by a third-party warranty, so more accurately it shows that 1/5 of laptops owned by people who think their laptop may fail, fail within three years. Those who don't pay for the warranty may take better care of their laptops (i.e. not storing them in a hot car, not leaving them on 24/7, not using them to level out the kitchen table, not running the battery to death for the fun of it, using an additional cooling pad, etc.).
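The selection-bias point can be illustrated with a toy simulation. Every number below is invented purely for illustration: the point is only that sampling warranty holders overstates the population failure rate if warranty buyers are rougher on their machines.

```python
# Toy simulation of selection bias in a warranty-holder sample.
# All probabilities are made-up illustration values.
import random

random.seed(42)

POP = 100_000
P_BUYS_WARRANTY = 0.3
P_FAIL_CAREFUL = 0.10   # assumed 3-year failure rate, careful owners
P_FAIL_ROUGH = 0.20     # assumed rate for owners who buy the warranty

population = []
for _ in range(POP):
    warranty = random.random() < P_BUYS_WARRANTY
    p_fail = P_FAIL_ROUGH if warranty else P_FAIL_CAREFUL
    failed = random.random() < p_fail
    population.append((warranty, failed))

# True population rate vs. the rate observed in the warranty sample.
true_rate = sum(f for _, f in population) / POP
sampled = [f for w, f in population if w]
observed_rate = sum(sampled) / len(sampled)
print(f"whole population: {true_rate:.1%}, warranty holders only: {observed_rate:.1%}")
```

Under these assumptions the warranty-only sample reads roughly 20% while the true population rate is about 13%, which is exactly the kind of gap the comment is pointing at.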

Again, I really don't have a dog in this fight - were we saying that laptops break more often, or something? I'm just complaining about people throwing around anecdotes without understanding statistics. It's of minor importance in this case, but we're just as bad when it is important... like economics, politics, and health issues.

Reply Score: 2

White instead of black
by Dr-ROX on Wed 3rd Aug 2011 08:00 UTC
Dr-ROX
Member since:
2006-01-03

For LCD screens you can make a screensaver that shows white instead of black. LCDs work in such a way that power is needed to make black; white is the natural state of the screen, so it only uses power for the backlight. And if you turn off the screen's backlight, then it saves power completely. Try white with a wattmeter.
With plasma and CRT screens black is their natural state, so showing black should save power. But those screens eat more of it naturally.

Reply Score: 1

RE: White instead of black
by Neolander on Wed 3rd Aug 2011 08:58 UTC in reply to "White instead of black "
Neolander Member since:
2010-03-08

What about LED-backlit LCDs? Aren't they smart enough to selectively turn off some of the white LEDs when displaying black?

Ah, I can't wait until we have OLED and transflective screens everywhere ;)

Reply Score: 1

RE[2]: White instead of black
by David on Wed 3rd Aug 2011 17:02 UTC in reply to "RE: White instead of black "
David Member since:
1997-10-01

I believe that most LED backlights don't actually have a lot of LEDs behind the screen (known as "full array") that can be turned on and off, but rather have LED emitters at the edges of the screen that uniformly illuminate the area behind the LCD panel.

There is a technology known as "local dimming" that works like you describe, but it's only in high end TVs at this point. I don't know if anyone is making computer monitors with local dimming.

Reply Score: 1

RE: White instead of black
by transputer_guy on Wed 3rd Aug 2011 18:39 UTC in reply to "White instead of black "
transputer_guy Member since:
2005-07-08

I think you are quite wrong there.

It is a fact that the LCD pixel structure is only about 5% efficient at letting light through when fully on; that means 95% of whatever the light source is, whether CCFL tube or LED, is blocked and therefore turned into heat. If the screen is black, that figure is just 99.9% instead. To save power you have to control the light source's power.

There was/is a company called Unipixel that had an LCD alternative claiming a 60%-efficient light cell based on a neat MEMS opto-capacitive structure. They even licensed it to Samsung in 2009; not a squeak since, though. When driven by edge-lit LEDs, its power use would be far below today's LED LCD panels, which are mostly already good enough.

Reply Score: 2

Nitpick
by Neolander on Wed 3rd Aug 2011 08:31 UTC
Neolander
Member since:
2010-03-08

A voltmeter would only tell you that your computer is fed with 230V AC all the time, unless you have a really bad power grid. I suggest using a wattmeter instead, although it's open to discussion ;)

EDIT: Oops, someone said it before me.

EDIT 2: Excellent research, by the way. You've found quite a lot of references to prove your point!

Edited 2011-08-03 08:38 UTC

Reply Score: 2

RE: Nitpick
by zima on Wed 3rd Aug 2011 09:35 UTC in reply to "Nitpick"
zima Member since:
2005-07-06

By the looks of the power socket, I guess that would be 110V AC ...and I didn't even need a voltmeter to assess it, remotely! ;)

Too bad, it makes the article a bit of a mixed bag. On one hand, as you say, nicely sourced. But the watts/volts mix-up (especially since the latter is essentially constant in such scenarios) casts a shadow... ;) No, really, with such a basic mistake repeated throughout, it unnecessarily makes the whole article suspect from the start: the writing process, the author behind it, how much of it can be depended upon, etc.

(Don't get me wrong, obviously quite a lot of it can; it has useful practical guidelines. But BTW, I can't help noticing how immense a portion of it is what I would hope to be common-sense knowledge :/ - even bordering on "perpetuum mobile doesn't exist" - which, sadly, probably isn't common knowledge, hence the usefulness... though I'm not sure OSNews is the best channel.)

PS. And getting another wattmeter is not strictly required, since virtually all dwellings already have a very precise central one. Considering how rarely we would fiddle with this, disconnecting every other electrical load for the duration of the PC experiments is fairly trivial; you can temporarily move the PC closer to the meter, too. Absolutely basic arithmetic will take us most of the way to a per-socket reading, especially since the calculations involve averaging and "assuming" anyway. (Of course, that assumes somebody can notice and connect the dots between the energy usage of each device and the overall bill; but that's also required with a portable wattmeter.)

Heck, promoting such portable wattmeters when there's already a very good one (and not really a very inconvenient one) in every house is itself a bit of a waste of resources and energy ;)

Reply Score: 1

RE[2]: Nitpick
by Neolander on Wed 3rd Aug 2011 09:40 UTC in reply to "RE: Nitpick"
Neolander Member since:
2010-03-08

Ah, you're right, I totally forgot that not every part of the world uses the same voltage standards as mine ;)

Though about wattmeters, it's my turn to accuse you of being region-specific ^^ In France at least (I should check in Sweden while I'm here), old-fashioned mechanical energy meters with a spinning disc (1 turn = X kWh) are still common, and using them for wattage calculations is quite cumbersome and imprecise.

Reply Score: 3

RE[3]: Nitpick
by zima on Wed 3rd Aug 2011 10:06 UTC in reply to "RE[2]: Nitpick"
zima Member since:
2005-07-06

Such meters are also still standard at my place; I don't really see them as "quite cumbersome and imprecise" (vs. a cheap portable meter), especially since the calculations involve averaging and "assuming" anyway. (Overall, total usage over time is what we need to keep in mind if energy conservation is to become routine.)

And might I add that, with your unsound accusations, you've actually given me an opportunity to berate you all the more strongly! ;)
I've never even been anywhere near a 110V place; I just remember they exist.

Reply Score: 1

RE[4]: Nitpick
by righard on Wed 3rd Aug 2011 11:34 UTC in reply to "RE[3]: Nitpick"
righard Member since:
2007-12-26

Are there any 110-volt zealots or 230-volt fanboys? I want to see their flamewars.

Anyway, anybody dumb enough to live in a 110-volt area deserves to have their equipment break earlier and their computers run slower.
230 volts is the true path to enlightenment.


On topic: I always thought it stupid that I had to cool down my PC with a cooler while at the same time heating my room. Maybe there is a way to connect a CPU to a radiator.

Reply Score: 2

RE[5]: Nitpick
by Neolander on Wed 3rd Aug 2011 11:56 UTC in reply to "RE[4]: Nitpick"
Neolander Member since:
2010-03-08

I dream of a building where there is a huge central heat pump. It cools down all fridges, computers, and freezers and on the other end distributes heat to whatever needs it. In every room, you find faucets for hot and cold fluid that you can use to heat and cool down things as needed.

(Flamewar contribution: 110V AC and 230V AC are both lame; DC makes the chances of surviving an electric shock much higher* and is the One True Way ;)

* An irrelevant side effect of DC being slightly reduced power grid, generator, and transformer efficiency.

Edited 2011-08-03 12:11 UTC

Reply Score: 1

RE[5]: Nitpick
by David on Wed 3rd Aug 2011 17:04 UTC in reply to "RE[4]: Nitpick"
David Member since:
1997-10-01

Wow. OSNews readers can come up with a flamewar about anything! :-)

Reply Score: 1

RE[5]: Nitpick
by transputer_guy on Wed 3rd Aug 2011 18:26 UTC in reply to "RE[4]: Nitpick"
transputer_guy Member since:
2005-07-08

Back in the very earliest days of electricity production, I'm sure there must have been heated debates over AC vs DC and 110-220-240 Volts and so on. The players would have included Edison, Tesla and other notable figures.

"Ontopic: I always thought it stupid that I had to cool down my pc with a cooler while at the same time heating my room. Maybe there is a way to connect a cpu with a radiator."

I had been thinking the same thing, at least when I was using a D805 + 2 large CRTs that typically drew 400W continuous: how to get the heat away from me and towards a more useful purpose. Anyway, it's all moot now.

Reply Score: 2

Cold summer
by RshPL on Wed 3rd Aug 2011 08:48 UTC
RshPL
Member since:
2009-03-13

Power saving is all right, but I am really not enjoying this cold summer... hence I leave my computer on to contribute to global warming. It's a small contribution, but it's all I can afford.

Reply Score: 2

RE: Cold summer
by transputer_guy on Wed 3rd Aug 2011 18:54 UTC in reply to "Cold summer"
transputer_guy Member since:
2005-07-08

So where in the world is summer too cold?

In the US we have probably had enough of the heat dome effect, although in Mass we are not so affected.

You want to start a flame war on global warming? Seriously!

Reply Score: 2

RE[2]: Cold summer
by RshPL on Wed 3rd Aug 2011 20:42 UTC in reply to "RE: Cold summer"
RshPL Member since:
2009-03-13

I often joke about it... getting warmer does not motivate me to save power, quite the opposite. ;) I do not believe global warming is mainly caused by man; whatever it is mainly caused by (the sun, for example), I would seriously enjoy more heat from it - writing from Central Europe. ;)

It also depends where the power comes from... cutting other harmful gases is OK, but CO2 is not the one we should avoid!

Reply Score: 1

RE[3]: Cold summer
by smashIt on Wed 3rd Aug 2011 21:05 UTC in reply to "RE[2]: Cold summer"
smashIt Member since:
2005-07-06

cutting on other harmful gases is OK, but CO2 is not the one we should avoid!


CO2 is one to be avoided, but not the only one.

But as with most big tasks:
you have to take the first step,
and every little bit counts.

So don't be lazy and turn off all the devices you don't need,
instead of a bigger air conditioner beef up the insulation of your house,
put some solar collectors on your roof, because the sun shines for free.

There is so much you can do today...

Reply Score: 2

RE[4]: Cold summer
by RshPL on Wed 3rd Aug 2011 21:34 UTC in reply to "RE[3]: Cold summer"
RshPL Member since:
2009-03-13

CO2 is healthy for vegetation; all plants rely on it, so what the hell? I am all for saving the planet, not polluting the environment, not using rivers as waste dumps, saving endangered species, etc... but CO2 is just silly. Even if global warming were harmful (I don't know), CO2 is a very minor greenhouse gas, and human-produced CO2 is even less significant.

Reply Score: 1

RE[5]: Cold summer
by smashIt on Wed 3rd Aug 2011 22:05 UTC in reply to "RE[4]: Cold summer"
smashIt Member since:
2005-07-06

and human produced CO2 is even less significant.


Well, thanks to humans the amount of CO2 in the air has increased by 20% over the last 50 years,
and that's a hell of a lot of CO2.

And to quote Wikipedia:
NOAA states in their May 2008 "State of the science fact sheet for ocean acidification" that:
"The oceans have absorbed about 50% of the carbon dioxide (CO2) released from the burning of fossil fuels, resulting in chemical reactions that lower ocean pH. This has caused an increase in hydrogen ion (acidity) of about 30% since the start of the industrial age through a process known as “ocean acidification.” A growing number of studies have demonstrated adverse impacts on marine organisms, including:

The rate at which reef-building corals produce their skeletons decreases, while production of numerous varieties of jellyfish increases.
The ability of marine algae and free-swimming zooplankton to maintain protective shells is reduced.
The survival of larval marine species, including commercial fish and shellfish, is reduced."
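Since the quote measures the change in acidity as a percentage while most people know the pH scale, a quick back-of-the-envelope Python sketch shows how the two relate (the 8.2 starting pH is an assumed round number for surface seawater; the 30% figure is the one quoted above):

```python
import math

# pH is the negative base-10 logarithm of the hydrogen-ion concentration,
# so a 30% rise in hydrogen ions shows up as a small-looking pH shift.
baseline_ph = 8.2                 # assumed illustrative seawater pH
h_baseline = 10 ** -baseline_ph   # hydrogen-ion concentration at that pH
h_now = h_baseline * 1.30         # 30% more hydrogen ions

ph_now = -math.log10(h_now)
drop = baseline_ph - ph_now       # log10(1.3), about 0.11 pH units
print(round(ph_now, 2), round(drop, 2))
```

Because the scale is logarithmic, the drop is the same ~0.11 units whatever baseline pH you assume.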

Reply Score: 2

RE[6]: Cold summer
by RshPL on Wed 3rd Aug 2011 22:19 UTC in reply to "RE[5]: Cold summer"
RshPL Member since:
2009-03-13

Thanks for bringing up other effects of the CO2 increase. I do not believe it contributes to global temperature in a serious way (quite the opposite: a temperature increase raises natural CO2 production), but it might affect life in other ways, possibly harmful ones, as you have pointed out.

http://dangerousintersection.org/wp-content/uploads/2006/09/CO2-Tem... This is often pointed out as proof of the CO2/temperature relationship. Careful examination shows that it's the CO2 that lags behind temperature.

Edited 2011-08-03 22:21 UTC

Reply Score: 1

RE[7]: Cold summer
by transputer_guy on Thu 4th Aug 2011 13:25 UTC in reply to "RE[6]: Cold summer"
transputer_guy Member since:
2005-07-08

You are spewing out the usual climate-denial junk we expect to see from the wattsupwiththat crowd, as well as from the various conservative think tanks funded by ExxonMobil, the Koch brothers, or the tobacco industry at the Heartland Institute. Are you a stooge for those interests, or can you study and think for yourself?

If what you say is true, write a paper on all your beliefs and get it published in a peer reviewed journal. You will find your points have all been debunked.

It is one thing to be a conservative/Republican; it is quite another to buy into the idea that conservatives automatically have to work for free for Koch Industries to promote coal and fossil-fuel burning.

It is a simple fact that humans have increased CO2 levels from around 250 to 460 ppm since the industrial revolution started; nature cannot do so that fast. Since China and India have joined the energy party, we are likely headed to 700 ppm in the next century with no stopping. The last time CO2 levels were this high due to nature was eons ago, when life and flora were very different.

And yes, CO2 is good for vegetation up to a point. It is also a trace gas, and physics says CO2 is a warming gas, although water vapor and methane are much worse. The oxygen, nitrogen, and argon that make up the bulk of the atmosphere are not warming gases, so indeed the remaining gases can and do make a huge difference.

The CO2 normally cycles through the system over very long periods of time; about the same amount goes into the atmosphere as is taken out by natural processes. The human load is just a small push of a few percent, but it is always adding, so the CO2 level drifts upwards. That is simple integration math.
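The "small push, always adding" idea can be sketched as a toy accumulation loop (all numbers here are illustrative placeholders, not measurements):

```python
# Toy model of "simple integration math": huge natural fluxes that cancel,
# plus a small constant human addition that accumulates year after year.
level_ppm = 280.0               # assumed illustrative starting concentration
human_net_ppm_per_year = 2.0    # assumed small net human push

for year in range(100):
    natural_in = 100.0          # large natural source...
    natural_out = 100.0         # ...balanced by an equally large sink
    level_ppm += natural_in - natural_out + human_net_ppm_per_year

print(level_ppm)                # 280 + 100 * 2 = 480: a steady upward drift
```

The big natural terms cancel every year; only the small human term integrates.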

I could go on but you could learn more from your own research.

Personally, I think the only way to a future of guilt-free, plentiful energy for all nations is nuclear power from thorium LFTRs, despite all the green loons' anger at nuclear. Nuclear energy is millions of times more energy-dense than solar in any form, and there is enough Th for the entire planet to live well for thousands of years until fusion works. It can even help rid the world of the nuclear waste from weapons and regular nuclear plants.

google kirk sorenson thorium energy

Reply Score: 2

RE[6]: Cold summer
by unclefester on Thu 4th Aug 2011 11:25 UTC in reply to "RE[5]: Cold summer"
unclefester Member since:
2007-01-13

You obviously don't know any chemistry, because the threat of acidification is exactly zero. It is literally impossible:

- pH is logarithmic. In other words, a pH of 6 is 10x as acidic as a pH of 7.

- Carbonic acid (dissolved CO2) is a weak acid. A 30% increase in dissolved CO2 has negligible impact on the pH of seawater.

- pH is highly temperature-sensitive. The pH of seawater changes far more due to temperature changes than due to atmospheric CO2 levels.

- Seawater cannot become acidified (or significantly less basic) by CO2, because it is heavily buffered by dissolved salts.

- Ocean acidification has never occurred, even with CO2 concentrations 20x as high as at present.

Reply Score: 2

OS propaganda
by spiderman on Wed 3rd Aug 2011 10:42 UTC
spiderman
Member since:
2008-10-23

On the desktop, linux consumes 100 times less energy than windows and 5 times less than MacOS.
Gentoo consumes 1000 times less energy than Ubuntu.

Reply Score: 2

RE: OS propaganda
by WereCatf on Wed 3rd Aug 2011 14:15 UTC in reply to "OS propaganda"
WereCatf Member since:
2006-02-15

On the desktop, linux consumes 100 times less energy than windows and 5 times less than MacOS.
Gentoo consumes 1000 times less energy than Ubuntu.


I'd just LOVE to hear the maths behind that claim. I mean, if a Windows PC consumes e.g. 300W while powered on, it would only consume 3W when running generic Linux, and 0.003W when running Gentoo...

Or in other words: you have no idea what you're saying.

Reply Score: 1

RE[2]: OS propaganda
by Neolander on Wed 3rd Aug 2011 14:43 UTC in reply to "RE: OS propaganda"
Neolander Member since:
2010-03-08

Sarcasm failure detected.

Reply Score: 1

RE[2]: OS propaganda
by spiderman on Wed 3rd Aug 2011 15:11 UTC in reply to "RE: OS propaganda"
spiderman Member since:
2008-10-23

Well, the math is really simple.
A linux desktop consumes roughly the same amount of energy as a windows desktop. There are 100 times more windows desktops than linux desktops. Therefore windows consumes 100 times more power than linux on the desktop.
The same math applies when comparing gentoo to ubuntu.
I thought you would get it.

Reply Score: 2

1,778 kWh for 2 months?
by spinnekopje on Wed 3rd Aug 2011 13:25 UTC
spinnekopje
Member since:
2008-11-29

I guess 650 kWh for 2 months would be more accurate for me.. and that includes heating and cooking.

If anyone has to get new electrical wiring in the house, I can give you one piece of advice: make sure you can switch off most power groups/outlets from the main board. In my case it will take less than 10 years to recoup the initial cost, even counting only the kitchen.

I do plan to build a new desktop, but that one has to be power-efficient and should last at least 6-7 years. Its main purpose will be more demanding programs like editing photos, running virtual machines, etc. When I want to surf, chat, or mail, I will still use a cheap laptop.

Reply Score: 1

Confusion about electrical quantities.
by r00kie on Wed 3rd Aug 2011 13:36 UTC
r00kie
Member since:
2009-12-10

Like others have said, what you want is a wattmeter, not a voltmeter; no question about that.

The device you bought, judging by the name, is a wattmeter so you got the device right ;) .

Another thing is that any device will use power when turned off _unless_ there is a mechanical switch to completely cut it off; this also applies to hibernation mode.

When it comes to real devices, the DOE definition doesn't mean much. How much each device uses when (soft) turned off cannot be predicted; it can be less than 1 watt or reach several watts. You need to measure it to find out how much each device uses.

Reply Score: 1

Abuse of units
by thejpster on Wed 3rd Aug 2011 15:49 UTC
thejpster
Member since:
2010-10-01

Wow. I nearly cried. It's difficult enough to explain the concept of energy and power to people without confusing them with unit abuse.

Energy is measured in joules. The rate at which you use energy is power, measured in joules per second, or watts. People don't like multiplying the wattage of their equipment by the number of seconds it's on (to give joules), as the numbers get big, so they multiply the wattage by the number of hours instead, to give kilowatt-hours. That's not kilowatts per hour; that's kilowatt-hours (like pound-feet, or newton-metres). One kilowatt-hour is 3.6 megajoules.
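The unit arithmetic above works out like this in a quick Python sketch (the 100 W and 10 h figures are made-up examples):

```python
# Units sanity check: watts are joules per second, and a kilowatt-hour is
# just watts * hours / 1000, which equals 3.6 million joules.
JOULES_PER_KWH = 3.6e6

def energy_kwh(watts, hours):
    """Energy in kilowatt-hours for a device drawing `watts` for `hours`."""
    return watts * hours / 1000.0

def kwh_to_joules(kwh):
    return kwh * JOULES_PER_KWH

used = energy_kwh(100, 10)        # a 100 W device left on for 10 hours
print(used, kwh_to_joules(used))  # 1.0 kWh, i.e. 3.6 million joules
```

The same 3.6 MJ falls out of multiplying 100 W by 10 hours' worth of seconds, which is why the two unit systems agree.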

Your energy company only cares how much energy you have used (in joules, or kilowatt-hours), not the rate at which you use it (in watts).

Don't feel too bad. Even E.ON (the large UK utility company) makes mistakes. See the Scroby Sands Wind Farm display in Great Yarmouth, England, talking about the number of homes per year that 60 megawatts can supply.

Reply Score: 1

Errors
by jefro on Wed 3rd Aug 2011 16:15 UTC
jefro
Member since:
2007-04-13

That meter is a multimeter. It shows more than volts.

LCDs can use as much as a CRT!!! Some LCDs use almost 200 watts. Look for LED LCDs that use no more than 30 watts active.


I monitor my energy use. Most of the older wall warts use 3 watts all the time. If you have ten in your home, that is 30 watts all the time - like putting a 700-watt bulb on for an hour each day. Who would do that?
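That arithmetic, spelled out in a throwaway sketch (the 10 warts at 3 W each are the figures from the comment; the bulb comparison is approximate):

```python
# Standby drain from wall warts, per day.
warts = 10
standby_watts_each = 3

total_watts = warts * standby_watts_each   # 30 W, around the clock
wh_per_day = total_watts * 24              # 720 Wh wasted every day
bulb_hours = wh_per_day / 700              # compare to a 700 W bulb
print(total_watts, wh_per_day, round(bulb_hours, 2))
```

720 Wh per day is indeed about one hour of a 700 W bulb, every day, for devices doing nothing.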

Put every item in your system on a power strip. When you are not at the computer, turn it all off with the power strip switch. Power strips also help protect against voltage spikes and, in some cases, lightning.

Edited 2011-08-03 16:17 UTC

Reply Score: 1

RE: Errors
by transputer_guy on Wed 3rd Aug 2011 19:19 UTC in reply to "Errors"
transputer_guy Member since:
2005-07-08

"LCDs can use as much as a CRT!!! Some LCDs use almost 200 watts"

While shopping for a new large panel, I couldn't help noticing that too. The Hanspree 28" panel that gives you 16:10 1920x1200 for $250 was pretty appealing, except that it sucks down 110W or so. Two of those would kill desk space as well. The Apple 30" IPS panel also uses 180W IIRC, but the price is beyond my budget and the resolution is too high for older eyes. I think IPS panels also need a stronger light source, since the IPS switch is less efficient. LG makes some 22"-23" models with the option of IPS vs TN for a very small difference in dollars and watts.

Having said that, you have to use a wattmeter to see if the specs are true or just overstated. When I bought my Panasonic plasma TV, the specs suggested upwards of 300W in use. I checked it in the store with a wattmeter (the first time anyone had ever done that) and it was half that; at home it was around 100W.

Also, something most people don't consider is the VA rating rather than the watt level. Most appliances draw more VA than watts; we pay for watts in kWh charges, but the utility must produce the VA, about 20% more. That means they have to balance the phase by over-producing power.

LED night lights and a lot of DC-powered devices use really crappy AC-DC circuitry that draws far more power than the DC rating suggests. Set your meter to VA to see the difference. The industry really needs to push harder for a power factor of 1, so that VA equals watts. That requires better-quality switchers; some PC PSUs do have power-factor correction in them, but most don't.
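For what it's worth, the VA-versus-watts relationship is just a ratio, called the power factor; a minimal sketch with made-up numbers (the 20% figure from the comment corresponds to a power factor of about 0.83):

```python
# Power factor = real power (watts) / apparent power (VA).
# The utility must supply the VA; you are only billed for the watts.
def power_factor(real_watts, apparent_va):
    return real_watts / apparent_va

pf = power_factor(100, 120)   # billed for 100 W, but drawing 120 VA
print(round(pf, 2))           # 0.83: the grid supplies ~20% more VA
```

A power factor of 1.0 is the ideal case where VA and watts coincide, which is what power-factor-corrected PSUs aim for.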

Reply Score: 2

Multimeter!
by kateline on Wed 3rd Aug 2011 17:56 UTC
kateline
Member since:
2011-05-19

When I search for "voltmeter" and "wattmeter" on Amazon I get about the same list of devices in the results. Most of them are multimeters with the ability to provide several different kinds of output. You saved yourself from confusing everyone by including the picture. BTW, the Kill-A-Watt you show provides several kinds of output (as is typical) and will calculate the kilowatt-hours for you without the need to manually crank through your formula.

Reply Score: 1

Other Ways
by fretinator on Wed 3rd Aug 2011 19:25 UTC
fretinator
Member since:
2005-07-06

How to Save Energy When Using Your Computer
-------------------------------------------------------------

1. Lay down and type with one hand.

2. Never, ever read the linked articles.

3. Use a Tandy Model 100 laptop, dialed in with a 300 baud modem to your ISP, surfing the web with Lynx.

4. Do not turn your computer on, just stare at the pretty reflection in the monitor.

5. Use binoculars to watch your neighbor surf the web (or so you say).

6. Go wireless - just imagine what the web pages would look like in your mind.

7. Pick up a land-line phone and imitate a modem. If you're good, you can connect.

8. e = mc^2 - not sure how this helps; thought I would throw it out there.

9. Stop frowning when Thom posts another Software Patent article. EVERYONE knows you use less muscles when you smile :}

10. Don't think too hard about what you post!

Reply Score: 2

DIY cases
by transputer_guy on Wed 3rd Aug 2011 20:19 UTC
transputer_guy
Member since:
2005-07-08

Somewhat off topic: I have recently gotten really sick of seeing the usual black PC cases all around my house, when in reality they were all mostly empty. They were all made from cheap upgrade parts, though, so the cases were just getting recycled as the insides got ever sparser. Mostly Sempron systems needing 60W or so. They all had noisy fixed-speed PSU fans and stock CPU coolers, and were dust-bunny collectors.

At about 16" x 18" x 8", the volume is unsightly in regular rooms, so I built my own wooden cases using spare floor laminate and some skilled use of the table saw. These are about 10" x 9.5" x 6", exactly 1/4 of the volume of the metal beasts. They are just big enough to throw in most cheap mobos with a stripped-down PSU and a 2.5" HD. The stock coolers got replaced by $10 heat-pipe tall-stack coolers, which allows for very slow, quiet fans. A second fan cools the PSU; both fans are on a regulator set near minimum. They still need more detail work, and I doubt the FCC would be pleased.

This still isn't satisfactory, because even these boxes are mostly empty, although stuffed with excess PSU wiring. What I have in mind for the next phase is to mount the mobo directly on the back of the LCD panel with VESA-mount bolts and use a miniBox-type PSU switcher set into the 24-pin connector. The tall heat-pipe coolers that sit on the CPU would stick out. It would make the LCD/PC look more like an old TV with the backwards pyramid.

What I wish for now is a way to retain the tall heat-pipe stack technology but flatten it over the mobo surface, so the whole thing can be packaged in a slim, book-like package about 10" square and maybe 2" thick. I would like to make or buy a heat-pipe cooler integrated with a flat-plate ribbed heat sink. There are still the issues of video cables; can you even get a VGA/DVI cable of 1 foot? All of this is an effort to make the PCs disappear behind the display and also get rid of the wiring tangle.

Of course I could just buy all-in-one PCs or laptops or iPads, but that wouldn't be any fun, and those have other serious issues. All of them have displays that are way too small and use laptop technology.

It's just a hobby though, looking for more in less.

Reply Score: 2

Link to this article!
by benali72 on Thu 4th Aug 2011 09:06 UTC
benali72
Member since:
2008-05-03

If you have a website, link to this article.

Every computer user should read it.

If everyone followed its recommendations we would save a lot of energy (at little cost).

Reply Score: 1

Co2
by jefro on Thu 4th Aug 2011 20:56 UTC
jefro
Member since:
2007-04-13

I watched a show about CO2.

They said the average over a few hundred thousand years was a number like 225, and the highest recorded (if you believe their assumptions) was like 380.

Today's readings are like 590 and climbing. I can't imagine how that can be good.

Reply Score: 1

kovacm
Member since:
2010-12-16

buy a Mac

Reply Score: 1

smashIt Member since:
2005-07-06

buy a Mac


Jobs is the king of planned obsolescence.
Buy a rotten apple if you enjoy growing piles of rubbish in Africa.

Edited 2011-08-04 23:11 UTC

Reply Score: 2

Inkjet? Are you nuts?
by UltraZelda64 on Fri 5th Aug 2011 08:20 UTC
UltraZelda64
Member since:
2006-12-05

Your articles--while based on interesting concepts (reusing old machines, extracting every ounce of usefulness out of computer hardware, electronics disposal, energy saving, etc.)--always seem to have "WTF?!" moments that make me question what the hell you're talking about.

I could probably question other things in this article (I stopped reading them in their entirety a while ago), but I'll just say this: I would use a pen and notebook paper before I'd ever use, or recommend anyone use, an inkjet printer again. They are garbage; laser is the way to go.

Toner doesn't need to be "cleaned" after a week of not printing, requiring a new set of cartridges every month because almost all of the ink is lost during cleaning. Laser just works, with minimal problems, and if you rarely print, it's not like you'll be using loads of energy. And if you're afraid of using too much energy, there's always the on/off switch.

Simply put, inkjet is a horrible recommendation, even for people who for whatever reason are paranoid about their energy use; what they "save" in electricity will be eclipsed several times over by the cost of regularly buying replacement cartridges. It's 2011 and laser printer prices have come way down - how could anyone even consider recommending inkjet?

Reply Score: 2