Linked by Thom Holwerda on Wed 29th Sep 2010 22:14 UTC, submitted by Amix
MorphOS: Bright days ahead for the Amiga world. AROS is doing well, AmigaOS4 is getting one heck of a machine in the AmigaOne X1000, and MorphOS continues its development at a brisk pace. Version 2.6 of MorphOS, currently in development, will add support for (G4, I'm assuming) PowerMacs, which, alongside support for the Mac Mini and eMac, gives MorphOS a solid base of used hardware to run on.
Thread beginning with comment 443374

Sorry to tell you that, but your physics is not quite right.

Wattage (in W) measures power: how much energy is consumed per second. It's a characteristic of your computer.
The total amount of energy consumed over a period (in J or Wh), on the other hand, is not. It depends on how long the time interval considered is. Running the computer for a long time will make it consume more energy no matter how power-efficient it is, since it draws energy continuously.

So in order to get scientifically meaningful results, we have to know the amount of time involved in each measurement.
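The power/energy distinction can be sketched in a few lines of Python (a hypothetical helper for illustration, not anything from the thread):

```python
def energy_wh(power_w, hours):
    """Energy in watt-hours: power (a rate, in W) multiplied by duration (in h)."""
    return power_w * hours

# The same machine, drawing a constant 24 W, consumes very different
# total amounts of energy depending on how long it runs:
print(energy_wh(24, 1))      # one hour -> 24 Wh
print(energy_wh(24, 24))     # one day  -> 576 Wh
```

The power figure is fixed; the energy bill is what grows with time.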


spiderman

Sorry, I confused kW with W. My computer actually consumes 24 W, not 24 kW.
When I say 24 Wh, I mean that if the computer ran for one hour at that wattage, it would consume 24 Wh of energy. So it actually draws 24 W.
The amount of time does not change the power figure. I've made several measurements and the result is repeatable: my computer draws 24 W when not doing heavy graphics work. It is the same in standby mode and while booting. Leaving it in standby for one hour is 24 Wh; booting for 3 minutes is 24 W × (3/60) h = 1.2 Wh.
So leaving the computer in standby mode for one hour costs 24 Wh and booting the computer costs 1.2 Wh. If I have compiz running, the draw is between 75 W and 95 W.
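These figures are easy to sanity-check (a short sketch using the 24 W measurement quoted above):

```python
# Energy (Wh) = power (W) x time (h); times here are given in minutes.
POWER_W = 24                      # measured draw in standby / light work

standby_wh = POWER_W * 60 / 60    # one hour (60 minutes) of standby
boot_wh = POWER_W * 3 / 60        # a three-minute boot

print(standby_wh)                 # 24.0
print(boot_wh)                    # 1.2
```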

Anyway, whatever the exact maths, the point is this: the computer does not consume significantly less in standby mode than when doing light work, it does not consume significantly more when booting, and it consumes more than three times as much when doing 3D work. So yes, switching off the computer when not using it is by far more efficient, and using the nv driver is more efficient than the nvidia one (but then having a powerful graphics card becomes useless).

So my advice is:
* Don't buy a powerful graphics card if you are not going to use it.
* If you have a powerful graphics card for a purpose (games), don't use it for the desktop.
* Switch off the computer when not using it.
* Even better, buy a switch for the power plug. The computer consumes energy even when switched off (mine draws about 5 W)!
* Don't use a CRT screen, and if you do anyway, use a dark wallpaper. That can halve the energy consumption of a CRT screen.
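The switched power plug matters more than it looks: a rough sketch of a year's phantom draw, using the 5 W "switched off" figure quoted above (the hours-per-year value assumes a non-leap year):

```python
PHANTOM_W = 5                  # draw while "off" but still plugged in
HOURS_PER_YEAR = 24 * 365      # non-leap year

phantom_kwh = PHANTOM_W * HOURS_PER_YEAR / 1000
print(phantom_kwh)             # 43.8 kWh wasted per year
```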

Edited 2010-10-01 07:55 UTC


Neolander

Many thanks for pleasing the physicist ;)

Indeed, those results are interesting. Aside from proving that standby is a waste of power, they show why hybrid graphics, Pixel Qi screens, and non-accelerated desktops are interesting subjects worth working on.

Edited 2010-10-01 08:41 UTC
