ATI Technologies said Friday that it had applied technologies originally designed to reduce the energy consumption of notebook graphics processors to cut the power draw of high-end desktop graphics cards, such as the Radeon X1800 and more advanced models.
When X-bit labs first measured the power consumption of the high-end Radeon X1800 XT graphics card back in September 2005, it drew about 112W under maximum load, the absolute maximum for that time. However, when the measurements were repeated later, power consumption dropped to slightly below 103W on the same graphics card with the same BIOS version, but with a newer driver.
Maybe they should also start thinking about power consumption the way Intel and AMD have with their processors; it's not just about raw power anymore but also efficiency. SLI two of Nvidia's latest video cards and watch your electricity bill soar, not to mention having a constant room heater. It's crazy. This is a good step. Video cards are one of the components that eat up most of a PSU's available power.
Now if Asus will only follow suit by eliminating the external power brick on those dual GeForce cards, I'll be able to finish my quad-GPU SLI gamebox. It's just so hard to find three available outlets at a LAN party.
112W, just for a video card? That is ridiculous (as in: crazy, over the top).
All video cards I've ever owned have had passive cooling, which puts their power budget on the order of 10W, maybe even less (much more, and passive cooling would kill the GPU). No plans to change that policy for me. I've never had many problems playing modern games; if there was a bottleneck, it was elsewhere (CPU speed, memory).
But hey, maybe I didn't try enough of them. And for real 'enthusiasts', it's never enough, is it?