Linked by Thom Holwerda on Tue 3rd May 2011 13:33 UTC
Apple has updated its line of iMac desktops, with new processors and bumped graphics specifications. The new models also include an HD FaceTime camera, which is certainly useful. The most interesting feature is, of course, that they include Thunderbolt ports - one on the 21.5" model, and two on the 27" model.
Thread beginning with comment 471789
RE[5]: radeon M?
by Neolander on Wed 4th May 2011 16:18 UTC in reply to "RE[4]: radeon M?"

Agreed; as a side issue, I'd be concerned if the fans didn't switch on under a heavy load lol. It is amazing how many Mac and non-Mac users try to claim that 'fans going equals bad design' whilst ignoring what the purpose of a fan is in the first place ;)

Indeed. Besides, I wonder: once fans make less noise than our urban environment, are they really that much of an annoyance? I read some reviews before buying my laptop, and the reviewers measured the noise during typical use at 30 dB(A). In practice, the road traffic heard through my insulated windows is frequently louder, and yet the laptop stays cool. Once we've reached that point, what are those few remaining dBs worth, save for the glory of technological achievement?

Another problem people ignore is this: most CPUs and GPUs have a maximum temperature of around 110 degrees Celsius, so it is amazing how many paranoid people wet their pants when the temperature reaches 60 degrees Celsius. I've pushed my MacBook Pro's CPU up to 95 degrees Celsius without impacting any of the other components.

Yes and no.

On one hand, my experience tends to match yours. My former desktop's CPU went as high as 90°C, due to a mixture of inefficient technology (ah, the Athlon XP...) and an incredibly low-end, noisy fan that couldn't be replaced. And when I swapped that machine for a more powerful one after 7 years, the only part showing its age was the failing HDD.

However, in theory, running CPUs at high temperatures comes with the risk of reducing their lifetime. So if you keep your hardware longer than I did and try to reach the theoretical lifetime of a CPU, which is around 20 years IIRC, you might find it to be only 10 or 15 years due to accelerated ageing. In fact, the chip industry's lifetime measurements rely on exactly this effect: components are heated well above their typical operating temperature precisely in order to shrink their lifetime to something practical to measure.
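To put a rough number on that accelerated ageing, the semiconductor industry commonly models it with an Arrhenius acceleration factor. The sketch below is illustrative only: the 0.7 eV activation energy and the two temperatures are assumed values, and real failure modes each have their own parameters.

```python
import math

# Boltzmann constant in eV/K
K_BOLTZMANN_EV = 8.617e-5

def acceleration_factor(t_use_c, t_stress_c, ea_ev=0.7):
    """Arrhenius model: how many times faster a chip ages at
    t_stress_c than at t_use_c (both in degrees Celsius).
    ea_ev (activation energy) is an assumed, illustrative value."""
    t_use = t_use_c + 273.15      # convert to kelvin
    t_stress = t_stress_c + 273.15
    return math.exp((ea_ev / K_BOLTZMANN_EV)
                    * (1.0 / t_use - 1.0 / t_stress))

# Under these assumptions, running at 95 °C instead of 60 °C
# ages the silicon roughly an order of magnitude faster:
af = acceleration_factor(60, 95)
print(f"acceleration factor: {af:.1f}")
```

With these (assumed) numbers the factor comes out around 10x, which is why burn-in testing at elevated temperature can compress decades of wear into a practical test window.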

Plus, silicon is far from the best thermal conductor on Earth, and the CPU's thermal sensor only reports the temperature at one point of the die. The peak temperature within the chip can be higher than the reading you get, so it's better to keep some safety margin.

Finally, in small computers where space constraints limit the airflow (laptops, all-in-ones, mini-PCs...), a hot CPU can heat up some other component which *does* mind the increased temperature. As an example, I've heard that HDD lifetime drops sharply above 60°C, and that laptop batteries also HATE heat (which would be one of the main causes of premature battery death).

So although it works in practice, I do think that letting CPUs get too hot is a mistake that can cost the life of some components if you're unlucky, especially on lower-end hardware (think Acer notebooks).

That being said...

Oh well, I must remind myself not to get myself dragged into endless debates ;)

Me too ;)

Edited 2011-05-04 16:20 UTC

Reply Parent Score: 1