Linked by Thom Holwerda on Mon 9th May 2011 21:14 UTC, submitted by Elv13
Since Nokia announced its switch to Windows Phone 7, people have been worried about the future of Qt. Well, it turns out Nokia is still going full steam ahead with Qt, since it has just announced the plans for Qt 5. Some major changes are afoot code- and functionality-wise, but the biggest change is that Qt 5 will be developed out in the open from day one (unlike Qt 4). There will be no distinction between Nokia developers and third-party developers.
RE[2]: Meh
by Neolander on Tue 10th May 2011 20:26 UTC in reply to "RE: Meh"

"You should brush up on how graphics works these days. It's not at all about 'optimizing' - it's about creating a graphics model that works as fast as possible on modern hardware, instead of a 'generic' model that works okay everywhere."

Speed does matter, but it's not the only thing that matters. Battery life, stability and portability matter too, in the context of Qt. Sadly, some of the platforms Qt runs on simply do not have stable, efficient GPU drivers for recent hardware, because of the way the GPU ecosystem works (particularly on the desktop). In fact, considering how easy it is to crash my GPU drivers on Windows, and the driver update that broke something as basic as fan control on high-end graphics cards some time ago, I have to wonder whether a reliable GPU driver for modern hardware exists on any platform...

"GPU is always on anyway. You can waste it and burn the power on CPU instead, or get with the program and use that GPU. Using GPU over CPU prolongs battery life and makes the applications look better."

I don't think it's impossible to turn off a GPU. NVidia's Optimus software does it, I believe, and someone (oahiom, I think) told me that on AMD and Intel GPUs there is even a fine-grained power management model that allows parts of the GPU (e.g. individual execution units) to be shut down.
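
For what it's worth, on Linux laptops with hybrid graphics there is already a crude way to do part of this by hand: the vgaswitcheroo interface lets you power down whichever GPU is not currently driving the display. Here is a minimal sketch of how that looks (it assumes a recent kernel with vgaswitcheroo, debugfs mounted at /sys/kernel/debug, and root privileges; the exact behaviour depends on your setup, so treat it as illustrative):

# Minimal sketch: power the unused GPU on or off via Linux's vgaswitcheroo.
# Assumes hybrid graphics, debugfs mounted, and root privileges.
# Writing "OFF" powers down the GPU that is not driving the display;
# writing "ON" powers it back up.
SWITCH_PATH = "/sys/kernel/debug/vgaswitcheroo/switch"

def set_unused_gpu(power_on):
    with open(SWITCH_PATH, "w") as switch:
        switch.write("ON\n" if power_on else "OFF\n")

# Example: shut down the idle discrete GPU to save battery.
# set_unused_gpu(False)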

While it's true that if you can't turn a GPU off, the most efficient thing to do is to use it, turning off GPUs, or parts of them, is well worth it when CPU power is sufficient. I haven't yet made a proper comparison between software rendering with all GPUs off (or in their minimal power state) and GPU-accelerated rendering, but I can already tell you the difference between an idle Intel GPU and an idle NVidia GPU: the battery life of my laptop is halved in the latter case. That is a striking result, because it means that even if we neglect the power consumption of the Intel IGP, an NVidia GPU doing nearly nothing eats as much power as all the other power-hungry components of a modern laptop (screen, HDD, wireless, CPU...) combined!
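
To put rough numbers on that halving (the figures below are purely illustrative assumptions, not measurements from my machine), it is enough for an idle discrete GPU to draw about as much as the whole rest of the idle laptop:

# Illustrative back-of-the-envelope calculation (assumed values):
battery_wh = 50.0              # typical laptop battery capacity
idle_igp_only_w = 10.0         # rough idle draw with only the Intel IGP active
idle_discrete_extra_w = 10.0   # rough extra draw from an idle discrete GPU

hours_igp_only = battery_wh / idle_igp_only_w                                 # ~5 h
hours_with_discrete = battery_wh / (idle_igp_only_w + idle_discrete_extra_w)  # ~2.5 h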

A more debatable example is a comparison between smartphones with and without GPUs: those without GPUs generally provide pretty smooth graphics too. You lose the shiny effects, granted, but you can also get hardware at half the price with twice the battery life (3 days vs. 1.5 days).

What this shows is that while CPU power management has improved dramatically in the past few years, high-end GPUs are still the power-hungry beasts they were in the days of dedicated graphics accelerators: fine for gaming or high-end graphics, but not for anything that runs on a battery. When I see everyone embracing them for trivial tasks, I do think the computing world has gone completely mad.
