Linked by Thom Holwerda on Sun 10th Oct 2010 14:17 UTC, submitted by Extend
Ubuntu, Kubuntu, Xubuntu
Yes, yes, it's that time of the year again - a new Fiona Apple album confirmed (which makes anything that happens between now and spring 2011 irrelevant and annoying), MorphOS 2.6 released (will be the next news item), and, of course, a new Ubuntu release showcasing the best of the best that the Free software world has to offer in the desktop world.
Thread beginning with comment 444778
RE[5]: Solid release
by Neolander on Mon 11th Oct 2010 14:05 UTC in reply to "RE[4]: Solid release"
Neolander Member since:
2010-03-08

"Looks are important I'm afraid. That's why OS X and Vista/7 look the way they do and why they've started using hardware acceleration and resolution independence amongst other things."

Hmm, not exactly, in my opinion...
-Hardware acceleration is the last refuge of devs who don't know how to write well-optimized code. E17 best shows how badly overrated GPU acceleration is, in my opinion. Using raw CPU power alone, you can make something good-looking, and even pack it with superfluous and hideous eye candy as a technological demo, and still not get a single glitch... That is, if you know how to code.
-Resolution independence in OS X and Win7? Well, sure, there's a bit of it in there, but close to none. Since when did they make applications (i.e. most of the user experience) resolution-independent? Or even provide an API for that? And since when do their resolution-independence algorithms take input resolution into account (i.e. making controls bigger for an imprecise input device like a touchscreen or a pen tablet, and smaller for a more precise one like a mouse)? A sketch of that last idea follows below.
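
To illustrate what "input resolution" could mean, here is a minimal sketch in C (all names and the millimetre figures are made up for illustration, not any platform's actual API) of sizing a control by both output DPI and pointer precision:

    /* Hypothetical sketch: a control is scaled twice, once for output
     * resolution (points -> pixels via DPI) and once for input precision
     * (a fingertip needs a bigger hit target than a mouse pointer). */
    typedef enum { INPUT_MOUSE, INPUT_PEN, INPUT_TOUCH } input_kind;

    int control_height_px(double size_pt, double dpi, input_kind in)
    {
        double px = size_pt * dpi / 72.0;            /* output scaling */
        double min_mm = (in == INPUT_TOUCH) ? 9.0    /* rough fingertip width */
                      : (in == INPUT_PEN)   ? 4.0
                      :                       2.0;   /* mouse is precise */
        double min_px = min_mm / 25.4 * dpi;         /* mm -> pixels */
        return (int)(px > min_px ? px : min_px);
    }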

Edited 2010-10-11 14:08 UTC

Reply Parent Score: 1

RE[6]: Solid release
by nt_jerkface on Mon 11th Oct 2010 14:49 UTC in reply to "RE[5]: Solid release"
nt_jerkface Member since:
2009-08-26

"Hardware acceleration is the last refuge of devs who don't know how to write well-optimized code."


It makes sense for a lot of effects to be handled by the GPU. If the GPU already knows how to process effects like fading, shadows and transparency, then the most efficient approach is to hand them off. You do not want to needlessly waste CPU cycles doing transparency calculations.
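
For a concrete sense of what gets handed off, here is a minimal sketch (plain C, simplified to a single global alpha per surface) of the per-pixel "source over" blend a software compositor runs on the CPU; a GPU performs this same blend in fixed-function hardware:

    #include <stdint.h>
    #include <stddef.h>

    /* out = src*alpha + dst*(1 - alpha), per channel, per pixel. */
    static uint8_t blend_channel(uint8_t src, uint8_t dst, uint8_t alpha)
    {
        return (uint8_t)((src * alpha + dst * (255 - alpha)) / 255);
    }

    /* Blend one row of 32-bit ARGB pixels; a compositor runs this over
     * every row of every translucent surface, every frame it changes. */
    void blend_row(uint8_t *dst, const uint8_t *src, size_t pixels, uint8_t alpha)
    {
        for (size_t i = 0; i < pixels * 4; i++)
            dst[i] = blend_channel(src[i], dst[i], alpha);
    }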

Reply Parent Score: 4

RE[7]: Solid release
by Neolander on Mon 11th Oct 2010 17:52 UTC in reply to "RE[6]: Solid release"
Neolander Member since:
2010-03-08

"It makes sense for a lot of effects to be handled by the GPU. If the GPU already knows how to process effects like fading, shadows and transparency, then the most efficient approach is to hand them off. You do not want to needlessly waste CPU cycles doing transparency calculations."

What do you call "efficient", exactly? On battery-powered computers (and even on the desktop, given today's environmental concerns), consuming little power is a very desirable form of efficiency.

CPUs have relatively small power consumption, and all software needs them anyway. A GPU under load, on the other hand, easily consumes 3x as much as a CPU. Is it really efficient to quadruple the computer's power consumption for the sake of drawing unreadable translucent windows?
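
To put rough numbers on that (hypothetical figures, just for scale): say the CPU draws 25 W under load; a discrete GPU drawing 3x that adds 75 W, so running both means 100 W instead of 25 W - four times the power for the same desktop.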

Now, let's examine the GPU use cases you mention:
-Small fading effects are not very intensive, and they only occur once in a while, so the CPU can handle them with no major performance hit for applications. I disagree with the alleged need for a GPU there.
-Translucency, on the other hand, eats up power ALL THE TIME. Any time you open a menu or a window, the whole layer stack has to be redrawn when translucency is on (see the sketch below). So in that case, you're right, a GPU is welcome - though not needed. Again, look at E17's shadows, and keep in mind that applications stay perfectly responsive on top of them, more so than on GNOME+Compiz in fact.
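
A minimal sketch of that cost (plain C, hypothetical types, assuming full-screen layers and reusing blend_row from the sketch above): with an opaque top window, one copy suffices; translucency forces a walk down the whole stack on every repaint:

    #include <stdint.h>
    #include <stddef.h>
    #include <string.h>

    void blend_row(uint8_t *dst, const uint8_t *src, size_t pixels,
                   uint8_t alpha);            /* from the sketch above */

    typedef struct window {
        struct window *below;                 /* next window down the stack */
        const uint8_t *pixels;                /* this window's ARGB surface */
        uint8_t        opacity;               /* 255 = fully opaque */
    } window;

    void repaint(uint8_t *fb, size_t npixels, const window *top)
    {
        if (top->opacity == 255) {            /* opaque: one blit, done */
            memcpy(fb, top->pixels, npixels * 4);
            return;
        }
        if (top->below)                       /* translucent: must first */
            repaint(fb, npixels, top->below); /* draw everything beneath */
        blend_row(fb, top->pixels, npixels, top->opacity);
    }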

But now, let's consider what that translucency is actually used for:
-Windows 7's unreadable window borders and task bar (your mileage may vary depending on your wallpaper's colors, but in my case the result was so awful I just had to disable it, and mentally thank Microsoft's engineers for providing the option to do so, although they could just as well have made the Basic theme less crappy).
-OS X's unreadable menu bars and distracting menus.
-Shadows that no one except geeks will ever notice.

See where I'm going? When transparency effects are used, it's either to hurt usability or to go unnoticed except for the much-reduced battery life they lead to. I have yet to see a case where transparency is used wisely in a GUI.

Now, please note that I'm not one of those spartans who want every single OS to look as dull as RISC OS' GUI. I love nice-looking UIs, and am not against the use of special effects as long as they're done properly. As horrified as I was by Windows XP's "Fisher-Price my first operating system" look, I'm fond of the Vista/7 look once a few things are done to make it look better, be more usable, and work more smoothly (e.g. disabling window and taskbar translucency). I love having those overused shiny gradients on my buttons and scrollbars, and think the progress bars especially are quite nicely done. And I love the right to have my windows painted in any color I like, too (which is why I don't switch to the Basic theme, by the way, since MS recently decided that changing a window's color was highly computationally expensive and required a GPU to be done properly).

But really, you don't need a GPU for any of that. It's basic use of gradients and animations, with a bit of fading here and there. A lot of it can be rendered in advance and simply blitted on screen as needed (see the sketch below), and the rest can be rendered in real time with little to no performance hit (and no responsiveness hit at all if you know how to do scheduling).
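
For instance, here is a minimal sketch (plain C, made-up sizes) of the pre-render-then-blit idea for a button gradient: the gradient is computed once, and each repaint is just a row-by-row copy:

    #include <stdint.h>
    #include <stddef.h>
    #include <string.h>

    #define BTN_W 256
    #define BTN_H 32

    static uint8_t cache[BTN_W * BTN_H * 4];  /* pre-rendered ARGB strip */

    /* Done once at startup (or on theme change): fill a vertical
     * grey gradient from 'top' to 'bottom'. */
    void prerender_gradient(uint8_t top, uint8_t bottom)
    {
        for (int y = 0; y < BTN_H; y++) {
            uint8_t v = (uint8_t)(top + (bottom - top) * y / (BTN_H - 1));
            for (int x = 0; x < BTN_W; x++) {
                uint8_t *p = &cache[(y * BTN_W + x) * 4];
                p[0] = p[1] = p[2] = v;       /* B, G, R */
                p[3] = 255;                   /* opaque */
            }
        }
    }

    /* Done on every repaint: one memcpy per scanline, no GPU involved. */
    void draw_button(uint8_t *dst, size_t dst_stride)
    {
        for (int y = 0; y < BTN_H; y++)
            memcpy(dst + y * dst_stride, &cache[y * BTN_W * 4], BTN_W * 4);
    }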

On my laptop, I have that dual-GPU thing NVIDIA calls Optimus. You can save an hour of battery by switching to the Intel GPU for most work and going back to the NVIDIA GPU when needed. Now, every time I use it, I think I'd rather gain an extra hour of battery by not using GPU rendering at all, since, as I just showed, it's not needed. How is that wrong?

Edited 2010-10-11 17:55 UTC

Reply Parent Score: 4