Linked by Thom Holwerda on Sun 10th Oct 2010 14:17 UTC, submitted by Extend
Ubuntu, Kubuntu, Xubuntu
Yes, yes, it's that time of the year again: a new Fiona Apple album confirmed (which makes anything that happens between now and spring 2011 irrelevant and annoying), MorphOS 2.6 released (will be the next news item), and, of course, a new Ubuntu release showcasing the best that the Free software world has to offer on the desktop.
Thread beginning with comment 444830
RE[9]: Solid release
by Neolander on Tue 12th Oct 2010 06:13 UTC in reply to "RE[8]: Solid release"

A GPU can handle 2D and 3D drawing more efficiently than a general-purpose CPU thanks to specialization; it doesn't carry the overhead a modern x64 CPU requires. If you think fancy effects are a waste of power, that is a completely different subject.

You don't need a GPU to do 3D drawing either, but it sure helps, since it is a processor built for exactly that purpose.

The right tool for the right job. A GPU can crunch more numbers for heavy vector and texture calculations, yes. That's why we use it where that power is needed: some games, graphics applications, and more recently scientific computing. What I question is the need for it to joyfully waste electrical power and heat everything up for something as basic as UI rendering. Take the average IE/Word user: wouldn't they rather have longer battery life and laptop hardware that lasts longer?

In Win7, Aero stays on in power-saving mode, but transparency is disabled, since that is what takes the most power. It doesn't have to be an either/or solution. The GPU should at the very least be taken advantage of when the computer is plugged in.

Yes, plugging the computer in solves the power consumption problem. But then another problem appears: heat, and the reduced hardware lifetime that comes with it. Take a Word/IE user who plays nothing beyond Solitaire: wouldn't they rather have much cheaper hardware with no GPU acceleration at all? Hardware which, despite costing less, would last longer, because unlike most cheap hardware it would not overheat so easily.
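For what it's worth, the tiered approach described above (full effects on AC power, cheaper ones on battery, pure software without a GPU) is simple to express. A minimal sketch in Python; the function and tier names are hypothetical, not any real OS API:

```python
def choose_effects(on_ac_power, gpu_available):
    """Pick a compositing tier from the machine's power state.

    Hypothetical policy function, purely illustrative: full effects
    when plugged in, cheaper ones on battery, software-only otherwise.
    """
    if not gpu_available:
        return "software"   # no usable GPU: CPU rendering only
    if on_ac_power:
        return "full"       # plugged in: all effects, transparency included
    return "reduced"        # on battery: drop the expensive transparency
```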

I'm not sure why you think E17 is the answer when it looks dated compared to KDE 4.5 and is developed at a glacial pace. Sure, it looks better than GNOME, but that isn't saying much.

It's not about E17's look (E17 is horrible as far as looks are concerned, we agree, although some themes look considerably better than the default one). I'm talking about rendering technology here.

E17 shows what software rendering is capable of without a major performance hit. I bring it up to answer people who blame the poor performance of Metacity and software KWin on the limits of the CPU, and then ask for a GPU that isn't needed. Just consider that the original Macintosh already ran a full GUI on an 8 MHz 68000; that alone shows what modern CPUs are really capable of when used properly.
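A quick back-of-the-envelope calculation supports this; all figures below are illustrative assumptions, not benchmarks:

```python
# Rough estimate: how much of one CPU core would a full-screen
# software composite cost? All numbers are assumed, not measured.
width, height, fps = 1280, 1024, 60
pixels_per_second = width * height * fps   # ~78.6 million pixels/s
ops_per_pixel = 10                         # rough cost of one alpha blend
cpu_ops_per_second = 2 * 10**9             # a modest 2 GHz core, ~1 op/cycle
load = pixels_per_second * ops_per_pixel / cpu_ops_per_second
print(f"software compositing uses ~{load:.0%} of one core")
```

Under these (generous) assumptions, blending every pixel of a desktop at 60 fps stays well within a single mid-range core, which is consistent with E17's behavior.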

As time passes, people who want to use Word and browse simple websites (webmail, news...) have been asked to buy more and more powerful hardware to do the very same things, or otherwise suffer poor performance. Sometimes this was justified (say, when memory protection was introduced); most of the time it is not. What I'm against is unjustified power waste.

If my OS manufacturer tells me "I'm introducing a new capability-based security system that's safer and more efficient for desktop use than the current user/admin model, but it's a bit power-intensive because of the need to parse complex databases, so minimal specs will rise a bit", I say "please do". If he tells me "Look, I don't know what to do in this release, so I'm going to introduce extremely power-intensive transparency algorithms in the UI. It won't be any easier to use, and arguably it will look bad, but you now need a supported GPU in the minimal specs and your battery life will drop by 20%", then I'm just going to slap him in the face and go look for a more serious OS vendor, and rightfully so, in my opinion.

I also don't think you could push a return to CPU-only effects when browsers are moving towards GPU rendering.

Again, it's a matter of using the right tool for the right job. I'm all for web browsers getting capabilities similar to Flash's as far as gaming is concerned, but for my daily web browsing I'd prefer the GPU to stay off. What I'm against in GPU rendering in browsers is having to keep it enabled all the time, because the "fallback" software algorithms look just as horrible as the Basic theme of Vista and Windows 7; the developers apparently didn't even bother to test them for readability. If people want their computer to last two hours on battery while browsing the web, fine, but no thanks in my case.

It makes more sense to have a windowing system feed the GPU frames just as with a 3D game, especially when open source desktops have limited developers. They shouldn't be bothered with optimizing fade or transparency routines when the GPU already knows how to do them.

To me, this looks like a good example of wasted resources, in development this time. If the open source desktop has enough development resources to make KDE 4, PulseAudio, and Compiz, it has enough resources to make E17 a mature window manager and integrate it into all desktop environments.
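And the core of such a fade or transparency routine really is small. A minimal software alpha-blend sketch in Python; the toy 4-pixel frame and all names are purely illustrative:

```python
def blend(dst, src, alpha):
    """Blend a translucent source pixel over a destination pixel on the CPU.

    Pixels are (r, g, b) tuples with 0-255 channels; alpha is in [0, 1].
    """
    return tuple(int(s * alpha + d * (1 - alpha)) for s, d in zip(src, dst))

# A fade-in is just this blend swept over the frame with a rising alpha.
background = [(200, 200, 200)] * 4   # toy 4-pixel grey frame buffer
window = [(0, 0, 255)] * 4           # a blue "window" fading in
half_faded = [blend(d, s, 0.5) for d, s in zip(background, window)]
```

A real compositor adds clipping, damage tracking, and SIMD, but the inner loop is this per-pixel arithmetic, which is exactly what a CPU does well.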

It's funny you mention the open source desktop, because I thought about mentioning it as an example of why putting a GPU everywhere is bad.
GPUs are a perfect example of effort duplication. Hardware manufacturers couldn't be bothered to agree on standard software interfaces in their little holy war, so drivers almost have to be written on a per-chipset basis. The open source desktop is not a very interesting target for them, and to make things worse, X11/Xorg is so horrible that its devs feel an urge to break it every few months, so the whole hardware-accelerated graphics stack of the open source desktop is in a sorry state. Fixing it would require decades of hard work.

And you want to make vital software like UI rendering and web browsers rely on acceleration, knowing that the software rendering path will never be adequately tested because devs will think "ha, users all have a working accelerated graphics stack anyway"?

This is, in my opinion, just perfectly wrong.

Edited 2010-10-12 06:15 UTC
