Linked by Thom Holwerda on Thu 2nd Mar 2006 12:58 UTC, submitted by Rahul
X11, Window Managers The cooperation between the XGL and AIGLX projects to bring better interfaces for the Linux desktop continues as David Reveman (Novell) of XGL has agreed to adopt many changes from the AIGLX project sent in by Kristian Hogsberg (Red Hat).
Thread beginning with comment 100700
modmans2ndcoming
Member since:
2005-11-09

will not require such beefy hardware? Isn't AIGLX the tech behind the Luminocity videos from a year ago?

The reason I ask is that I've heard of many problems with the XGL/Compiz tech regarding older GFX hardware.

Reply Score: 1

diegocg Member since:
2005-07-08

There should not be graphics effects that AIGLX can do and XGL can't, or the reverse. In the end, all this work is done just to offload work to the GPU. So when all this work is really finished (1-2 years at least), what limits the amount of graphic effects you can get will be your graphics card, regardless of AIGLX or XGL.

Also note that neither AIGLX nor XGL is the final solution; they're just different methods to make it easier to evolve X.org toward something like Xegl, AFAIK (XGL being the most radical and the one that needs more work).
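To make the GPU offload part concrete: under either approach, the compositing manager first asks the X server (via the Composite extension) to redirect windows offscreen and hand their contents back as pixmaps, which the GPU then draws. Here is a minimal sketch of that first step, assuming a Composite-capable server (error handling kept to a minimum, build with -lX11 -lXcomposite, and the window used for naming a pixmap is left as a placeholder):

#include <stdio.h>
#include <X11/Xlib.h>
#include <X11/extensions/Xcomposite.h>

int main(void)
{
    Display *dpy = XOpenDisplay(NULL);
    int ev_base, err_base;

    if (!dpy || !XCompositeQueryExtension(dpy, &ev_base, &err_base)) {
        fprintf(stderr, "no display or no Composite extension\n");
        return 1;
    }

    /* Redirect every top-level window to an offscreen buffer; from now on
       the compositing manager is responsible for painting them on screen. */
    Window root = DefaultRootWindow(dpy);
    XCompositeRedirectSubwindows(dpy, root, CompositeRedirectManual);

    /* For each window you want to draw, name its offscreen pixmap; this is
       what later gets bound as a GL texture and drawn by the GPU. */
    /* Pixmap pix = XCompositeNameWindowPixmap(dpy, some_window); */

    XSync(dpy, False);
    XCloseDisplay(dpy);
    return 0;
}

AIGLX and XGL differ in where the GL rendering of those pixmaps happens, not in this redirection step.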

Edited 2006-03-02 13:59

Reply Parent Score: 4

somebody Member since:
2005-07-07

"There should not be graphics effects that AIGLX can do and XGL can't, or the reverse. In the end, all this work is done just to offload work to the GPU."

This is why libCM is becoming some kind of standard between them.

"So when all this work is really finished (1-2 years at least), what limits the amount of graphic effects you can get will be your graphics card, regardless of AIGLX or XGL."

HW is not the limit. You actually don't need a lot of it for basic compositing.
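As far as I understand, the per-window cost of basic compositing is roughly one texture bind and one textured quad, using the GLX_EXT_texture_from_pixmap extension that both an AIGLX-enabled X.org and Xgl expose. A rough fragment of that path follows (my own illustration, not code from either project; the fbconfig, pixmap and current GL context are assumed to exist already):

#include <GL/gl.h>
#include <GL/glx.h>
#include <GL/glxext.h>   /* GLX_EXT_texture_from_pixmap tokens */

/* Draw one redirected window, whose offscreen pixmap is `pix`, as a
   textured quad. Assumes a current GLX context and an fbconfig with
   GLX_BIND_TO_TEXTURE_RGBA_EXT; all error handling omitted. */
static void draw_window(Display *dpy, GLXFBConfig fbconfig, Pixmap pix)
{
    const int pixmap_attribs[] = {
        GLX_TEXTURE_TARGET_EXT, GLX_TEXTURE_2D_EXT,
        GLX_TEXTURE_FORMAT_EXT, GLX_TEXTURE_FORMAT_RGBA_EXT,
        None
    };
    GLXPixmap glxpix = glXCreatePixmap(dpy, fbconfig, pix, pixmap_attribs);

    /* The EXT entry point has to be fetched at runtime. */
    PFNGLXBINDTEXIMAGEEXTPROC bind_tex_image =
        (PFNGLXBINDTEXIMAGEEXTPROC)
        glXGetProcAddress((const GLubyte *) "glXBindTexImageEXT");

    GLuint tex;
    glGenTextures(1, &tex);
    glBindTexture(GL_TEXTURE_2D, tex);
    bind_tex_image(dpy, glxpix, GLX_FRONT_LEFT_EXT, NULL);

    /* One textured quad per window: this is the whole per-frame cost,
       which even modest GL hardware handles easily. */
    glEnable(GL_TEXTURE_2D);
    glBegin(GL_QUADS);
    glTexCoord2f(0, 0); glVertex2f(-1, -1);
    glTexCoord2f(1, 0); glVertex2f( 1, -1);
    glTexCoord2f(1, 1); glVertex2f( 1,  1);
    glTexCoord2f(0, 1); glVertex2f(-1,  1);
    glEnd();
}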

"Also note that neither AIGLX nor XGL is the final solution; they're just different methods to make it easier to evolve X.org toward something like Xegl, AFAIK (XGL being the most radical and the one that needs more work)."

Nope. Xegl is something completely different. It should drive the same thing, but it is still a completely different approach.

I really hope I will put this correctly.

Xegl does all drawing through 3D (all 2D is handled through 3D, with no separate 2D path) in the base server, and its footprint is small. It has trouble with all legacy hardware, but it is ideal for embedded and some specific hardware.

AIGLX uses 3D where it needs it and runs in the base server. It even allows disabling that and rendering through 2D, meaning legacy hardware should work.

XGL is an overlaid 3D server, and it is ideal for other purposes not covered by the first two.

Xegl is not the final solution. Xegl is just one of the solutions (and will probably be used in a lot of implementations), but I suspect Xegl is not desktop material. In my opinion, the best tech for the desktop is AIGLX.

The final solution, as you called it (or the best outcome of this evolution), will be the option to pick whichever of the three is best suited and works best for your specific case and use. And I seriously doubt you'll be running the same one across different implementations.
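In that spirit, a desktop can probe at runtime what the server it happens to be talking to actually supports. A tiny illustrative check (not from any of these projects) for the texture-from-pixmap extension that GL compositing needs, which an AIGLX-enabled X.org or an Xgl server should advertise (build with -lX11 -lGL):

#include <stdio.h>
#include <string.h>
#include <X11/Xlib.h>
#include <GL/glx.h>

int main(void)
{
    Display *dpy = XOpenDisplay(NULL);
    if (!dpy)
        return 1;

    /* The GLX extension string for this screen tells us whether window
       pixmaps can be bound as textures, i.e. whether GL compositing is
       an option on this server. */
    const char *exts = glXQueryExtensionsString(dpy, DefaultScreen(dpy));
    printf("GLX_EXT_texture_from_pixmap: %s\n",
           exts && strstr(exts, "GLX_EXT_texture_from_pixmap") ? "yes" : "no");

    XCloseDisplay(dpy);
    return 0;
}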

Reply Parent Score: 5

jonsmirl Member since:
2005-07-06

There are many possible software strategies for the Linux desktop to take advantage of GPU off-loading. I still don't think a consensus has been reached on this point.

1) Fork - the current X server is mature and it fully supports 2D hardware just fine. It's not too hard to add new 2D drivers to it. For OGL-capable hardware, build a new server like Xegl with an Xlib compatibility layer. The new server is completely dependent on OGL and removes all of the ancient 2D support.

2) New server only - build a new server like Xegl, but port support for all of the old 2D hardware to it by using software Mesa to emulate OGL hardware. A single code base, but a lot of effort will need to go into software Mesa.

3) Build both a 2D- and an OGL-based server in the same app. That's what we are doing right now.

There are many diverse opinions for and against each of these strategies. I personally favor the first one since I believe it requires the least coding effort. The third one requires the most effort in the long run but it can be built incrementally.

I also think it is futile to try to support both 2D and OGL hardware from a single server. It makes the server too big for low-end 2D users and forces unnecessary constraints onto high-end OGL users. It is a simple observation that these OGL-based compositing manager demos are never going to run on old 2D hardware.
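A rough way to see which side of that divide a given machine falls on is simply to ask GLX whether a context ends up hardware direct-rendered, or whether rendering would go through indirect GLX and, quite likely, software Mesa. This is just an illustrative check, not anything from the X.org tree (build with -lX11 -lGL):

#include <stdio.h>
#include <X11/Xlib.h>
#include <GL/glx.h>

int main(void)
{
    Display *dpy = XOpenDisplay(NULL);
    int attribs[] = { GLX_RGBA, GLX_DOUBLEBUFFER, None };

    if (!dpy)
        return 1;

    XVisualInfo *vi = glXChooseVisual(dpy, DefaultScreen(dpy), attribs);
    if (!vi)
        return 1;

    /* Ask for a direct (hardware) context; glXIsDirect reports whether we
       really got one, or whether rendering would fall back to the indirect
       (and possibly software Mesa) path. */
    GLXContext ctx = glXCreateContext(dpy, vi, NULL, True);
    if (!ctx)
        return 1;

    printf("direct rendering: %s\n",
           glXIsDirect(dpy, ctx) ? "yes" : "no (indirect/software path)");

    glXDestroyContext(dpy, ctx);
    XCloseDisplay(dpy);
    return 0;
}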

I do get upset when I hear the political arguments that we shouldn't "screw the low end". Looks to me like the low end is getting all the support right now and it's the high end that is getting screwed. That argument is like telling owners of SCSI hardware that they have to run it in IDE emulation mode since it would be "unfair" to the IDE owners.

Reply Parent Score: 5