The cooperation between the XGL and AIGLX projects to bring better interfaces for the Linux desktop continues as David Reveman (Novell) of XGL has agreed to adopt many changes from the AIGLX project sent in by Kristian Hogsberg (Red Hat).
I have yet to try any of them, but the screencasts are way cool!
Cooperation is one of the great possibilities with open source development… It should be done more often. :o)
Both are great…cooperation and competition…
But I prefer cooperation
I prefer both. OSS is one of the few places where competition and cooperation can go hand in hand.
will not require such beefy hardware? Isn’t AIGLX the tech behind the Luminocity videos from a year ago?
The reason I ask is because I have heard of many problems with the XGL/Compiz tech regarding older GFX hardware.
There should not be graphics effects that AIGLX can do and XGL can’t, or the reverse. In the end, all this work is done just to offload work to the GPU. So when all this work is really finished (1–2 years at least), what will limit the amount of graphical effects you can get is your graphics card, regardless of AIGLX or XGL.
Also notice that both AIGLX and XGL are not the final solution; they’re just different methods to make it easier to evolve X.org into something like Xegl, AFAIK (XGL being the most radical and the one that needs the most work).
> There should not be graphics effects that AIGLX can do and XGL can’t, or the reverse. In the end, all this work is done just to offload work to the GPU.
This is why libCM is becoming a kind of standard between them.
> So when all this work is really finished (1–2 years at least), what will limit the amount of graphic effects you can get is your graphics card, regardless of AIGLX or XGL.
HW is not the limit. You actually don’t need a lot of it for basic compositing.
> Also notice that both AIGLX and XGL are not the final solution; they’re just different methods to make it easier to evolve X.org into something like Xegl, AFAIK (XGL being the most radical and the one that needs the most work).
Nope. Xegl is something completely different. It is meant to drive the same hardware, but it is still a completely different approach.
I really hope I will put this correctly.
Xegl does 3D drawing in the base server (all 2D is handled through 3D, with no separate 2D path), and its footprint is small. It has trouble with legacy hardware, but it is ideal for embedded systems and some specific hardware.
AIGLX uses 3D where it is needed, and it runs in the base server. It even allows disabling 3D and rendering through the 2D path, meaning legacy hardware should work.
XGL is an overlaid 3D server, and is ideal for purposes not covered by the first two.
Xegl is not the final solution. Xegl is just one of the solutions (and will probably be used in a lot of implementations), but I suspect Xegl is not desktop material. In my opinion, the best tech for the desktop is AIGLX.
The final solution, as you called it (or the best outcome of this evolution), will be the option to pick whichever of the three is most suited and works best for your specific case and use. And I seriously doubt you’ll be running the same one across different deployments.
There are many possible software strategies for the Linux desktop to take advantage of GPU off-loading. I still don’t think a consensus has been reached on this point.
1) Fork – the current X server is mature and it fully supports 2D hardware just fine. It’s not too hard to add new 2D drivers to it. For OGL-capable hardware, build a new server like Xegl with an Xlib compatibility layer. The new server is completely dependent on OGL and drops all of the ancient 2D support.
2) New server only – Build a new server like Xegl but port support for all of the old 2D hardware to it by using software Mesa to emulate OGL hardware. Single code base but a lot of effort will need to be put into software Mesa.
3) Build both a 2D and OGL based server in the same app. That’s what we are doing right now.
There are many diverse opinions for and against each of these strategies. I personally favor the first one since I believe it requires the least coding effort. The third one requires the most effort in the long run but it can be built incrementally.
I also think that it is a futile effort to try and support both the 2D and OGL hardware from a single server. It makes the server too big for the low end 2D users and it forces unnecessary constraints onto the high end OGL user. It is a simple observation to make that these OGL based composition manager demos are never going to run on old 2D hardware.
I do get upset when I hear the political arguments that we shouldn’t “screw the low end”. Looks to me like the low end is getting all the support right now and it’s the high end that is getting screwed. That argument is like telling owners of SCSI hardware that they have to run it in IDE emulation mode since it would be “unfair” to the IDE owners.
Just like AIGLX is using the pixmap-to-texture extension and wants to use Compiz effects as well, it seems to me that the media has somehow managed to make people think there’s a “confrontation” between XGL and AIGLX (just like SELinux vs. AppArmor), and people have spoken about “Red Hat vs. Novell” etc., when the fact is that this kind of competition is the rule (KDE vs. GNOME, Java vs. Mono, D-Bus vs. DCOP, GStreamer vs. the alternatives), not the exception.
If by “wanting to use compiz effects as well” you mean “merge the 3D effects into Metacity”, then yes.
From what I have read on the Ubuntu forums, the developers in both camps expect all this work to be put into Metacity and, as stated earlier, X.org.
If by “wanting to use compiz effects as well” you mean “merge the 3D effects into Metacity”
Yes, sorry for that
Compiz is an excellent WM in its own right, and a lot of the code for Compiz is from Metacity, just modularized. Having used Compiz and looked through the source, it appears that Compiz is cleaner and more powerful than Metacity.
With Compiz, you are able to dynamically load and unload shared modules which govern the effects of the compositing manager. The settings that these modules use are fetched through a configuration backend; the one currently in use is GConf.
This is actual collaboration between different projects and will definitely help both of them, and us USERS too.
Kudos for taking such a step.
This is great news. A few days ago (maybe I was uninformed) I thought there would be some kind of war between the two camps. It’s a great thing to see collaboration, especially for all the parties involved (including me as a consumer).
Good luck, the project is coming along nicely. I applaud the efforts.
The people who sell magazines or ads on their web sites get more ad views from controversy. The UK’s Register takes that approach to an extreme; when one of their columnists writes something outrageous and false, everyone links to it, everyone checks out the site to see what the outrage is about; they all write in asking for the writer (usually Andrew Orlowski) to be fired, and the owners give old Andrew a bonus for raking in all the ad revenue.
So, “KDE and GNOME at each others’ throats!” gets more hits than “KDE and GNOME cooperate”, even though the latter is more common than the former.
For example, I submitted a Slashdot story describing the fact that Trolltech is adopting GNOME’s glib for their event loop, so that GNOME and KDE code can be freely mixed in the same application, and that this will probably be in Qt 4.2. Once that happens, the war is pretty much over; everyone can freely mix and match KDE and GNOME code. Slashdot rejected the story. But any time a KDE advocate (often a non-developer) slams GNOME or vice versa, Slashdot will run the story.
See
http://www.scheinwelt.at/~norbertf/common_main_loop/
Exactly, I agree. OSNews and Slashdot are actively provoking the community. Recently GnomeDesktop.org published an article which explains X’s background and how AIGLX and XGL actually share a lot of code with each other, and neither Slashdot nor OSNews published it. The community receives too much bad news, which makes people flame Linux and OSS all the time.
Just a small correction. You probably meant this blog entry: http://blogs.qtdeveloper.net/archives/2006/02/24/qt-and-glib/.
AFAIK this has nothing to do with nf2’s work (your link).
Otherwise I agree with what you wrote.
Very wise decision on Novell’s part. I guess big corporations are not always evil … rofl
No, seriously, I respect Novell for being very reasonable and trying to avoid another double standard. I hope RH can be just as wise, because Linux and the OSS community already have enough forks and projects that do the same thing.
Novell, even though they have made some boneheaded moves in the past, have really been doing a great job in the past couple of years. If we can continue to see more cooperation between Novell and Red Hat, then I think we’ll see a lot of great advances in Linux in the near future from those two.
The technology in Vista blows away some hacked up XGL and whatever crap that RedHat puts out. Linux is great at putting out technology demos with no thought to the programming model to actually give app developers something to use.
Linux just continues to fall further behind windows and osx. That’s fine for hobbyist kids and for the server, but not for professionals.
The technology in Vista blows away some hacked up XGL and whatever crap that RedHat puts out.
Oh really? You mean Windows Vista right? The OS that hasn’t shipped yet.
Linux is great at putting out technology demos with no thought to the programming model to actually give app developers something to use.
App developers have nothing to do with this. The only changes necessary are drivers, X, and a compositing window manager.
Linux just continues to fall further behind windows and osx
That’s funny because I haven’t found the options for the hardware accelerated/3D desktop in XP yet. You mind showing me where those options exist?
GDI+ uses hardware acceleration for many things, so yeah, I’d say that Linux is behind it currently as well.
Vista is something like 8-9 months away from shipping, and is already feature-complete. The betas that have been released so far are quite usable, just buggy.
Whether you like it or not, Linux *is* currently behind, and will be behind for the next while.
If you are using Ubuntu Dapper, Xgl is in the universe repo and is available now. Having used it for the past few weeks, I can vouch that the technology is complete and *stable*.
I have used Vista builds in the past few months and honestly, the Aero Glass interface not only has inferior effects to what is provided in Compiz, but the desktop performance was inferior as well (apart from window-resizing performance, which Aero handles better).
> I can vouch that the technology is complete and *stable*.
Thanks for the laugh. 🙂
It is *far* from complete.
That said, I haven’t had any stability problems whatsoever.
So many Linux folks continue to ignore this reality. X11 is just plain slower than Windows. I have yet to use a Linux desktop that is snappier than Windows, even XP with all the eye candy turned on. I’ve used a multitude of different configurations and not one has even matched Windows in terms of snappiness and lack of flicker.

I’m typing this right now on a nice new shiny ThinkPad T43 with 1 GB of RAM, a 2 GHz Pentium-M and a 64 MB ATI Mobility X300. And when I drag windows, trails get left behind. I see flicker with Qt apps and sometimes with GTK apps. There is flicker when flipping desktops. Windows on the same machine is very snappy and there’s never flicker or slowness.

Add to that the fact that half of the features of X don’t work, or require hours of hacking around with config files, drivers, CVS builds and such to get working, and I think I can safely say that Windows just plain does a better job than X11. It’s sad too, because X11 has a nice protocol; just the implementation sucks.
> So many Linux folks continue to ignore this reality.
Siride,
I hope you believe me when I say that I am sincerely baffled when I hear people say this.
I use Linux/X on my own desktop. My machine is pretty respectable and the card is a pretty nice AGP card. So it’s not surprising that I find my own desktop to be pretty snappy.
However, my major business is supporting clients running Linux terminals off of Linux desktop servers via XDMCP sessions over 100 Mbit Ethernet, and over WAN connections using the NoMachine NX protocol.
The remote clients are usually configured to use the vesa driver for a number of reasons. (My custom CentOS kickstart CD set uses very conservative settings for maximum compatibility, since I want my clients to be able to install on new workstations without my help.)
At any rate, with the vesa driver there is no hardware acceleration for 2D, let alone 3D. A number of the Linux boxes I have out there are converted Win98 machines with 64 MB RAM and whatever crufty old video chipset came installed in them. Others are new machines.
We run Gnome desktops from a CentOS 4.2 server.
And screen update performance has simply never been an issue. In fact, the only comments that I have gotten from users wrt performance regard how much *faster* they are after the conversion. (This is not a gui issue, but a result of the server being faster than their machine.) I would consider this a worst case scenario for X.
On the remote boxes, I can tell that I am not running on the server console. But I would hardly say that there is any usability problem.
It is simply not an issue for me and my customers.
I’ve come to the conclusion that people must mean that with X you can sometimes notice that something was not absolutely instant.
All I care about is that everyone has usable desktops. “Improve screen update performance” is item 137 on my todo list. Why do people nitpick so?
Is there some combination of hardware and software out there in which X responsiveness is a significant problem that I have simply never run across? I’d have thought that the vesa driver on an old machine via remote XDMCP would be about as bad as it could get.
So many Linux folks continue to ignore this reality. X11 is just plain slower than Windows. I have yet to use a Linux desktop that is snappier than Windows, even XP with all the eye candy turned on.
Did you use the ATI driver on both Windows XP and your desktop environment to validate the comparison?
And when I drag windows, trails get left behind. I see flicker with Qt apps and sometimes with GTK apps.
Sounds like a driver issue. Did you use the generic or the ATI driver?
Just to experiment, use only the generic driver on both Windows XP and your favorite DE on your Linux distro and see if your statement holds. Given your laptop, it is clear the vendor already includes a proprietary driver for hardware acceleration. Also, specify which distro you used on your laptop.
Addendum: ATI drivers for Linux distros users are known to perform poorly due to ATI lethargy AFAIR.
So many Linux folks continue to ignore this reality. X11 is just plain slower than Windows
You mean, the windowing system that worked on 386 PCs ? And you think you’re credible ?
So many trolls continue to ignore this reality : Windows is just plain slower than X11.
I have yet to use a Linux desktop that is snappier than Windows, even XP with all the eye candy turned on
Let me tell you a story: despite years of using Windows 9x, I only understood what a snappy desktop was when I saw some Unix guy use twm on Linux and XFree86.
In the time it took you to move your mouse and launch an app, he had launched 3 and started working in one.
I’ve used a multitude of different configurations and not one has even matched Windows in terms of snappiness and lack of flicker
So, given my experience, I can tell you you’re wrong. Anyway, when I see the lockups I get on WinXP SP2 right now, on a P4 3+ GHz with 1 GB RAM that I use at work, as soon as Windows experiences a little CPU or memory load (with 1 GB RAM, amazing), or sometimes without any load, I would not brag about Windows if I were you.
You claim Windows is snappy and has no flicker, while even moving Notepad right now shows tearing, and some apps show trails.
My Linux desktop experience at home is consistent even under big loads (two simultaneous compilations, given that I always have 3 different desktops loaded, and only 1 GB RAM), and all the desktops I run are 1600×1200.
I’m typing this right now on a nice new shiny ThinkPad T43 with 1 gig of RAM, 2 GHz Pentium-M and a 64 MB ATI Mobility x300. And when I drag windows, trails get left behind. I see flicker with Qt apps and sometimes with GTK apps
Stop lying, please. BTW, GNOME is double buffered, so you can’t see tearing or flicker in GNOME apps. And the apps you talk about are very specific apps that take time to redraw parts of their window.
There is flicker when flipping desktops. Windows on the same machine is very snappy and there’s never flicker and slowness
You’re right not to talk about Windows’ desktop-flipping ability; you’re better off not doing so.
The truth is that flicker when flipping desktops is at worst barely noticeable; it is not even a real problem. Dragging windows is not even something people do constantly; only unproductive trolls move windows all day long. The fact is that on Windows, most people run apps fullscreen, few people actually use drag and drop, and most use cut and paste.
And people do the same when they move to Linux; that’s why people complained when GNOME moved to spatial mode: it involved using more drag and drop instead of cut and paste.
The fact is that in GNOME or KDE, when you log in, the desktop is there the way you left it, so people rarely need to move windows around. So your tired straw man of why X11 is inadequate/slow/whatever is just stupid, but you want people to think it’s a real problem.
Before trolls told me that moving OOo over Firefox would leave trails (if you move the window fast, to add to the stupidity of this test), I would never have realised it did, because I NEVER had any incentive to do that; so this is not even a real problem.
Add to that the fact that half of the features of X don’t work, or require hours of hacking around with config files, drivers, CVS builds and such to get working, I think I can safely say that Windows just plain does a better job than X11. It’s sad too, because X11 has a nice protocol, just the implementation sucks
But mostly you’re just a moron, because all you say there is just false. It’s your problem if you lose hours with CVS builds as a user. Even I don’t do that, and I made my own Linux OS at home. And sure enough, all the features of X work at home, and on all the Mandriva installs I set up for my users, without losing hours in config files and CVS builds.
GDI+ uses hardware acceleration for many things, so yeah, I’d say that Linux is behind it currently as well.
Errr, anything using OpenGL can be hardware accelerated on a Linux system; hardware acceleration is just as transparent to applications using OpenGL as it is to those using GDI.
Vista is something like 8-9 months away from shipping, and is already feature-complete. The betas that have been released so far are quite usable, just buggy.
Whether you like it or not, Linux *is* currently behind, and will be behind for the next while.
Windows Vista is Windows XP with a 3D desktop that will require some pretty hefty 3D hardware, and I know, because we have MSDN subscriptions and get the releases. The only other thing it seems to have is some multimedia stuff, and the interface is obviously the result of a great deal of soul-searching within Microsoft as they wish for something that Apple has and they haven’t got: style.
Goodness knows what it will consume when people use the full 3D, hardware accelerated desktop and then run a full 3D game on top of it. There is absolutely nothing that is revolutionary or different about it from a usability or functionality point of view. It’s the same old update to Windows, and when it gets released people will say “Oh right” and then carry on with what they were doing before, just like they did with Windows XP and just like they did with Windows 2000 when that was promised as an uber advanced OS for the next ten years.
I’d actually say that Linux is going to end up being ahead, because the approach of XGL and AIGLX is to be able to use compositing in an efficient manner so you won’t need full hardware acceleration for everything.
Windows Me
“Windows Me: PC Health Features Keep PCs Stable, Secure and Reliable — and Take the Frustration Out of Computing for Home Users”
http://www.microsoft.com/presspass/features/2000/sept00/09-05winme….
Windows 2000
“Our primary goal is to improve security and safety for all our customers — consumers and businesses, regardless of size — through a balance of technology innovation, guidance and industry leadership,” Gates said. “We’re committed to continued innovation that addresses the threats of today and anticipates those that will undoubtedly emerge in the future.”
http://www.microsoft.com/presspass/press/2005/feb05/02-15RSA05Keyno…
Windows XP
“Windows XP is the most secure and dependable operating system we have ever produced.”
http://www.microsoft.com/presspass/press/2002/aug02/08-30WinXPSP1PR…
Windows Vista
“In Vista, it should be much more difficult for unauthorized programs (like Viruses and Trojans) to affect the core of the OS and secretly harm your system.”
http://www.extremetech.com/article2/0,1697,1931914,00.asp
If you’re lucky, all of the stuff in there may work by the next version of Windows.
I enjoy Linux just as much as the next guy, but honestly…
Vista is out, I have it on one of my computers, albeit a “beta” release.
XGL (and the like) is also out, I have it on one of my other computers, and just like Vista, it is a beta release.
I would like someone to step forward and make the claim that XGL (and the like) are stable and/or “finished”, and I will easily point out an idiot.
I would like someone to step forward and make the claim that XGL (and the like) are stable and/or “finished”
Finished? No. Stable? Quite, but of course YMMV.
Please.
Will you people drop the “hefty requirements” argument already? Refuting it is getting to be too painful to bear.
Heftier than XGL/AIGLX, it seems.
It also does a hell of a lot more than XGL/AIGLX. In any case, by the time Vista is out later this year, anyone who has a system that is overall Vista-worthy will have at least a Radeon 9XXX/GeForce FX original or equivalent. Even Intel’s GMA950 is capable of a lot of Vista effects.
Vista requirements are exaggerated. So it won’t run on your Voodoo 3 … boo hoo.
It also does a hell of a lot more than XGL/AIGLX.
Such as? The only real difference I’ve heard of so far is pixel shaders, and I don’t see why those couldn’t be added to XGL. One thing we know is that the open-source desktop has developed faster than the proprietary ones, so we’ll see where we are when Vista comes out, and then a year later…
In any case this is quite off-topic. I’m personally happy to see cooperation between the two projects.
Will you people drop the “hefty requirements” argument already? Refuting it is getting to be too painful to bear.
I’m sure we could all run Windows XP on 128 MB of RAM, but of course, no one does. Why? Because getting the OS up and running is one thing, but getting it up and running, installing software and using it on a regular basis is something entirely different.
GDI+ uses hardware acceleration for many things, so yeah, I’d say that Linux is behind it currently as well.
I said hardware acceleration/3D. Those are not available for Windows yet, and that is what XGL/AIGLX provide.
Vista is something like 8-9 months away from shipping, and is already feature-complete. The betas that have been released so far are quite usable, just buggy.
Well it is a lot easier to get your hands on XGL and is probably less buggy and more usable than Vista is right now.
Whether you like it or not, Linux *is* currently behind, and will be behind for the next while.
That’s a pretty poor attempt at disguising your opinion as an authoritative response on this matter.
You must be stoned to think Vista will be out this year…
Besides that, Linux stuff is out now and will continue to be released. Microsoft has not released Vista and therefore can’t even compete…
Just because you went and downloaded a copy off of a BitTorrent tracker does not mean anything.
You need to be re-educated on your social responsibility not to promote proprietary software.
The failure of Linux continues to become apparent when you compare the programming models and tools between Vista and Linux.
Not only is XGL totally unstable, but it doesn’t change anything at the higher level that application developers use.
Most of the Unix guys have moved on to OSX after all the years of failure on desktop linux. It was a nice experiment, but now it’s over, and wobbly windows don’t cover up linux failures.
What is it, exactly, that Vista supplies that XGL and Linux doesn’t? Details, please…
Vista supplies the entire WPF programming model. XGL is just an X server that uses OpenGL.
I know the difference between WPF and XGL. I agree that Windows is currently ahead, but I see strong evidence that Linux is catching up quickly. Basically, someone just needs to create a good library and the WPF advantage would be gone.
Wow, the Windows fanboys sure came out of the woodwork for this one. Funny considering that there’s no mention of Windows or Microsoft in the article. Almost makes you think that the windows apologists are starting to get scared of not being in the majority anymore.
Seriously guys, they’re just operating systems. Would it be so terrible to just admit that they both have their strengths?
Wow, the Windows fanboys sure came out of the woodwork for this one. Funny considering that there’s no mention of Windows or Microsoft in the article. Almost makes you think that the windows apologists are starting to get scared of not being in the majority anymore.
Doesn’t stop the linux toddlers from coming out on every windows article. Tit for tat, shitstain.
Almost makes you think that the windows apologists are starting to get scared of not being in the majority anymore.
Haha, with OSX at 2.5% and Linux at 2% or less, maybe in the next century windows won’t be in the majority.
I am very impressed by the progress that has been taking place in offloading work to the GPU. I am also very impressed that the open source model once again shows its strength, as witnessed by this announcement of cooperation and code sharing between these two (quote/unquote) *competing* projects. At first glance, it only makes sense: why write 6000 lines of code from scratch when somebody has already done the work? This demonstrates that even projects with somewhat diverging aims can greatly benefit each other, and isn’t that what open source is all (or at least partially) about? Bravo.
I am not equipped to comment on the code patch in question, nor on the validity of the approach of these projects. All I am is an interested end user who would like a more responsive yet full-featured GUI experience on Linux. Though the realization of my dream is probably years away, I am nonetheless thrilled that the *process* by which this end is being met is progressing in what I would deem a good way.