Updated: Fedora was right in the middle of announcing all this properly, so here is the updated item containing the official names. Videos included, as well as the inevitable ‘Why not Xgl?’. “AIGLX is a project that aims to enable GL-accelerated effects on a standard desktop. We have a lightly modified X server (that includes a couple of extensions), an updated Mesa package that adds some new protocol support and a version of metacity with a composite manager. The end result is that you can use GL effects on your desktop with very few changes, the ability to turn it on and off at will, and you don’t have to replace your X server in the process.” This is part of Fedora’s Rendering Project, and instructions on how to install all this are available too.
Hi,
What are the differences from XGL and Compiz? And are there any screenshots of Bling?
Greetings
Mike
“What are the differences from XGL and Compiz?”
Compiz is basically Novell reinventing the wheel
http://mail.gnome.org/archives/metacity-devel-list/2006-February/ms…
XGL is a different approach from XeGL, which happened directly in Xorg CVS with multiple vendors other than Novell working on it
Yeah, like Metacity wasn’t reinventing the wheel when it was created? There were like 100 different really powerful window managers they could have used. Hell, edge resistance is a “new” feature in Metacity; it’s been in other WMs for like 10 f’n years.
The goal of Metacity was not to create a really powerful window manager. The goal was to create an easy, unobtrusive window manager. And they reached that goal.
To quote Antoine de Saint-Exupéry: “Perfection is reached, not when there is no longer anything to add, but when there is no longer anything to take away.”
with multiple vendors other than Novell working on it
Not just Novell; several people from the X.org community helped merge XGL into the X.org tree.
Please, can we have something that WORKS perfectly stably, instead of the Metacity compositor or xcompmgr or Bling? I’m getting sick of seeing composite managers popping up telling the same story. Work together on this, for god’s sake.
Xgl is great, but it’s just not good enough for games and a few other things (experimental), so please make Xgl work perfectly and forget composite managers.
Didn’t you realize that COMPiz is a COMPosite manager/window manager?
Didn’t you read the website? Go read the “How is this different than XGL?” paragraph.
Yes, you are absolutely right. So why don’t you create something instead of complaining? Let’s see your contribution.
I don’t care for visual effects and the beauty of an interface. The only things I care about are performance and a simple, easy-to-use interface. Thank God Red Hat is focusing on their interface.
I love Linux, but the only reason I don’t use it as my desktop OS is its slow GUI response. I use its CLI, and I use Linux only for programming and other work. The Windows GUI is really fast compared with GNOME or KDE.
I am happy that the Fedora community is focusing on their interface. I think all of the distributions must focus on their GUI performance.
Same here… Gnome is my favourite, but its performance is unacceptable. Too bad. I would gladly install Ubuntu or Suse on my mother’s computer (an old Celeron); the Linux desktop has matured enough for that, but its performance would be embarrassing compared to Win2k… Maybe Xfce 4.4? We’ll see.
Xfce is faster than GNOME and KDE, but it doesn’t contain many features that I want. This is one of the biggest reasons that I hate Xfce. There are very few applications available for Xfce.
The poor performance of the Linux interface is due to the way it’s implemented. The Windows GUI is faster because its interface is implemented at the kernel level, in kernel mode. This comes down to how the interfaces are implemented, and both approaches have their own advantages and disadvantages.
Same here. I had to upgrade my grandma’s computer and push the clock up on my own when moving to Linux because, from what I understand, X.org suffers from horrible latency problems.
I haven’t upgraded to 6.9/7.0 yet, but hopefully some of this has been addressed.
Same here. I first installed (Red Hat) Linux on my PII 333 back in ’99. After playing around and learning what interested me, I became increasingly annoyed at the unresponsiveness of GNOME or KDE on top of X. XFCE and WindowMaker were considerably more nimble, but felt awkward and uncomfortable to me. Eventually I swore off Linux until the problems with X’s sluggish performance were solved, or until X is DEAD (perhaps preferably the latter). Until I can have the low latency that I crave in a full-featured GUI, Linux will always remain somewhat of a sideshow to me. I wish it didn’t have to be this way, but it does.
Why is it that RedHat always seems to take a “not invented here” approach to advances developed outside of RedHat? I thought the idea of Fedora was to open up the geek-flavour of RHL to permit more involvement with the community. All this appears to be is Yet Another Acceleration of X development. Could time be better spent on improving XGL, which (based on the notes in that thread) is more stable, applicable to more hardware, and a better implementation?
Or am I missing something?
“Why is it that RedHat always seems to take a “not invented here” approach to advances developed outside of RedHat? I thought the idea of Fedora was to open up the geek-flavour of RHL to permit more involvement with the community”
This work started within the Xorg team long before Novell decided to take a different approach. So ask Novell about this instead of RH.
“All this appears to be is Yet Another Acceleration of X development. Could time be better spent on improving XGL, which (based on the notes in that thread) is more stable, applicable to more hardware, and a better implementation?”
You can continue to believe that code that is already in Xorg CVS is not stable compared to Novell’s closed development, which has undergone no peer review or integration with the primary Xorg project.
“Or am I missing something?”
Yes. You are barking up the wrong tree. Neither Red Hat nor Mandrake nor any other organisation is working on XGL; they have gone for a different approach. Novell basically decided to do closed development and now wants everyone to accept that.
http://lwn.net/Articles/171155/
http://mail.gnome.org/archives/metacity-devel-list/2006-February/ms…
You can continue to believe that code that is already in Xorg CVS is not stable compared to Novell’s closed development, which has undergone no peer review or integration with the primary Xorg project.
I will, because unlike you I have tried both the new Metacity and XGL, and XGL is far better and more stable. Long-standing composite problems are fixed completely by XGL. If you want a list, find one here:
http://linuxeyecandy.blogspot.com/
And deal with the fact that behind closed doors at Novell David did better than the rest of the community all put together.
dude. metacity is a window manager. XGL is an X server. You are comparing apples to oranges. Understand what you are testing or trying out, at least.
dude. metacity is a window manager. XGL is an X server. You are comparing apples to oranges. Understand what you are testing or trying out, at least.
You know that I was talking about Compiz.
That does not change the fact that the Metacity videos are primitive compared to the eye candy I have working on my Ubuntu desktop today. As in right now. Not some magical point in the future.
That’s right, XGL works NOW; your help in the Ubuntu forum on Xgl has been outstanding.
Xgl is the only thing level with Vista, and composite managers on their own don’t cut it.
Manmist,
RE: “You are barking up the wrong tree. Neither Red Hat nor Mandrake nor any other organisation is working on XGL; they have gone for a different approach. Novell basically decided to do closed development and now wants everyone to accept that.”
So what you’re implying is that we should ignore Novell R & D resources which are structured and well financed and instead look to disorganization as being good for consumers and the Linux community as a whole? You seem to forget what Novell has done for the Linux community and consumers in general. Such as releasing projects like YAST and AppArmor under the GPL, fighting FUD from SCO and Microsoft, etc. What have companies such as Red Hat and Mandriva done in the last year that is so significant and wasn’t just an attempt to make money?
The way I see it is that Novell made a business decision that was in the best interest of consumers, not just their customers, and used their resources to complete the project on time. Then they released the finalized project for other Linux developers to either use or not. There’s nothing stopping Red Hat, etc. from reviewing the code as it’s under the GPL. Let’s also realize that Microsoft temporarily stopped development of WinFS for Windows Vista so they could push it out the door sooner rather than later to market globally (Q3/Q4 2006 instead of Q4 2007). Sometimes having a small trained group working on a project is better because it’s more focused than having a large group, which tends to cause longer discussion and thus delays a project release.
“So what you’re implying is that we should ignore Novell R & D resources which are structured and well financed and instead look to disorganization as being good for consumers and the Linux community as a whole?”
There is absolutely no need to do this behind closed doors.
“You seem to forget what Novell has done for the Linux community and consumers in general. Such as releasing projects like YAST and AppArmor under the GPL, fighting FUD from SCO and Microsoft, etc”
SELinux is already upstream. Why is Novell selling security as a proprietary add-on? The user space solution for AppArmor is still proprietary, for your information.
“There’s nothing stopping Red Hat, etc from reviewing the code as it’s under the GPL.”
It’s not. It’s under the Xorg license. At least get the facts straight.
“What have companies such as Red Hat and Mandriva done in the last year that is so significant and wasn’t just an attempt to make money?”
Who maintains glibc, gcc, large portions of the Linux kernel, a large number of GNOME modules, etc.?
Manmist,
Regarding AppArmor: “SELinux is already upstream. Why is Novell selling security as a proprietary add-on? The user space solution for AppArmor is still proprietary, for your information.”
These links http://developer.novell.com/wiki/index.php/Apparmor_FAQ and http://www.novell.com/products/apparmor/overview.html will explain the product better to you. As for your “proprietary add-on” comment, I’ve already stated AppArmor is licensed under the GNU General Public License (GPL) and therefore has no licensing fees. Novell is not forcing anyone to use the security product. What’s good is that by releasing the product under the GPL they are giving everyone access to another security tool with which to secure their networks.
Regarding the XGL license: “It’s not. It’s under the Xorg license. At least get the facts straight.”
Well then what you’re implying is that information online such as here http://en.wikipedia.org/wiki/XGL where it’s stated XGL was released open source 02/02/06 is incorrect.
So what you’re implying is that we should ignore Novell R & D resources which are structured and well financed and instead look to disorganization as being good for consumers and the Linux community as a whole? You seem to forget what Novell has done for the Linux community and consumers in general. Such as releasing projects like YAST and AppArmor under the GPL, fighting FUD from SCO and Microsoft, etc. What have companies such as Red Hat and Mandriva done in the last year that is so significant and wasn’t just an attempt to make money?
I think none of the posts said that, at least not in this tone. Novell rightfully got all the bling from this. No argument here.
Personally, I’m all for a solution like this sooner rather than later; that’s why I use Novell’s XGL now. But I won’t mind keeping the Novell-like bling after X.Org reorganizes and standardises it into something that can be called a standard, “just works” solution. And having more solutions makes it possible to pick the best one as the default. That is what free software is all about, and it is how things have worked in OSS from the dawn of time.
The way I see it is that Novell made a business decision that was in the best interest for consumers, not just their customers and used their resources to complete the project on time.
Yep, and this is where problems can start. You just have to look at it from a distance. Read the XDevConf papers and connect the presented technologies.
X needs a lot more global attention than just eye candy.
Let’s also realize that Microsoft temporarily stopped development of WinFS for Windows Vista so they could push it out the door sooner rather than later to market globally (Q3/Q4 2006 instead of Q4 2007). Sometimes having a small trained group working on a project is better because it’s more focused than having a large group, which tends to cause longer discussion and thus delays a project release.
Which works like a charm in MS’s case. MS doesn’t need to worry about getting attention from HW vendors. Linux is not so lucky.
And they all have a common working environment. How would this work in free software, where people mostly live in different countries?
What you said is equivalent to Novell taking the whole of Linux development on itself and boosting production. It is just too big a project for that to happen.
What have companies such as Red Hat and Mandriva done in the last year that is so significant and wasn’t just an attempt to make money?
You *are* kidding right?
I’d suggest you check the kernel, gcc and gnome change logs.
Just type “gcc “at redhat dot com” site:gcc.gnu.org” in your google search bar.
Gilboa
First, this is not to belittle Novell’s compiz+XGL. I don’t. I’m still picking my jaw up from the floor ever since I saw it.
Why is it that RedHat always seems to take a “not invented here” approach to advances developed outside of RedHat?
Because those two are different; read
http://freedesktop.org/~krh/aiglx.pdf
Another work from RH here:
http://people.freedesktop.org/~ajax/xdc2006.pdf
And a fairly related one from Sun
http://mediacast.sun.com/share/alanc/xdevconf06-fbpm.pdf
Again presentations from NVidia
http://download.nvidia.com/developer/presentations/2006/xdevconf/nv…
http://download.nvidia.com/developer/presentations/2006/xdevconf/co…
more here (hackfesting at XDC)
http://anholt.livejournal.com/
or at least the part where he mentions that AIGLX has been an unimplemented plan for 5 years already
This is the correct approach (at least in my opinion) to this problem. Novell went for a faster implementation to get the same result, and RH went for a more politically correct implementation where even Novell’s solution could still fit with not so many modifications, but suddenly with politically correct status and the same effect.
And if you read these papers you can see how nicely all the work stacks up: ( Power Management (Sun) + X Deconstruction (RH) ) => Indirect GLX or the NVidia presentations => (Composite manager of your choice: Compiz, Metacity or some other) => hopefully what we all wish for
In the end the result should be the same as now with Compiz+Xgl, except everything should “Just work” instead of “Maybe if you’re lucky” like it is now
As for people saying XeGL is the right solution: well, yes it is. But unfortunately only as much as saying the NVidia driver is the right one for your card (yes, it works if your card is NVidia. Otherwise? Well, your X sucks majorly since it doesn’t work). XeGL depends on hardware that can drive the needed resources (in this case GL support; read the NVidia paper for better info), while AIGLX overlays X and works without GL support too. You simply have to disable it, nothing else.
All this appears to be is Yet Another Acceleration of X development.
Yes, it probably seems so. But after you study the XDevConf papers, not really. At least I see it as the politically correct solution.
Could time be better spent on improving XGL, which (based on the notes in that thread) is more stable, applicable to more hardware, and a better implementation?
Probably, if Novell hadn’t chosen to work in a closed session. But the projects still differ quite a lot.
Novell went for a faster implementation to get the same result, and RH went for a more politically correct implementation where even Novell’s solution could still fit with not so many modifications, but suddenly with politically correct status and the same effect.
Isn’t the community great? Political correctness and politics matter more than what is done. Until Redhat fixes the huge problems with their framework (and adds a LOT of stuff to Metacity), XGL + Compiz is the better solution. It’s more stable for me than any other composite manager so far, and I REALLY try to find bugs. Who cares what makes the Gnome developers sleep better at night?
In the end the result should be the same as now with Compiz+Xgl, except everything should “Just work” instead of “Maybe if you’re lucky” like it is now
XGL + Compiz now works on far more cards than Redhat’s Metacity compositor does. It doesn’t have problems with video playback like the Metacity compositor (and all other compositors) does. If work were poured into XGL it would be ready by next year. A year’s worth of work on the Redhat framework is what’s needed to catch them up to what XGL + Compiz can do today.
Of course, Gnome might go the Redhat route since it’s run by a bunch of no-fun traditionalists. Thank god for KDE4, which will take David’s work on XGL and run with it. If Gnome goes the Redhat route then the tradition of Metacity being the more boring major window manager will continue!
Believe it or not, I actually _want_ a boring window manager. Really. I want my window manager to be as boring and unobtrusive as possible, because I want to get my work done.
I agree. Why does the Linux desktop progress so slowly? I think the community is doing too much discussion of the architecture rather than actually getting things done.
Why does the Linux desktop progress so slowly? I think the community is doing too much discussion of the architecture rather than actually getting things done
You are wrong. While users uselessly argue, developers are actually doing work.
But the fact is that right now, none of the solutions really works and can replace anything, unless you want to restrict yourself and are willing to use development code.
None of these can replace my current XOrg with multiple simultaneous sessions of Gnome, KDE, XFce. None of these work reliably with the binary NVidia drivers, and none allow me to watch accelerated (overlay) videos with NVidia cards. None of these compositors work with KDE either.
And still, people here argue about code that is not even stable yet. I mean, some people love to whine and argue.
My POV is that people loved to argue that Vista was years ahead (even though it’s not even out), forgetting Mac OS X, and now we have two big projects that just show that FOSS, as always, never stopped going forward while people were disparaging it. Because now we have at least two big, very promising projects. That some people prefer one or the other is irrelevant; you have the choice to install one, the other, both or none.
Fragmentation. A completely separate OpenGL rendering server for Linux, rather than Novell and Redhat working together.
What’s next? Will both work on app-folder packaging formats that are incompatible with each other? Maybe they should hack their own email clients so someone with SUSE can’t send mail to Fedora users.
I haven’t seen this bling in action, so I can’t comment on it, and I’m sure these people have worked hard, so it’s their superiors who are to blame for this.
This is called competition. And competition is one of the key factors in constant evolution. The more competition, the more evolution, the better the products will be.
You should be happy.
Why don’t they just work together? I keep hearing this and I have to wonder about all that talk.
Why don’t the Gnome and KDE developers just stop creating their own desktops and go ahead and make the Knome project? Forget the different focuses and differences in thought.
Why do we need Gaim when we have Kopete? Or Koffice when OpenOffice is already there?
Who needs a choice? I guess I don’t.
One of the main differences that I see in Bling is that they seem to have font rendering in mind as well as eye candy and compositing. They are using Pango to help composite and draw the fonts as well. Thinking of text, not just 3D effects. As far as I know, neither Compiz nor Xgl takes this into account. If I am wrong, please let me know.
JRM7
Surely because KDE has 65% of the Linux desktop and GNOME 20%.
KDE has applications for every domain; that’s not true for GNOME.
(I’m a KDE user…)
Do you have a survey that backs your “65%/20%” statistics?
http://www.desktoplinux.com/articles/AT2127420238.html
Use Google to find more surveys.
OK.
I remember reading these numbers before.
However, this survey doesn’t really follow strict rules (it doesn’t sample the general Linux-using population… e.g. enterprise users were unlikely to participate) and I doubt that the resulting numbers are valid.
Nevertheless, having no numbers of my own (and being a KDE user myself) I will concede…
Gilboa
Search the web; you will find similar numbers…
Also, a couple of months ago, Novell said the majority of their customers use KDE…
I’m pretty sure corporations who use Mandriva also use KDE…
Same thing for Linspire…
The desktoplinux.com survey is completely bogus, and not relevant to real world KDE/Gnome usage. As you can see on your link, there was vote tampering by the Yoper community. The Yoper votes were later deleted from the Distribution category, but that still leaves the rest of the survey severely skewed towards the KDE-centric Yoper.
As for online surveys, their value is questionable at best.
Or Koffice when OpenOffice is already there?
Because KOffice was there before OpenOffice was open-sourced…?
That’s really not the point now, is it? According to the idea that we don’t need more choice, OpenOffice, which has a code base much older than KOffice’s, should be the only one worth furthering, and the KOffice devels should just go ahead and contribute to OO.org development instead. This is how I feel about the people screaming “Why don’t they just work together?”
JRM7
Apparently, you are not very aware of the situation in OO.o. The code there is almost a decade old, and the code base is almost unmaintainable due to its large size.
Even though there are many groups contributing to the auxiliary projects (translations etc.) around OO.o, there are very few outside developers contributing code to OO.o and this is the reason.
KOffice has a much cleaner API than OO.o and is much easier to hack on. OO.o will need to modularize its code base and eliminate even more outdated code before they can get community contributions flowing in.
I don’t want to sound too Zen, but I think you see my point even if you missed it the first time around. I wasn’t saying that KOffice was worse off than OO.org. My whole point is that without the different projects we would be stuck with one or two things.
If we take the mentality that they should just work together and not have their separate projects, we are worse off.
Maybe I failed to insert enough sarcasm into my original statement…. I am too dry at times.
JRM7
One of the main differences that I see in Bling is that they seem to have font rendering in mind as well as eye candy and compositing. They are using Pango to help composite and draw the fonts as well. Thinking of text, not just 3D effects. As far as I know, neither Compiz nor Xgl takes this into account. If I am wrong, please let me know.
Most of the font work is done on the Cairo end of things, and David (the creator of XGL) made glitz, which is used to accelerate Cairo.
So technically XGL will work smoothly with what is already done. This is better than Redhat’s approach, which is to make a new framework and talk about it a lot in the hope that others will extend it for them…
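(As a side note, the backend independence being described here can be sketched with a tiny, purely illustrative pycairo example; assume modern pycairo bindings, which are not what existed at the time. The application only talks to Cairo, so whether Cairo is backed by X Render, by an in-memory image as below, or by the GL-accelerated glitz backend is invisible to the drawing code.)

# Illustrative sketch only (assumes the pycairo bindings). The calls below
# are backend-agnostic: here they target an in-memory image surface, but the
# identical calls would work if Cairo were backed by X Render or by glitz.
import cairo

surface = cairo.ImageSurface(cairo.FORMAT_ARGB32, 320, 80)
ctx = cairo.Context(surface)

ctx.set_source_rgb(1, 1, 1)          # white background
ctx.paint()

ctx.select_font_face("Sans", cairo.FONT_SLANT_NORMAL, cairo.FONT_WEIGHT_NORMAL)
ctx.set_font_size(24)
ctx.set_source_rgb(0, 0, 0)          # black text
ctx.move_to(10, 50)
ctx.show_text("composited text")     # font rasterisation is Cairo's job

surface.write_to_png("composited_text.png")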
Well, actually it’s the other way round: AIGLX uses the existing pieces and implements accelerated 3D with them. (The problem lies within the drivers, which have to be adapted to these changes.)
GLX however requires a completely new X server. In other words: throw away the complete Xorg codebase and start anew.
And you may know (or not): GLX still requires a host XServer. It is not a complete XServer itself. It’s a hack.
GLX however requires a completely new X server. In other words: throw away the complete Xorg codebase and start anew.
Not true. XGL extends the old Xserver. Xegl replaces it.
Actually, both reuse the current XServer. And XGL isn’t a hack — it simply uses the current X server for GL because no standalone OpenGL stack is yet available on Linux. DRI-EGL will fix that, but it’ll take a while to get it working.
My two bits: AIGLX simply perpetuates the idea that any significant changes to X will cause the world to end. That’s quite a sad attitude, really. XGL is The Right Way (TM) to handle extending X11 into a 3D accelerated world. However, it also requires major changes to the driver infrastructure which, frankly, are necessary anyway. Both Microsoft and Apple have decided to bite the bullet and overhaul the GL stack so the higher level software can do the right thing, and it’d be a shame if X cut corners in this regard. It also illustrates very well the “let’s extend EXA a little bit more” slippery slope that John Smirl warned about.
It will never happen because we will fight over the name.
Should we call it Knome or Gknome?
seriously now….
Is it just me, or are both GNOME and KDE way faster than the Windows GUI (KDE being even faster than GNOME)? I never understood these complaints about the Linux GUI.
Athlon64 2800+ and 512 MB RAM
It’s just you. KDE and GNOME are considerably slower than 2K/XP on the same hardware.
I use both GNOME and XP and can’t tell the difference in speed, if there is any. The only difference I can notice is the lag between clicking on the applications menu in the panel and it actually showing the menu. Maybe it’s just you?
It is my opinion that much of X’s perceived slowness and much of Gnome’s perceived slowness is really Firefox and Thunderbird drawing performance on X. (I can’t speak for the Windows versions since I don’t have a Windows machine.)
Firefox and Thunderbird are *very* popular applications and run on many people’s desktops.
Try this experiment:
1. Open up 2 Gnome apps.
2. Move one window around while overlapping the other and observe the performance.
3. Now open up a Gnome app and move it around on top of a Firefox session. Observe the major difference in performance.
4. Substitute Thunderbird in the same experiment, if you like.
My intent is not to criticize the Mozilla guys; I’m sure they have good reasons for doing things the way that they do.
Anyway, I imagine there is room for performance improvements in Gnome. (Can’t speak for KDE as I’m not as familiar with using it on a day to day basis.) I’ve been hearing very good things about Gnome 2.13/2.14 performance.
Anyway, I run a lot of my users on xdmcp Gnome desktop sessions into a central server at 100mbit. The hardware I use is usually either old or very inexpensive. For reasons of standardization, I leave the X driver set to vesa, which means no hardware acceleration. And interestingly (to me anyway) I have never gotten a complaint about redraw speed from anyone. It’s just not an issue for my users.
When I hear people complaining about X’s slowness, I sometimes wonder if it’s like people who trash transistor amplifiers because vacuum tube amplifiers are so superior. Maybe they are superior and maybe they aren’t, but how many non-audiophiles are going to notice or care?
Windows versions of Firefox and Thunderbird are faster. On Linux they are so slow.
On Windows I think they are using the native widgets.
On Linux the widgets are drawn by GTK, I think. It is just weird because other GTK apps are faster (e.g. epiphany).
I would like to know why they are so slow on Linux. Does anybody know?
Actually, all Mozilla apps use XUL as their toolkit. This is even true on Windows. The reason you don’t notice any substantial difference between the native widgets and the XUL versions is their use of the native theme APIs to render parts of the XUL widgets.
As for performance, Gecko 1.9 (the rendering engine used by Fx and Tb for HTML and XUL) is set to revamp the underlying graphics layer and use Cairo instead for all rendering. Gecko 1.9 will also feature reflow re-architecture which will speed up the rendering engine further.
If you are interested, you might want to test some of the experimental nightlies (at the moment only available for Windows and Linux).
Bah, for a few days I thought the turf wars were over… and that all Linux distributions were working together to build a UNIQUE hardware-accelerated desktop which would leave Vista in the dust… well, it was a nice dream.
(I compiled Xgl and compiz… awesome stuff)
Who cares if Red Hat and Novell work together, as long as the end result is open source? How is this any different from other competing projects with different developer bases?
Personally I think Novell is doing the right thing and that EXA will die. People complain that XGL might not perform on older hardware. Well, guess what? Linux itself stops performing on older hardware as well. All new software stops performing on old hardware. Get used to it.
Close your eyes… clear your mind… read the entire discussion again… doesn’t it sound silly?
Which one is right? which one is wrong? How the code design should be done, how you should or not try different ideas…
Well, in the end it’s all politics… everyone trying to convince the others how much better their ideas are… Because, in the end, that’s how the world works. There’s no right or wrong, just different opinions and points of view. Let’s just try to respect everyone else and, if possible, live and work together. Maybe then these projects will also come to a common point (or ‘direction’ if you don’t like ‘points’).
Maybe we should just analyze the advantages and disadvantages of each one, instead of trying to blame the ones who didn’t agree in the beginning. What can each one learn from the other? (Well, because they are both “open”, right? There’s nothing but ego and a little work limiting the use of each one’s experience. There’s no intellectual property being harmed or any problem like that.)
There are no crusades. We can’t save the world because it doesn’t need to be saved.
We can just live and try to coexist the best way we can…
by an accelerated desktop. For that, at least with GNOME, you need to look at the memory and speed improvements within GNOME itself.
It is disappointing that they haven’t gone with the XGL and Compiz stuff, since it appears to work so well (well, it’s still very much beta, but you can see where they are going).
As for the closed development approach, I feel that Novell has done a good job explaining this. While it would have been nice to have them at least say they were working on it (and to expect it), they were able to put out something that works and that the community can now make better. This is as opposed to several community-led efforts that resulted in half-done implementations.
Architecturally, it seems like this is the best solution. But in terms of “out the door now, we have more bling than you”, Xgl/compiz wins hands down. This is a massive duplication of work, with Novell and Redhat not collaborating on the development of a unified platform to make X “prettier”.
Natural selection happens every day in the OSS world. Which will die first?
I don’t think that there is duplication of effort here. Both projects have pros and cons.
XGL seems to be a better short term solution, it works with existing drivers (including proprietary, binary-only nvidia and ati), and supports must-have extensions like XVideo and accelerated OpenGL.
Architecturally AIGLX might be better, but only time will tell if this outweighs its current lag in other areas.
Also, Novell seems to have won the backing of NVidia, which is a clear advantage not to be ignored.
What are you talking about, XGL is a better short-term solution? You think that 3D rendering is going somewhere? Anything involving any kind of 2D rendering _anywhere_ on the desktop is a short-term solution. If that isn’t obvious to you, I think you need to look at OS X and Vista, which are the competition.
You misread me.
Yes, XGL is a better short term solution to get 3D on the Linux desktop. I think it’s quite obvious that XGL will bring a fully functional 3D desktop to Linux sooner than AIGLX. Which doesn’t mean that in the longer term AIGLX won’t be a better solution.
I don’t think he misread you.
If AIGLX is better at any term at all, it should be in the short term, as it only patches some 3D effects into a 2D environment, without disturbing the current X code much.
XGL(XEGL) is a more ambitious solution which aims to map all of the 2D environment over GL(EGL) and needs a much bigger overhaul of the X code.
Or that is the idea I have got out of that extremely indigestible acronym soup — X XORG XF86 XGL XEGL GLX GL EGL DRI DRM XAA EXA AIGLX Cairo Mesa XDamage XComposite XRender …
From the AIGLX wiki, this is not my understanding; AIGLX and XGL are said to be functionally equivalent.
When both are finished, they’ll do the same thing, but the path to get there and the resulting architecture will be different.
What do you think XGL can do that AIGLX can’t?
I might well be wrong, but I understand from the Fedora site that AIGLX’s purpose is to make it easier to “use GL effects”, while XGL tries to accelerate all X graphics output via GL.
For instance, GTK+ talks to Cairo, and Cairo would talk to X via Render in AIGLX, with no 3D intervention, while it would directly output OpenGL calls via Glitz in XGL.
If this is true, both would allow funky accelerated window managers, but XGL would accelerate Gnome as well. Again, I might be wrong: this is all VERY confusing.
Hi,
what are these complaints all about?
We have a problem to get solved:
* X is slow, pain in the ass slow, f–kING SLOW! (this is mostly due to all the data that has to flow between client and server every time a f–king window moves over another, a menu pops up, and so on)
* NOT using the GPU for graphical(!) output is a shame, yes a shame! And it is not possible with the current X
We have two possible solutions:
* build something on top of something else, which is a really evil way of doing it, and hope to get further support from hardware vendors to eventually get rid of this stacking
* enhance/modify the existing framework, which is clean but breaks compatibility with current drivers
Both solutions have their up- and their downsides. No one can know today which one will work out better.
Both are under open(!) development right now, both in Xorg CVS.
Now let’s see which one will work best! There is much shared between them, which will help in getting a uniform “best” solution.
This is not about taste (unlike KDE vs Gnome), so they won’t coexist for a long time. Calm down and feel lucky not to miss out on one option for the solution to this really pressing problem.
Regards,
Ford Prefect
“These links http://developer.novell.com/wiki/index.php/Apparmor_FAQ and http://www.novell.com/products/apparmor/overview.html will explain the product better to you. As for your “proprietary add-on” comment, I’ve already stated AppArmor is licensed under the GNU General Public License (GPL) and therefore has no licensing fees.”
Bull. The user space tools are still proprietary. Only the kernel part has been GPL’ed. SELinux is already in the upstream kernel. What about AppArmor?
“Well then what you’re implying is that information online such as here http://en.wikipedia.org/wiki/XGL where it’s stated XGL was released open source 02/02/06 is incorrect.”
Incorrect. I am stating a fact: XGL is NOT under the GPL but under the Xorg license. Looks like you don’t understand licensing differences at all. Novell developed XGL behind closed doors while the rest of the vendors work on Xorg CVS directly. That’s what Novell as a proprietary vendor is doing. Playing tricks.
Novell developed XGL behind closed doors while the rest of the vendors work on Xorg CVS directly. That’s what Novell as a proprietary vendor is doing. Playing tricks.
Ahh… so that is why Novell’s work is now running on other distros (like Ubuntu) while only Fedora has this new Redhat stuff? Is that why Novell’s work didn’t enter the Freedesktop CVS (oh wait, it did)?
Is that why Novell’s “tricks” are more stable than the composite work that came before them? Is that why Novell worked with ATI and Nvidia so that XGL would be fully supported by future driver releases, while Redhat only has the open source drivers working (which means no high-end hardware can use it)?
Is Novell being a “proprietary vendor” why Novell can show RIGHT NOW what DOES WORK in its compositor, while Redhat is mostly talking about what could be done?
Then screw it, I prefer the Novell approach. And so do thousands of others that have been messing around on the other forums (such as Gentoo’s and Ubuntu’s) to get XGL to work, while almost no forum has a thread on getting the Metacity compositor to work.
Manmist, I work on the AppArmor product team in Novell’s SUSE Labs.
All AppArmor code is released under the GPL or LGPL: http://forge.novell.com/modules/xfcontent/downloads.php/apparmor/De…
This includes the logprof/genprof profile generation tools. This includes the YaST GUI front end. This includes the report generation tools. Everything.
What you may be thinking of is the AppArmor integrated into SUSE Linux 10.0 and SLES9 Service Pack 3; these were released before AppArmor tools were GPL’d, and were thus released under the older proprietary license. We plan on releasing an update for SLES9SP3 and SL10.0 in the future to relicense the files and packages, but CODE10 development has taken priority.
The AppArmor team also plans to submit the kernel module for inclusion in kernel.org kernels. However, six years of development without concerted cleanup efforts along the way requires time to “clean up” to the kernel’s high standards.
The page says
“NVidia has told us that they will be adding support for GLX_EXT_texture_from_pixmap in the next release of their binary driver, which will enable the aiglx code to run.”
Yet GLX_EXT_texture_from_pixmap is already in the 8178 nvidia driver; if it were not, we wouldn’t have gotten compiz to work. My glxinfo says it’s there.
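(For what it’s worth, here is a purely illustrative way to check which GLX extension strings a driver actually advertises. It assumes the stock glxinfo utility is installed and a running X session; the small helper function is invented for this sketch.)

# Illustrative sketch: run the standard glxinfo tool and report whether a
# given GLX extension string (here GLX_EXT_texture_from_pixmap) appears
# anywhere in its output. Requires glxinfo and an active X session; the
# has_glx_extension helper name is made up for this example.
import subprocess

def has_glx_extension(name):
    output = subprocess.run(["glxinfo"], capture_output=True, text=True).stdout
    return name in output

if __name__ == "__main__":
    ext = "GLX_EXT_texture_from_pixmap"
    print(ext, "is advertised" if has_glx_extension(ext) else "is NOT advertised")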
This extension is not yet supported by any of the existing OpenGL drivers. It is not present in the 8178 Nvidia drivers. It is planned that following versions of the Nvidia drivers will have this extension.
Hopefully ATI will follow suit and quickly add support for this to their proprietary drivers…
The reason the extension isn’t already available in the Nvidia drivers is that the extension hasn’t been completely fleshed out yet, i.e. there is still ongoing discussion on its final standardization. Much of the problems being discussed were the subject of meetings at the last XDevConf, and some of those present have stated that there appears to be consensus on this extension and that some of these problems were resolved in those meetings…
Sorry my fault, I was looking at the wrong string in glxinfo.
Does this mean that older cards like my Geforce2 Pro will not be supported by either XGL or AIGLX?
Does this mean that older cards like my Geforce2 Pro will not be supported by either XGL or AIGLX?
By XGL probably not.
But AIGLX has a software fallback, where you can at least disable it or replace it with the default RENDER acceleration of your card. Read the NVidia XDevConf papers for that info.
This is where the XGL and AIGLX structures differ. One is limited and the other tries to support as much as possible.
But in the end the effects provided to you should be the same.
Does this mean that older cards like my Geforce2 Pro will not be supported by either XGL or AIGLX?
By XGL probably not.
Bull. There are already people on the Ubuntu forum getting XGL to work at some level with VERY old Nvidia hardware, while the wiki page for Redhat’s work says almost no drivers work.
This post should probably serve as a better answer than I could ever make:
http://www.0xdeadbeef.com/?p=178
And the fact is, they never said it is a final solution. They opened the project after discussions at XDevConf, taking into account all the talks between NVidia, DRI, XOrg…
And where on earth did you get the impression that I would be belittling Novell’s work? It’s official. XGL IS A JAW-DROPPER.
I’m already running compiz and Xgl on my GeForce 2 Go with a Pentium 3 M. Trying to watch movies and shit like that is totally unusable (it skips a bit; totally unusable might be a little extreme), but the actual desktop effects like the wobbly windows are actually smooth… dragging windows around the desktop is actually smoother than before. I even tried loading up a TON of windows and using stuff like the expose and cube features constantly. Occasionally it will skip around when doing those effects, but for the most part it’s still pretty smooth. The only issue is the nvidia driver in Xgl is a little bit buggy; I get some artifacts all around my desktop, especially with Firefox for some reason.
I’m not trying to troll here, but what’s the performance of these effects? Am I being paranoid by noticing that the minute display on the clock changes in all three of the very “short” movies? Big coincidence, or a really slow effect sped up in the movie?
Hi,
here you can find some demos:
http://fedoraproject.org/wiki/RenderingProject/aiglx
Greetings
Mike
So last week we had a story about a marketing cooperation between KDE and Gnome. We have things like freedesktop.org and the Portland project and all this other supposed cooperation stuff, but let’s face facts. If there had been cooperation 9 years ago or so, desktop Linux wouldn’t have all the trouble getting out of the hobbyist status that it finds itself in today.
But the zealots will go into denial and say crap like “It’s the Linux way” and everything is great and just take whatever crap they’re spoon-fed by the developers.
Oh well, Microsoft and Apple just laugh at these antics.
Have you checked the video card status for AIGLX?
– nvidia: not working
– ati r300/400: not working
– i810: occasionally working
– i830-945: working
– other: not working
Seems we have great technology with no support for any hardware that might actually make use of it. Great “not working for YOU” technology.
On the other hand, nasty-awful-made-by-Novell Xgl is already there, working for anyone with a hardware-accelerated Xorg.
It’s nice that AIGLX provides us with hardware acceleration in a more sane way than Xgl, but it’s useless at the moment. Nvidia support might be there in a few weeks, but what about ATI?
It’s nice that AIGLX provides us with hardware acceleration in a more sane way than Xgl, but it’s useless at the moment. Nvidia support might be there in a few weeks, but what about ATI?
It helps if you actually RTFA.
They announced this project now. They never said it is usable already. Novell released a closed-doors project (as open and already in a working state) and RH opened an open-doors project (but in the beginning stages).
AIGLX is just a better (or at least politically more correct) replacement for XGL somewhere in the future; meanwhile we can all enjoy the great work Novell has done. Isn’t that great? You can enjoy the tech even though it doesn’t officially work yet, and use the second project at hand that works better (even though it is not so politically correct). In the end you won’t mind one tech being replaced by another if the second is better.
The only difference is that now it is working for some people and then it should work for everybody.
ATI will follow the movement, don’t worry. Just as all the DRI drivers will.
VGA compatible controller: ATI Technologies Inc Radeon R250 Lf [FireGL 9000] (rev 02) (prog-if 00 [VGA])
Works on my laptop.
VGA compatible controller: ATI Technologies Inc Radeon R250 Lf [FireGL 9000] (rev 02) (prog-if 00 [VGA])
Works on my laptop.
So you mean it doesn’t work on any card made in the past 5 years or so? Great.
Is there any way for us Windows-sissies to view those demo movies?
Just use VLC and stop whining.
Besides, why were you using anything else? VLC plays everything (from ogm to DVD) without installing codec packs.
Here it is in a nutshell:
Redhat: SAYS that the amazing effects Compiz provides COULD be done with their framework (but all they have to show for it is a Metacity that is primitive in comparison). Says that the big problems with its framework today (video playback, drivers, etc.) COULD be fixed.
Novell: SHOWS what amazing effects Compiz + XGL DOES provide today with their framework. Has SHOWN that their framework can fix some of the biggest problems with composite in the past (video playback, not being able to use the closed-source ATI driver, etc.).
It’s the difference between a possibility and a certainty. When Redhat actually SHOWS what their framework can do AT THE LEVEL of what Novell has SHOWN, then I will sing a different tune. Till then, Novell is pioneering the future of X while Redhat is making promises.
Some quotes from Christopher Blizzard’s post [1] :
“However, there’s been a huge number of external contributions to the aiglx work from outside of Red Hat and one of the primary components of aiglx (the pixmap to texture extension) actually comes from the XGL! It’s just where it’s integrated that’s important. At worst, it’s a competition, at best, it’s inadvertent teamwork.”
“Because that’s where the real value is in compiz – not the window management capabilities, but the great 3D effects it has. It would be great if we could get the best of both worlds and deliver a unified solution.”
Just take a look at the entire post. There’s some useful info from this dev’s point of view.
[1] http://www.0xdeadbeef.com/?p=178
I was just going to quote from the same post to note that, as usual, the devs tell a different tale: not a RedHat vs. Novell all-out war, but competition with cooperation, typical of open source development.
Folks, can we stop seeing it either black or white? This is free software we’re talking about, the (comparative) waste of resources is more than compensated by the fact that choice is good, that different paths can be tested and tried, and that good code doesn’t get wasted anyway, as the example above shows.
Enough with silly flamewars.
rehdon
I built Xair with the instructions from the Fedora people. If you have an r300 chipset, you won’t be using Xair for a bit longer.
I used xserver-HEAD and got further results, your mileage may vary.
Screenshots can be found on my website:
http://www.sh0n.net/spstarr/xair/
Shawn.
My r200 can work, but it’s a little slow actually; indeed, the GL extension they use is not GLX_EXT_texture_from_pixmap, but GLX_EXT_texture_from_drawable.
Since my system uses nvidia, I cannot test AIGLX. Hopefully the update will come ASAP.
The general idea behind AIGLX is more or less the same as how Xgl is implemented, though Xgl uses a general X server to manage X rendering. But before we see AIGLX done, we should rely on Xgl, because I don’t think open-doors development will make fast progress. The difference in progress between the open-doors and closed-doors development of Xgl proves this.