“After discussing this with several people I have decided to stop working on EGL and Xgl. The recent work on EXA is going to have the side effect of pushing out any hope for an Xgl release by a year or more. By extending the 2D drivers to accelerate composite end user demand for Xgl will also be reduced. I can not justify devoting further time to this project without reasonable hope of it reaching completion.”
Redhat? Novell? Nvidia? Ati? Microsoft? SCO? McDonalds?
That leaves David Reveman working solo on what is, IMO, one of the most important Linux desktop projects out there.
XGL is horribly undermanned. Even two people working on that infrastructure wasn't good news, never mind one unpaid volunteer ceasing work on it! This is not good.
Ditto the Anon above. Someone, some corporation or foundation, help...
I’d program it myself (imagine the CV!!) but I can barely read and I can only count to 20.
I agree. Isn't desktop acceleration more than just eye candy? Doesn't it speed up the desktop because you're using your graphics card, which was designed for rendering, to help render, freeing up your CPU?
… drivers could help.
For now, NVidia and ATI both provide OpenGL+GLX with their binary drivers; they also provide XRender acceleration. But AFAIK, those can't be used with X (they run "on top of" X), which is not what is needed for Xgl (Xegl/Xglx).
It is smart of Novell to 'sponsor' the Xgl work (David Reveman); I only hope some other vendor (say, Red Hat) would also pay one or two people to work with David.
The actual Xgl implementation works with Mesa Solo (to provide EGL) and a modified r200 DRI driver (to provide OpenGL). It would help to have an IHV providing compatible drivers (ATI/NVidia/Intel hold the biggest share of the market, with ATI/NVidia at the high end).
those can’t be used with X => withOUT X
Unfortunately, 2D graphics (if accelerated) is more than good enough for corporate desktops, so I don't think Red Hat and co. would be interested. Only Cairo apps will use OpenGL for acceleration, and we already have glitz for that, so I guess XGL is redundant at the moment unless you want to accelerate non-Cairo or legacy apps as well.
Speaking of glitz: its main author is David Reveman, working for Novell on glitz and Xgl (Xegl/Xglx).
So this could impact glitz's future too.
AFAIK glitz is not dependent on XGL; it is simply a layer above OpenGL, so it should work on any X server that has Mesa/DRI or vendor-specific OpenGL drivers.
Glitz's future will not be impacted by this. Cairo needs to be accelerated, and glitz is just too important in that respect. I suspect Reveman is probably spending more time getting glitz to work than on XGL, as Cairo is now imminent.
>AFAIK glitz is not dependent on XGL
true, it's the other way around: XGL depends on glitz
Now I wonder what, and how much, Qt4 accelerates or allows to be accelerated in hardware… does it need Xgl to do the really impressive stuff? Since Trolltech pays a developer to work on hardware acceleration in X.org, I guess they can at least use his work…
This is bad news. The 2D architecture will enhance X dramatically, but it's really a band-aid. It's amazing that XGL has only one sponsor when you consider what Apple has done with their desktop, using the same basic concept with Quartz. He was right to go public with his lack of help, so now the waiting game begins to discover who will step up to the plate. According to the KDE blogs, several of the Trolltech developers are supposed to start pounding on XGL as soon as the KDE4 libraries have been ported to Qt4. XGL is the least talked about, but most anticipated, software development for Linux.
I am not sure a shift in focus away from an OpenGL-based X server like XGL is necessarily a bad thing. Unfortunately there are no open source 3D drivers for most modern graphics cards, so an OpenGL-based X server means that we have to rely solely on proprietary 3D drivers. I always had problems with the proprietary drivers for my ATI card, many other people have reported the same, and I really don't want to depend on them totally. If good 2D acceleration actually provides most of the desktop experience that MacOSX and Vista offer, and can actually be implemented with the open source drivers, I prefer that to XGL. Another advantage is that you don't need the newest hardware to have a good-looking desktop, which I see as a plus, too.
I'm sorry, but I fail to see your point. It is true that old hardware should be supported more, but should the people who have bought new hardware, with decent OpenGL drivers, be left in the dark? It's part of the future, and it certainly is a necessity to develop it. There's always the option not to use it, and that option should be used and developed further, but an OpenGL-based X server would probably also push NVidia and ATI (and Intel and …) to release better OpenGL drivers (and NVidia already has splendid drivers). Too bad I'm not a programmer at the moment, otherwise I'd help. Hopefully Trolltech, or anyone else, will start helping work on it soon.
And has never been.
Implementing hardware acceleration at that level is wrong for several reasons. Doing it at the composite level provides a much more flexible solution, and you aren't locked into the OpenGL API.
The modular X.org server with EXA is the way forward.
Cairo with glitz is better than a GL-accelerated Xlib, because the Xlib API is over 20 years old, and today's rendering requirements are much better met by Cairo (and Qt4's Arthur, which can also use a GL backend). Xlib is becoming less important, so the work should be done at the level where it will actually matter.
– Simon
Simon, you are confused. Xgl has Cairo/glitz at its core. Look at the picture in the wiki: http://www.freedesktop.org/wiki/Xegl
Currently Cairo/glitz runs on DRI. The current DRI is built on top of that 20-year-old X code.
Xgl turned things upside down: the modern code in DRI (EGL) became the base, Cairo/glitz then ran on top of that, and Xlib was pushed aside as a legacy API.
EXA is an extension to that 20-year-old X code; with EXA, the old code will be with us far into the future. Glitz does not use EXA; EXA just makes Xlib apps run faster.
This is not as bad as it sounds. Think about it: EXA is good and doesn't need 3D drivers. Xgl needs good OpenGL drivers, and we just don't have them yet. The free drivers are underpowered due to age or missing information, and closed source drivers are not going to be immediately available for X.org once Xgl is done. X.org is a work in progress, and Xgl doesn't make much sense until after 7.0, IMHO.
We need XGL; this is a bad thing. If this is true and more people don't start working on this project, then Apple and M$ will have hardware-accelerated desktops YEARS ahead of X.org.
Linus said it right: the point is not to keep up with the Joneses but to build a better product for us. What Apple and MS do should be irrelevant to when and how we do our work. OpenGL-accelerated X doesn't make much sense before X.org 7.0; the EGL and Xgl work needs to migrate out of the monolithic X.org, and lots of the work needed for EGL and Xgl is necessary for modular X.org anyway. So even if nobody touches EGL and Xgl directly for some time, lots of the groundwork will still get done, since it is needed for X.org as a whole.
X has lots of problems that need fixing; don't fixate on just one pet project among them. They are interconnected in lots of ways, and we will get there, but doing the work of 20 years is going to take time.
>OpenGL-accelerated X doesn't make much sense
>before X.org 7.0; the EGL and Xgl work needs
>to migrate out of the monolithic X.org, and lots
>of the work needed for EGL and Xgl is necessary
>for modular X.org.
Well, bad news: 7.0 is in late RC and the final should be out in September. Yes, next month.
Modular X (7.0) is necessary for Xgl to be usable as a replacement, not the other way around (modular X doesn't need EGL/Xgl at all).
Actually, we've got:
XAA: the really old 2D 'API' to accelerate XRender (badly)
KAA: an attempt in KDrive to improve on XAA
EXA: the new 2D 'API' to accelerate XRender (based on the KAA work)
DRI: hooks to get accelerated Mesa OpenGL+GLX
binary drivers: provide XRender and OpenGL+GLX directly
What a beautiful mess.
You forgot the kernel framebuffer, fbdev.
Why not base X.org & co. on KGI (http://kgi-wip.sourceforge.net/) in the first place?
A driver at practically kernel level is a good thing, I believe…
Interesting, I thought GGI was entirely dead years ago.
Reading the KGI site, I see that it doesn't work with traditional X either, and is mostly used for a HW-accelerated console (really basic functionality).
Where does DirectFB fit into all this?
As far as I can tell, it has acceleration and eye candy galore, has an X11 layer for compatibility, and is working now.
Not many cards are supported, but there is possibly more OpenGL support than X11 has with free drivers.
What kind of effect, if any, does this have on projects like Luminocity?
Luminocity does not use XGL. It is based on the composite extension in X.org.
So, let's sum it up; here is what exists for graphics card drivers in the Linux (and perhaps other Unix) world:
1 fbdev/DirectFB/GGI+KGI: mainly framebuffer based (2D)
2 XAA/KAA/EXA: low level, for XRender acceleration (2D)
3 DRI+DRM+Mesa: low level, for OpenGL+GLX acceleration (3D)
4 XRender: (2D)
5 OpenGL+GLX: (3D)
Devs from the community battle around 1, 2, 3 and 4.
The binary drivers provided by IHVs give us 4 and 5.
Cairo (the 2D API of choice for the future) can run on top of 4 (directly) and of 5 (via glitz).
Welcome to driver model HELL.
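The split between paths 4 and 5 is easier to see in code. This is a toy model, not real Cairo or glitz source (every class and method name here is made up for illustration): the application draws against an abstract surface, and only the surface implementation decides whether the operation becomes an XRender request or an OpenGL quad.

```python
class Surface:
    """Abstract drawing target, roughly the role Cairo's surface plays."""
    def fill_rect(self, x, y, w, h, rgba):
        raise NotImplementedError

class XRenderSurface(Surface):
    """Path 4: hand the fill to the X server's Render extension."""
    def __init__(self):
        self.ops = []
    def fill_rect(self, x, y, w, h, rgba):
        self.ops.append(("XRenderFillRectangle", x, y, w, h, rgba))

class GlitzSurface(Surface):
    """Path 5: glitz would express the same fill as a colored GL quad."""
    def __init__(self):
        self.ops = []
    def fill_rect(self, x, y, w, h, rgba):
        self.ops.append(("glQuad", x, y, w, h, rgba))

def draw_button(surface):
    # Backend-agnostic "application" code: it never knows which
    # driver path executes the fill.
    surface.fill_rect(0, 0, 80, 24, (0.2, 0.4, 0.8, 1.0))

for s in (XRenderSurface(), GlitzSurface()):
    draw_button(s)
    print(type(s).__name__, "->", s.ops[0][0])
```

That indirection is exactly why glitz's fate matters so much less to application authors than Xgl's: apps written to Cairo keep working whichever backend wins.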
There will always be a new library and someone who thinks there is a better way to do things. However, in the end I really don't care which path is chosen, because they all end at two stumbling blocks: 1) closed source drivers for speedy 3D rendering and acceleration (open drivers will never get the most out of that $200 graphics card you just bought), and 2) a unified implementation across all platforms for 2D acceleration. In short, get everyone in a room and get it done, because to really compete with the next-gen versions of OS X and Vista, any desktop had better be cocked and loaded with eye candy. The old "this is a better and more secure OS" isn't going to fly much longer outside of the business world.
First of all, be sure: there will be eye candy in future releases. There will be a 3D-accelerated Xgl somewhere in the future, and the efforts will be combined into one.
However, when I have to answer the question, do we need all that stuff now? My pick is no. I don't believe in rushed features, exploding markets, etc. OS X is nice and all that stuff, but it is still OS X, a totally different horse for the user.
I think everyone should calm down.
I was quite eagerly waiting for Xgl myself, though I don't know much about it. Why? Well, as said, Cairo runs on top of normal X and doesn't necessarily use glitz. Glitz would accelerate Cairo, but _only_ with recent hardware. (You can check it out for yourselves; just look for their PDF presentation.) So I checked, and noticed that although my laptop has fully functioning GLX, it just doesn't support most of the things glitz would utilize. Thus, glitz is rendered unnecessary. Also, Cairo/glitz will only accelerate programs designed to use them, not every application, at least unless it's directly integrated into Xlib. Xgl, on the other hand, would have accelerated everything, and would have gotten rid of a lot of unnecessarily complex things in X. And Xgl wasn't supposed to run only on recent hardware. That said, I would like to help develop Xgl myself, but I don't think I would be much help since I know only basic C/C++. In any case, I'm keeping my hopes up.
-WereCat
You're misunderstanding something. Xgl would still require a graphics driver which can hardware-accelerate OpenGL. If your current X server doesn't have such a driver, then I doubt Xgl will magically have one.
No, I am not misunderstanding anything. As I already said, I do have fully working OpenGL enabled, using the latest open source DRI drivers. It's the capabilities of the card I was talking about. It just so happens that even though the glitz developers _could_ have done a few things so they'd work even with older OpenGL hardware, they chose instead to use features which only work on a GeForce FX or newer. So that's why glitz is useless on both of my machines.
Uhh, and as a side note: where did I say I don't have hardware-accelerated OpenGL?
-WereCat
I think XGL will pick up sponsors and speed again; this is not a death sentence, and I will help out any way I can.
Come on, someone organise some means of gathering money from around the world to drive some of these projects. One dollar, less than a packet of fries: if 1 million Linux users, out of say 5-10 million, could throw in one dollar a year, just think what the Linux community could achieve. One measly stinking dollar. Count me in for 100!
I don't think it would be a bad idea either. I guess I could give some money too, if it was only used for developing Xgl and whatever it requires. We'd just need to find out if there are enough people wanting to chip in; 3-5 people giving a few bucks just isn't worth the effort. Perhaps we should put up a web site and see if there's any interest?
-WereCat
I had a lot of hope in this project, as the current state of framebuffer/DRI/X on Linux is a total mess. With the overhaul of the framebuffer to work with DRI, we could not only have OpenGL outside of X, but dual heads and double video outputs (laptops) would finally work properly too.
Anybody who has ever tried to get a laptop running Linux with the framebuffer (fbsplash, for example) and X to display on a projector will know what I mean. Just look at the GUADEC presentations: those Thinkpads are supposed to work great with Linux, but no, everybody from Novell had problems with the projector; look at Robert Love's or Federico Mena's presentations.
The problem is the lack of coordination between the kernel framebuffer device, the DRI drivers and the X drivers: they all overlap! That's why I find it really shitty that nobody is taking on this project. I think Xgl is meant to be the next generation of graphics on Linux; the X architecture works, but it needs an overhaul, not a ton of hacks here and there that will never play nice with fb/DRI.
Xgl is more like the MacOSX graphics stack than like X.org, simply because of the OpenGL acceleration.
– Carlos Daniel Ruvalcaba Valenzuela
Fixing is bad when so many fixes need to be done. I think it should be rewritten completely (as Cray did when building his next computers). Why didn't Apple fix MacOS? Why will Vista not have a fixed GDI? Because that makes for bad code and problems in the future.
The UNIX style is 20 years old. I use Unix every day… some things are just more cryptic than coding. My biggest complaint about Unix is that there are so many variants and no "one good" path to develop for, so I find myself using Windows. Windows is no angel, but neither is Unix.
OS design needs to change toward managed code. This will help fight viruses, worms and spyware. Kernels need to be at least partly written in managed code, along with drivers. We need to get to a point where we can load more than one application into a process space and have very little context switching.
Drivers written in managed code will mean fewer kernel panics and more speed. Windows and Unix need to go; the future is a managed-code kernel.
First: I think it is very sad that Xgl has no priority for a lot of people.
Second: I think it is a good idea for users to start a project for better graphics support in Linux and collect money for it. But this leads me to the third point:
Third: There should be a discussion about what we want! Do we want a mess of drivers and applications for graphics support in kernel and userspace, OR do we want one single graphics API/implementation?
I think we should start a project which replaces fb/DRI/X and, on the other hand, solves the problems with proprietary drivers.
Let’s go for it
awi1817 (at) hotmail.com
With its new accelerated GUI, Windows Vista will be blazing fast, just like OS X is. Where's Linux??? Still using hacked/patched code designed years ago. Here's what we need to do.
1st – Ask everybody working on graphics stuff, HW vendors (ATI, NVIDIA, XGI, Matrox, Intel), the X implementer (X.Org), kernel developers (DRI architecture) and widget designers (Qt, GTK, Wx), to form a board and completely redesign graphics support.
2nd – Push toward complete integration, meaning a basic SVGA graphics driver included in the kernel and used until the right graphics driver is loaded; then any graphics API, like X or fbdev, can be set up on top of this. And complete acceleration at all times.
3rd – Reduce the X or fbdev layering overhead as much as possible. I've always wondered why there is a difference between the console and X, graphically speaking. It's like, text mode is dead, why not simply be in graphics mode?
We need everybody talking at the same table, or else we'll be in the same place in 10 years. Imagine X running on native OpenGL… like having a SuperKaramba 3D widget moving around and being translucent; that would be a kickass desktop!
>It's like, text mode is dead, why not simply be in graphics mode.
What a joke… never heard of servers and headless machines?
The console is enough. Running X on production servers is like running a remote GUI: useless.
SSH and screen == the console should stay.
As for the 3D GUI, leave it to films and geeks. You don't need it for enterprise productivity, and you want your 3D software (CAD or games) to be the only user of your 3D HW at any given time.
———-
What we need first is real HW accelerated 2D.
Using OpenGL is one way.
What we need first is real HW accelerated 2D.
The thing is that 2D is a subset of 3D. NVidia and ATI are moving away from providing more 2D acceleration and are instead putting their transistor budget into 3D.
Of course, if their specs were open, you wouldn't even need OpenGL to get the 2D acceleration that the 3D hardware provides.
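A rough sketch of why 2D falls out of 3D for free (pure illustrative Python, not any real driver code; the screen size and rectangle are arbitrary): a 2D rectangle fill is just a screen-aligned quad at z = 0 pushed through an orthographic projection, which is exactly the degenerate case a 3D pipeline handles anyway.

```python
def ortho_project(x, y, z, width, height):
    """Map a pixel-space point to OpenGL-style normalized device
    coordinates in [-1, 1]; z passes through untouched, as with a
    glOrtho setup where near/far simply bracket z = 0."""
    ndc_x = 2.0 * x / width - 1.0
    ndc_y = 1.0 - 2.0 * y / height  # pixel y grows downward, so flip
    return (ndc_x, ndc_y, z)

# A "2D" 100x50 rectangle at (10, 20) on a 640x480 screen is just
# four vertices of a flat quad as far as the 3D hardware is concerned:
rect = [(10, 20), (110, 20), (110, 70), (10, 70)]
quad = [ortho_project(x, y, 0.0, 640, 480) for x, y in rect]
for v in quad:
    print(v)
```

Every 2D blit, fill and blend the Render extension needs can be phrased this way, which is why open specs would let the 3D engine serve both.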
Like the subject says (though this is somewhat offtopic, apologies), I think moving the graphics drivers into the kernel would help move X11 forward. KGI would maintain the drivers across the supported platforms, while also allowing third-party proprietary drivers to continue to exist.
With the new modular design, X.org could focus on its key components/protocols and let a KGI library or LibGGI (see XGGI) drawing module do the work. Having a KGI target would allow hardware acceleration to be handled externally.
There has been some brief talk about writing a LibKGI to port X11 toolkits (Qt) to KGI, which could help in getting the X11 protocol running on a kernel graphics architecture.
Moving the graphics driver into the kernel is a bad thing, and here is why:
X11 is not Linux-only, and the Linux, *BSD, Solaris, Irix, whatever-Unix driver models aren't compatible.
So only the OS-specific part should go in the kernel,
plus, of course, the minimum needed to support the console and mode setting.
DRM and fbdev (both kernel space) are going to be merged into a single driver (with IFDEFs to disable DRM for embedded users).
moving the graphic driver to the kernel is a bad thing, here is why :
X11 is not Linux only. Linux, *BSD, Solaris, Irix, whatever-Unix driver models aren’t compatible.
Okay, I didn't say anything about X11 being Linux-only. I myself use FreeBSD, and have used KGI and X11 on FreeBSD.
So, only the OS specific part should go in the kernel.
+ of course the minimal : to support console and mode setting
That is basically what the KGI project is trying to provide, while also maintaining security and stability. Currently KGI-0.9 runs on older releases of FreeBSD 5 and Linux. Whatever the operating system, it implements the KGI Application Programming Interface (API), and video vendors then design and produce kernel modules, using the KGI API, for their hardware.
Unfortunately, the KGI project lacks developers… you know how it goes.
The very first thing, I think, is to unify the interface to the graphics hardware. Right now there are too many ways to access the graphics hardware directly, which is a very bad idea. The kernel should be in charge of all hardware access.
Anonymous (IP: 218.19.239.—):
The very first thing I think is to unify the interface to graphic hardware,right now,there are too many ways to access graphic hardware directly,that would be a very bad idea.Kernel should be in charge of all hardware access.
That is what the KGI project is designed around. Video vendors would only have to compile an object module for the KGI system, and then any platform vendor can link that module into their KGI system.