Open source 3D graphics drivers for ATI R600 graphics cards have been submitted to the linux-next tree for possible inclusion in Linux kernel 2.6.32. “David Airlie has pushed a horde of new code into his drm-next Git tree, which is what will get pulled into the Linux 2.6.32 kernel once the merge window is open. Most prominently, this new DRM code brings support for kernel mode-setting with R600 class hardware as well as 3D support.”
The announcement regarding this pull was made on David Airlie’s blog. The ATI R600 series graphics cards will likely become the most powerful 3D graphics cards with an available open source driver to date. The graphics processing unit codenamed R600 is the foundation of the Radeon HD 2000/3000 series and the FireGL 2007 series video cards developed by ATI Technologies. This announcement does not appear to include support for the more recent R700. R500 and earlier GPUs already have reverse-engineered open source 3D drivers for Linux.
Kernel mode setting means moving the mode setting functionality out of the userspace X drivers and into the kernel. While this may seem arbitrary and uninteresting to end-users, that isn’t exactly the case. Kernel mode setting enables a much richer boot experience, fewer problems with sleep/wake cycles, and improved VT switching.
Kernel mode setting requires drivers to be altered to support it, and a number of open source drivers have already been altered to do so. Proprietary drivers do not support it.
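For the curious, with KMS the mode-setting requests go through the kernel’s DRM interface rather than direct hardware pokes from X. A userspace client’s flow looks roughly like the following simplified libdrm pseudocode (not compilable as-is; error handling and the crtc_id/fb_id setup are omitted):

```c
/* Simplified sketch of KMS via libdrm -- pseudocode, not production code. */
int fd = open("/dev/dri/card0", O_RDWR);

drmModeRes *res = drmModeGetResources(fd);        /* CRTCs, connectors, encoders */
drmModeConnector *conn =
    drmModeGetConnector(fd, res->connectors[0]);  /* first display output */

/* Pick the connector's first (preferred) mode and ask the kernel to set it.
 * crtc_id and fb_id would come from res->crtcs and drmModeAddFB(). */
drmModeSetCrtc(fd, crtc_id, fb_id, 0, 0,
               &conn->connector_id, 1, &conn->modes[0]);
```

Because the kernel now owns this operation, it can restore a sane display mode on its own — which is what makes the nicer boot splash and more reliable suspend/resume and VT switching possible.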
The R6xx driver also covers the newer R7xx cards, so they are also supported.
Some things are still buggy (there are some screen corruption bugs, and it isn’t optimised for speed yet), but it should hopefully be in shape by the release of Fedora 12 (and maybe even Ubuntu 9.10, if they also pick up the patches). That means that, until the next generation of graphics cards from AMD arrives, all of their cards should be supported to some degree.
Once this is stable, the major laggard will be nVidia. I wonder if they will decide to take part, or even whether these devs can help reverse engineer drivers for nVidia hardware.
PS: from what I have read, most of the drivers were written using documentation provided by AMD, and not reverse engineered (some of the r300 work may have been RE’d, though), so kudos to AMD too for playing ball.
The new code (just being released now) for R600 (and apparently R700) has been written using documentation provided by ATI. Kudos indeed.
The earlier drivers, for R500 and older GPUs, were reverse-engineered. These are for an older architecture of GPU, and those cards are no longer supported by ATI’s proprietary driver.
ATI released documentation for the R500 series too:
http://www.phoronix.com/scan.php?page=article&item=amd_tcore_releas…
Correct. And they did so in Feb 2008, almost a full year before they released the R6xx/R7xx documentation.
There was a pre-existing (reverse-engineered) open source driver that already achieved basic 3D support for R500 and earlier. I think some effort went into improving that after the R500 documentation was released by ATI.
However, about a year later (i.e. earlier this year), ATI released the R6xx/R7xx documentation. The architecture was significantly different.
AFAIK this required a re-write of the driver. I think the project was even called radeon-rewrite (Google for it).
http://www.phoronix.com/scan.php?page=news_item&px=NzA2MA
AFAIK this new open source driver slated for inclusion in Linux kernel 2.6.32 more-or-less represents the outcome of radeon-rewrite merged with other radeon driver codebases.
Ok, let’s clear up some of the confusion here:
ATI actually did release some specs (I think it was under NDA only) way back in the R200 days (Radeon 8xxx). That card was actually pretty well supported.
When the 9700 came out, there were no more specs, and the architecture was different enough that a new r300 driver was created. Some of the code there came from the old r200 driver, and the rest had to be reverse engineered. Later, r400 and r500 support were added by building on top of the r300 driver with reverse engineered knowledge. Good r500 support in particular didn’t come about for a long time.
AMD did eventually drop some r500 docs as you noted, although it was much less than what we got for the new cards. The developers did use it to quickly finish up the half-done state of the existing r500 support, and it now works pretty well. I think AMD has hinted at providing more docs for other old cards as well, but it’s very low priority and probably won’t happen until the current cards are working well.
Meanwhile, r600/700 came out with a completely new architecture and the driver was started from scratch. This is the driver that is just now becoming ready for use, and although it’s still rough it’s working remarkably well.
The radeon-rewrite project wasn’t a separate driver itself, but a port of the existing codebases to take advantage of the new KMS, kernel memory manager (TTM) and DRI2 systems. The rewrite added wrappers so that the same codebase could support both the new DRI2 features and the old classic DRI environment. As a result, the new code that is landing now supports both, just like the r300-r500 driver does as well. This is the change that screwed up the Intel drivers so badly, but radeon seems to have done much better, most likely because they didn’t release it immediately while it still sucked, and because Intel worked out a bunch of the kinks for them.
The current code only supports OpenGL 1.4, although 1.5 support should come soon (it’s already in the r300-r500 driver), as well as Xv video acceleration, and it is quite snappy for desktop use. Things like Google Earth will run great on this driver. The newest games won’t (although I think you’re pretty silly for trying to run those in Linux anyway).
So OpenGL 3 / GLSL and fast 3D support will only come with the Gallium drivers, which are still a ways off. I don’t think anyone honestly knows when they will be ready. I’ve heard that the r300-r500 Gallium driver is in pretty decent shape and might come out within a few months, at least as an alpha-type release. I’m thinking we could have a decent 3D driver for the newer cards by Xmas 2010, but that far ahead there’s just no way to know for certain what will happen.
How about KDE4/Compiz desktop graphics compositing (i.e. desktop bling, and the desktop “spinning cube”)?
AFAIK, DRI (or perhaps DRI2) and OpenGL 1.4 support should be sufficient for this purpose. Is that correct?
There are a number of simple games and video players et al, that will run fine, and benefit immensely, from just this level of support.
3D games and scene graph rendering (e.g. for simulators) can come along a bit later.
Yes, compositing (via KDE or compiz) works with the current Mesa r100 through r500 driver, and even the (still experimental) r600/r700 driver, though with some remaining issues.
Hopefully this will help lead to nice, solid ATI/AMD video drivers. Personally, I’m getting sick of buggy drivers (both for ATI/AMD and nvidia cards). I long for the day that I’m able to buy a video card, plug it in, and have it simply work.
I am eagerly awaiting this batch of drivers. I am currently running the open source drivers because ATI’s proprietary ones cause hard locks and all kinds of buggy behavior.
Here’s hoping that these drivers finally deliver decent and stable 3D acceleration in kwin and compiz.
I have been moving away from Nvidia and purchasing ATI cards for all of the computers I build, because I bought into ATI’s promise of delivering enough documentation for real open source drivers to emerge, and it looks like that is finally materializing.
Intel’s current line of cards is simply too weak, and the drivers for the most recent ones are not completely open. This provides ATI with a great opportunity to really shine. I just wish that they would focus all internal development on one true open source driver and renounce their proprietary fglrx crap.
Kernel Mode Setting is important because having that level of support in the kernel apparently means that the rest of the X graphics stack can be written to run in userland (i.e. it can run as a usermode program and won’t require root privileges any more). This is a great improvement from a security perspective.
In turn, this may also allow X and/or the driver to crash and be restarted without crashing any application program.
Typically, it is only X that crashes. And when X crashes, it will always take your whole X session with it. And programs connected to the X server from elsewhere will happily die with SIGPIPE or whatever as they lose their connection.
Apart from that, yes, this is a great and important step for the Linux based desktop.
Ah, SIGPIPE, bane of my job. My boss likes to solve all problems by popening onto external applications (from compiled C code)… which means that I get to spend a lot of time worrying about catching and handling SIGPIPE.
That and, if the graphics server or graphics driver dies, I think it’s reasonable for the whole session and all running apps to go down. It’s certainly not surprising. At least for me, the problem is that X goes down a lot, and isn’t really stable, high-performance or bug- and glitch-free while it’s up!
Until ATI/AMD commits to going down this development route for all of their chipsets, we’re still going to get a disconnected mess with fglrx – which simply shouldn’t exist, in my view. The trade-secret NDA card has been played for years as an excuse for why a driver isn’t open sourced and in the kernel; it has been shown to be false before, and AMD, and especially Intel, are in the process of showing that it is bogus for graphics too.
We always seem to be saying “Oh, it’s just around the corner” with graphics support on Linux – permanently. We’ve still got lots of drivers doing their own thing and even reimplementing a ton of the Xorg stack (yes, you nvidia) and still a load of differences between what a device supports and what a driver can actually do with it. If only graphics drivers were cajoled into being open sourced and using a shared codebase like those in the kernel.
I’m sorry … what exactly will we need fglrx for?
For the next kernel, 2.6.32 or later, only nvidia cards will still require a binary blob driver for 3D hardware accelerated graphics and compositing.
There is a reverse-engineering project (nouveau?) to write a driver for even nvidia cards, but AFAIK it isn’t ready yet for 3D acceleration or KMS.
http://www.osnews.com/story/21033/Nouveau_Becomes_Default_Driver_in…
Excluding the open source argument, I wouldn’t use Nvidia simply because their products are poor quality and have been so for many years. The only people who don’t seem to care about stability and quality are ricers and gamers, who seem to change their hardware configurations more often than they change their undies.
This is one of the reasons I have resisted getting a new MacBook – I don’t want Nvidia in my laptop or desktop; they’ve screwed the pooch far too many times, with customers whose MacBook Pros are loaded with 8400 GPUs still experiencing GPU failures, some having had their boards replaced 4 times.
Quite frankly, I’d sooner go for an ATI powered laptop running Windows than a MacBook with an Nvidia chipset from Apple. Yes, I loathe Nvidia that much.
That is all fine from a personal point of view. I too bought my last hardware with ATI graphics specifically because ATI published the specifications for open source developers. Excellent. Kudos to AMD/ATI.
However, having said that, it is still important that open source doesn’t abandon those people who have nvidia hardware, IMO.
For that reason, even though I wouldn’t get nvidia hardware myself, I still applaud the efforts of the Nouveau project.
Apparently they have Xrender hardware acceleration working (so KDE4 should be good to go), and they have made strides towards (but still have some way to go on) KMS, Gallium 3D support, 3D support in general, video support, etc.
Still, it works well enough for desktop use such that Fedora have been able to adopt it for the default desktop. Even if you don’t like nvidia, this is still a good thing for users.
But the problem with Nouveau is that the impression I get from it is the same one I get from Wine. It sounds very nice for compatibility reasons, but it can be a double-edged sword: through the continued development of Nouveau, Nvidia can easily keep the status quo and claim they don’t have to cooperate because the OSS world is doing fine and dandy.
Like I said, Nouveau is a double-edged sword.
Then again, I question how many end users have Nvidia GPUs, given that most of the time I come across people with an Intel X3100 or X4500 in their laptops.
Mind you, I might be proven wrong, and because of the additional infrastructure put in place within Linux, more companies may be willing to open up specifications.
My choice was based on the 8800 GPU being the best value at my time of upgrade. Sadly, the 9600 and later GPUs seem to be tweaked 8800 boards until you hit the 260+ GPU chips. Nvidia’s drivers have been good, though, and if Nvidia can keep up with the open source development rate then I’m OK with that. If they drop support for the 8800 boards without giving the community driver project the specs to keep going, I’ll have an issue.
I also hear that many of the Nvidia developers work on the community driver in their spare time. It’d be nice if the company officially backed the project, but having the same devs on both drivers is still a benefit.
Now, when my next GPU upgrade comes around, I’ll be considering ATI, Nvidia and any competitive boards, but it’ll still come down to who has the best performance and support across platforms. Maybe it’ll be ATI by then; this is the first GPU I’ve purchased that wasn’t from them, but the flaky-as-crap drivers and addon apps under both Windows and Linux did me in with my last AIW board. We’ll see how AMD’s new open policy does before I give them my money again.
I bought an ATI card because they released specs; all I got was a barely working and slow card, which I quickly replaced with an nvidia card.
Maybe Linux kernel 2.6.32 will give me a card I can actually use.
Actually, quality was the reason why I dropped ATI and replaced it with an nvidia card. The ATI drivers sucked majorly on Linux, while the Windows drivers worked excellently.
Getting compiz up and running without locking X was a trial-and-error exercise (whatever didn’t lock up X could be enabled; the rest had to be disabled).
The revision before that even crashed X on video window resize.
You can tell me many things about nvidia, but with their card I just had to enable the binary drivers and suddenly everything worked flawlessly – no X crashes anymore.
ATI has always been like that: good hardware, really shoddy drivers. At least under Windows they have finally gotten their act together driver-wise; Linux land is business as usual. And btw, where are the BSD drivers?
This is exactly why the fact that there are now open source drivers (coming soon in the mainline Linux kernel) is so important.
We now have the documentation for how to drive the GPUs, and we have open source code to drive them. Having both of those also means that when new bugs are discovered, they can be fixed. This is now true for very capable, competitive GPU hardware (since ATI hardware outperforms Intel hardware). These drivers and graphics cards will quickly become the top line for performance, stability and supportability on Linux.
It will no longer be possible for an OEM to hinder Linux (unintentionally or not) by providing sub-standard binary graphics drivers.
A few years ago ATI was nothing but trouble for me and some of my friends in *Windows*. In Linux, it was much worse. I haven’t tried ATI lately, but now that they have OSS drivers, I do hope that the quality of the hardware and drivers has gone up.
I was toying with Windows 7 on my old laptop (2GHz P4), which came with Win XP. No sound, no graphics acceleration, because the Radeon Mobility M6 seems to be no longer supported by ATI.
But with Ubuntu 9.04 everything works (even the netgear wifi). Cutting edge hardware will always be trouble and this laptop certainly was back then.
On my more recent desktop, Win 7 behaves a bit better.
Yeah, I had the same experience. I had an ATI card on an old desktop and basically considered it to be “VESA-only” under Linux. Fglrx was awful and the open-source 2D driver barely worked.
Well, you wouldn’t believe the improvements in the last couple of years. I have an HD3200 IGP (R700), an R400-series 128MB PCIe card, and a laptop with an R500-series IGP, and they all run *flawlessly* today under Ubuntu 9.04, including compositing and 2D acceleration. Until a few months ago, I had to use the awful fglrx proprietary driver for the R700 IGP, but now that AMD/ATI have opened up the documentation, progress has been astonishingly swift.
My hat’s off to AMD/ATI for doing the Right Thing. They’ll get better drivers and loyal users. I’m not buying or recommending NVidia graphics to anyone unless and until they open up as well.
Sorry, but you have only Apple to blame for that.
NVidia only produces drivers for desktop systems based on their reference design boards.
The drivers that come with your Mac are partially developed by Apple. One area where I was disappointed with Snow Leopard is that it still lives in an OpenGL 2.1 world.
Macs are good for many things, but not for Graphics workstations.
Comparing ATI and NVidia, the latter gives much better support to developers making use of their products. Just look at the amount of tools and developer documentation that each vendor provides.
What I find positive is that ATI is providing GLSL support on their tools, while NVidia only provides Cg and HLSL.
OpenGL 3.2 drivers are in beta for Nvidia and AMD. When they are released I’d expect 10.6.2+ to have them.
That’s an odd statement. Macs are almost exclusively graphics workstations when deployed in the business world.
Heh, what’s funny is you’re both correct.
This is false, Nvidia and ATI provide the full driver per Apple’s requirements.
Excluding the open source argument, I wouldn’t use Nvidia simply because their products are poor quality and have been so for many years. The only people who don’t seem to care about stability and quality are ricers and gamers, who seem to change their hardware configurations more often than they change their undies.
My experience is exactly the opposite; I’ve never had anything except trouble with ATi cards, and I’ve found nVidia cards to not only perform very well but also be stable as a rock.
As for the driver side... well, nVidia drivers may be binary, but they’ve ALWAYS worked like a dream for me and support all the functionality of the card in question; even old cards are still supported. But my old ATi cards... well, the last ATi driver that works for them doesn’t work with Compiz, is unstable as heck and is somehow oddly slow. The open-source one works otherwise okayish, except I still can’t make TV-out work and it doesn’t support pixel shaders. The lack of support for pixel shaders totally blows.
One of the reasons that keeps me in Windows land is that even the binary drivers don’t provide the same level of support as on Windows, be it ATI or NVidia.
Even Carmack is referring to the current state of 3D drivers as a reason to stop caring about Linux where Rage is concerned.
http://ubuntuforums.org/showthread.php?t=1244727
And I couldn’t care less about funny 3D desktop effects. What I want is for my 3D code to work properly.
Ditto. I’d still recommend nvidia to Linux users, entirely because of their binary driver. We should understand that their driver codebase is their “crown jewel” (they share the codebase with the Windows driver) and they are not giving that up lightly. But in exchange we get a good (stable and fast) driver that receives much of the love dedicated to their money-maker (Windows users).
There is no real need to get worked up about device drivers and open source. Hardware is expendable. When intel and ati get their acts together regarding the driver quality, we’ll have more choices, but nvidia is currently the safe bet.
A binary driver fails with the first kernel update.
If there IS a problem, a binary driver is impossible to fix (so one is reliant on the goodwill of the OEM).
If the OEM no longer sells the hardware, binary drivers for it will no longer be forthcoming from the OEM. “Planned obsolescence”.
Doesn’t make any sense. They could give out the source code of their driver to every single person on the planet, and it still wouldn’t run on an ATI card.
It doesn’t work on Linux. Nvidia have refused to fix a performance bug with 2D for over two years, for example. Because it is a secret, they could be being paid money to keep it poor on Linux.
Nope. Just plain no. Shun binary drivers. We now have specifications for, and open source drivers for, fully-functional, competitive-performance ATI cards.
Before the end of this year, people who are fortunate enough to have ATI cards and have Linux installed will enjoy by far the best-performing bang-for-buck desktop systems on the planet.
A binary driver fails with the first kernel update.
Actually, it usually just gets recompiled at boot. At least on my Mandriva it does.
Nope. Just plain no. Shun binary drivers. We now have specifications for, and open source drivers for, fully-functional, competitive-performance ATI cards.
And STILL the open-source drivers for older ATi cards lack all kinds of features, whereas nVidia’s binary-only drivers for similarly old hardware support all their features and work just peachy.
You can blather all you want about open-source superiority, but I have only been let down by the open-source ATi and nVidia drivers.
How can you have been let down by the open source ATI driver (the one built from the specifications) when it hasn’t been released yet? It won’t be generally available until distributions include kernel 2.6.32. Only this week kernel 2.6.31 was released.
http://www.phoronix.com/scan.php?page=news_item&px=NzUyMA
Any open source driver for ATI cards you have seen so far has been built by reverse-engineering. Not from specs. This is also true (and still is) for any open source nvidia driver. This is ALWAYS going to be disappointing.
Fixed now though, for Intel cards and for recent (R600 or later) ATI cards. These are both built from specs. The Intel drivers are written by Intel, and released as open source. The ATI drivers are not however written by ATI … ATI instead released the specs to open source developers. The drivers for these when they become available will be fully functional. And fixable. And “debuggable”. The ATI cards are far better performing than the Intel cards.
Nvidia cards won’t be anywhere near this race.
Yeah, that’s a drag.
GPUs pretty much have the concept of obsolescence built in anyway, though I don’t think nvidia buyers have been suffering from this (the binary driver supports pretty obsolete cards).
It might still contain some “secret sauce” they don’t want ATI to see. I guess they value that sauce higher than perception among linux community, and that’s ok for me. It’s just a GPU, something mostly used for closed source stuff anyway (gaming).
I’m waiting with bated breath for good drivers and cards from ATI – if they make the cut, my next GPU will definitely come from ATI. Hopefully, we will see a change from the situation where “if you don’t have nvidia, you are on your own”. Until now, if you had bought ATI and wanted to go Linux, the general advice was to try to sell the ATI card and get an NVIDIA ;-).
Nvidia’s binary driver for Linux no longer supports “legacy” nvidia cards. This is quite similar to ATI’s binary driver. Both lack support for older cards.
Nvidia’s binary driver for Linux has had abysmal 2D hardware acceleration performance for years.
Nvidia’s binary driver for Linux no longer supports “legacy” nvidia cards. This is quite similar to ATI’s binary driver. Both lack support for older cards.
nVidia has a separate legacy driver for the older cards, and they keep it updated so it works with new kernels and X versions. ATi doesn’t have such a thing. As such, nVidia DOES support older cards.
Nvidia’s binary driver for Linux has had abysmal 2D hardware acceleration performance for years.
Interesting. I haven’t noticed such.
That you would admit to, Astortufer….
Not all cards were affected by the nvidia performance bug. My cards weren’t, at least not to a degree that I noticed in KDE4.
Sorry, but I have been using Linux for a _long_ time (about 15 years now), and the safest option for me was ATI, not NVIDIA. I live in fear of the phone every 6 months, when people start to upgrade their Ubuntu installations and the binary drivers fail _AGAIN_.
Some people got smart and started asking for a new machine, because the last one kept failing. They found out that it is now safe for them to follow the regular updates and upgrades. And you can have a faster video card, but when it fails it isn’t that fast anymore. Besides, for everyday usage you don’t need it; it doesn’t make your browser or spreadsheet any nicer.
This is also where the whole picture comes into view. Both AMD and Intel are gearing up to deliver a complete “inexpensive” and “efficient” platform, and they need every customer they can get onboard. The best way to get normal people onboard is to get their geek friends onboard.
The current mATX/mITX boards are good examples of this. Looking at the next Atom platform or the upcoming Fusion platform, you can see which direction everything is going. Note also how many Atom boards are now shipped without any fans. This was different a year ago, when fans were needed and software was not tuned for the new platform.
And I said it more than a year ago on this site as well: more and more people don’t care about 5 fps extra as long as their computer just works. For the next 5 to 10 years, computers are here to mature and nothing else. Microsoft already found that out with XP, which does basically what people want, and Linux has caught up over the years.
So mark my words. The moment companies start to sell an inexpensive Mac Mini look-alike with Ubuntu preinstalled, it won’t matter which video card is in it. Internet and computers are slowly becoming something like electricity, water and gas. It will always be there, and that makes it the business case. You sell complete units in high volume.
This is also why companies are screening their code and specs to check for any legal issues. The last big dump of legally vetted open code was OpenSolaris, as it’s much easier and cheaper to maintain. In one year’s time they got certified support for over 2500 (!!!) laptops with only a handful of developers. Companies like Intel and AMD can see a business case in this, as they sell the chips and can now reduce the cost of driver development and support.
So if you want to continue buying hardware with only binary drivers, that is your choice. I stick to the policy that I have had for a long time now: I prefer to exchange my money/goods with suppliers that respect my freedom. And what I have seen with people around me is that it takes between 12 and 18 months before they curse vendors for not respecting their freedom.
Just my 2c and ready for the next 15 years of freedom.
You realize that the end of this year is in under 4 months? I think it may be difficult to take the software from the current state to “best performing” in that time. Do you truly feel this is a reasonable expectation, or are you just marketing?
I installed Mandriva and it said “oh, an Nvidia.. shall I install the proprietary drivers?” and graphics wasn’t a concern again.
I installed Debian and grabbed the Nvidia drivers from their site. The downloaded binary looked at the system and said “cool, shall I download and install the correct drivers?” and graphics wasn’t a concern.
I added the Debian non-free repository and grabbed the repository provided nvidia binary with the same result; 3d GPU happy out of the box.
My old ATI was never that smooth, even under Windows. Install the drivers and get mostly stable performance. Upgrading your drivers? Go through a long song and dance of uninstalling and reinstalling. Under Mandriva, the last time I saw TV-in supported was with a first-generation Radeon AIW board, and I had two boards after that with partial 3D support at best.
I may have gotten lucky, if the Nvidia hardware is hit or miss, but the driver support has been a dream. Three different sources for installation, all managed painlessly... I’ll take it.
I’ve worked in several companies and industries where the stability and quality of the GPU is paramount, and it’s been many years since I worked in a place that didn’t use Nvidia GPUs. These people are neither “ricers” nor “gamers”, but professionals who need to be able to depend on their graphics cards, and they’re all happy with Nvidia (or at least happier than they’d be with any other brand).
Wow… what “objective” BS you just wrote there, backed by absolutely no evidence other than personal bias.
Given that NVIDIA literally defined 3D graphics on Linux, and the fact that until recently ATi had mediocre support for the OS at best – and there are still plenty of features missing from the ATi drivers (which until recently were just plain awful, in both Windows and Linux land, BTW) – then yeah, it sounds like “you know what you are talking about”. NOT.
Seriously, I never thought I would see the day when someone would make a serious attempt at claiming, with a straight face, that ATI products work better under Linux than NVIDIA’s.
A lot of people just don’t seem to get this.
I’ll try again.
Up until a few years ago, there were no open source graphics drivers for Linux – only proprietary ones. This presents a big problem, because the stability and performance of the entire system depended on the OEM’s proprietary, secret, binary-only code. The Linux experts who coded the kernel, and who would be the best people to debug any problems with drivers, had no visibility at all into the graphics drivers.
So open source developers tried to write their own graphics drivers for Linux. In the dark, using reverse engineering, without specs. This is always going to be a slow, laborious process, only minimally effective. It is surprising the amount of functionality that was achieved.
The critical points here are these: (1) specs were NOT available, and (2) open source code was NOT written by the card manufacturers, (3) but code could be debugged, and (4) no danger of obsolescence through support being dropped.
OK, some while ago, this situation changed. Intel released their graphics drivers for Linux as open source.
The critical points here are these: (1) specs were available (to Intel staff), and (2) code WAS written by the card manufacturers, (3) but code could be debugged, and (4) danger of obsolescence through support being dropped.
This was a vast improvement, but still there were no public specs. Open source developers were still at the mercy of the OEM (Intel in this case). It is also a pity that Intel graphics are performance-wise significantly inferior to ATI or nvidia cards.
OK, early this year, ATI finally released the specs for R600 and later GPUs. Open source developers have been working on drivers since then (about eight months now).
The critical points here are these: (1) specs were available (to open source developers), and (2) code was NOT written by the card manufacturers, (3) but code could be debugged, and (4) no danger of obsolescence through support being dropped.
We are just now seeing the fruits of that coming through. These drivers represent an entirely new class of graphics driver, which has not been available in Linux before now. ATI cards are entirely competitive hardware-performance-wise. Finally they are going to enjoy a well-integrated graphics driver, written by people who know the Linux kernel and graphics systems inside out.
Well, now you have seen that day; it is finally almost here. The new code has been committed to linux-next. “Working better” is precisely what this new class of graphics drivers will deliver. This is why people are excited about it.
PS: Obsolescence works to the advantage of a graphics card manufacturer. Card manufacturers enjoy a new round of sales (to ‘serious’ gamers) every time Microsoft helps them out with a new version of DirectX. Think about what that fact means (to most end users) for a second.
Edited 2009-09-10 23:45 UTC
Does anybody know how the ATI FOSS drivers compare to fglrx in performance? Last I heard the open drivers did better at 2D but really did poorly at 3D tasks. Has the situation improved at all? If it has, then this is good news. If the FOSS drivers still have worse 3D performance than fglrx, then this won’t be much of a big deal.
It’d still be a big deal: the drivers are open.
While the 3D support was being developed, I believe they were initially using a software memcpy. One of the “todo” tasks was to write proper memory management and to use DMA.
http://www.phoronix.com/scan.php?page=news_item&px=NzQyNg
I’d imagine that that has been fixed by now.
Check near the end of this thread:
http://www.phoronix.com/forums/showthread.php?t=18959
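For what it’s worth, the difference being discussed can be pictured with a loose analogy (plain Python, not actual driver code): a software memcpy ties up the CPU for the whole transfer, while a DMA engine copies independently and the CPU only waits on a fence when it actually needs the result. Here a background thread stands in for the DMA engine.

```python
import threading

def blocking_copy(dst, src):
    # Like the early software path: the CPU performs the whole copy itself
    # and can do nothing else until it finishes.
    dst[:] = src

def dma_style_copy(dst, src):
    # Like DMA: kick off the transfer on a separate "engine" (a thread here)
    # and hand back a fence that can be waited on later.
    fence = threading.Thread(target=blocking_copy, args=(dst, src))
    fence.start()
    return fence

src = bytearray(b"vertex data " * 1000)
dst_a = bytearray(len(src))
dst_b = bytearray(len(src))

blocking_copy(dst_a, src)           # CPU is busy for the whole transfer

fence = dma_style_copy(dst_b, src)
# ... the CPU is free to queue more work here while the copy proceeds ...
fence.join()                        # wait on the "fence" before using the data

assert dst_a == src and dst_b == src
```

The real win in the driver comes from the GPU’s DMA engine doing the copy in parallel with the CPU; the thread here is only an illustration of that overlap, not how the kernel code works.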
Edited 2009-09-11 00:08 UTC
You can talk about how great open specifications are all you want, but there’s simply no alternative to Nvidia on Linux if you want to run 3D applications without stability or performance problems today – for the rest of 2009, and almost certainly for 2010.
So ATI/AMD have (finally) released the specs. Gee, they’ve only been promising to do that since, I dunno, 2007?
So now we’re going to see horrible API churn, breakage and multi-level instability like with the Intel/Xorg/KMS work that has been going on recently.
And I’d estimate that a driver offering full and stable support for all the features of the ATI R600 cards, with average 3D performance optimisation, will be here by, say, Christmas 2011.
Wake me when that happens.
Until then, at least you have a choice of graphics card with Mac or Windows – under Linux, NVidia is the only currently usable option if you want reasonable 3D performance.
I’ve got an Asus Z96J with an ATI x1800M card in it. I get pretty good performance both out of Compiz and out of Dawn of War: SoulStorm running under WINE, using the fglrx driver. I won’t say there are no problems, but it certainly provides “reasonable performance” under Linux.
I was using an x2400 at work, too, for that matter, and it was working just fine, until an RHEL4 kernel update went badly sideways. The card got replaced with an nVidia card; my (not exotic!) GL code no longer renders properly.
Edited 2009-09-10 23:03 UTC
This is unnecessarily pessimistic.
Please note that the Intel graphics drivers for Linux are written by Intel. They are not written by people who are intimately familiar with the Linux kernel or the xorg graphics stack.
That is not the case with these ATI drivers. They should be as integrated with the rest of the system, and as in step with it, as is any other in-kernel driver. This is a first for Linux graphics drivers, really, we have not really seen this before.
As for how it performs … let’s wait for the benchmarks, hey.
Unnecessarily pessimistic, huh? I call it the plain truth.
You can make all the pep-talk posts brimming with enthusiasm to OSNews you like, but it’s not going to change the fact that ATI drivers have been ‘coming soon’ for years, that the ‘about to be released’ drivers are only basically functional, and that it’s taken years for Intel drivers, with Intel support, to go from ‘basically functional’ to ‘fully functional’ and now back to ‘basically functional’.
I’ll believe it when I see it. I’ll believe that open source ATI drivers will support full-speed XRender, GLSL and OpenGL 2+ functionality, crash-free and glitch-free, along with accelerated video playback on Linux, when I see it working.
So why don’t you calm down. When that day comes, I’ll be able to forget my pessimism, and you’ll actually have something to crow about.
Pffft.
You can’t dismiss code until you have seen it, run it and measured it.
This driver is entirely new code. It is not fglrx. It is not a revamped version of the older reverse-engineered open source drivers. It is the first 2D/3D accelerated graphics driver for Linux written by Linux developers with the aid of specifications.
Let’s wait and see how this entirely new code performs when it is made available. It is in the kernel staging area, but that means it has a lot of hardening and stability testing to get through yet.
Once we actually have a released driver, and we can objectively measure its performance, then and only then can we talk about the “plain truth” about it.
PS: ATI open source drivers have not been in work “for years”. The specs were only released to open source developers in January of this year.
Edited 2009-09-11 00:45 UTC
From David Airlie, the guy who committed the code:
“It may not be 100% stable yet and I’m sure we can make things a lot faster, but the basics all work for me here.”
So these drivers are an initial release – basically functional, but may crash, and are slow.
But sure, ignore the plain truth all you like, you’re way out in irrational fanboy territory here.
Cute.
I’m supposed to be the one who is “in irrational territory” … even though I posted this:
and you are the one who came up with the topic of this sub-thread:
… and yet I’m allegedly the one who is a “fanboy”?
Get real.
There is a word for this type of behaviour:
http://en.wikipedia.org/wiki/Psychological_projection
“May not be 100% stable yet” means that there is still testing to be done, it doesn’t mean that it is necessarily buggy.
“I’m sure we can make things a lot faster” means it probably hasn’t been profiled and optimised, it doesn’t mean that it is slow.
I posted earlier quoting other testers on the Phoronix radeon forum … it was described as “fast and stable” for them.
So your description that it will be “basically functional, but may crash, and are slow” is purely wishful thinking on your part.
I gave direct quotes from independent people who have run David’s code. They found it to be QUOTE: “fast and stable”. QUOTE: “stabilizing quickly”. QUOTE: “I am *still* surprised how well it all came together.” You are just guessing and speculating about it, and badmouthing it before it has even been through staging.
What precisely is your agenda here?
Edited 2009-09-11 04:42 UTC
I thought you wanted to wait for the benchmarks before passing judgement?
Just keep digging your hole, mate, it’s pretty funny.
WTF?
Those weren’t my judgements, they were judgements made by people who had run the development versions of the driver code.
What is wrong with you that makes you such a cranky sourpuss? Exactly what barrow are you trying to push?
Although not defending anything said already, software that hasn’t been fully tested IS going to be buggy. This is how software is, there’s no blame in saying something is buggy when it hasn’t been tested, it’s just the truth (based on experience).
I just think your self-contradictory ranting is amusing, and your obvious desire to have the last word, regardless of the facts (or lack of them), is worthy of a troll.
I don’t have a barrow to push, I’m just stating a simple fact. The latest developments in ATI drivers are interesting, but they aren’t going to bring ATI performance up to par with Nvidia’s any time soon.
That’s not bashing anybody, that’s just the way it is.
If you want solid 3D performance on Linux, your only choice is NVidia.
Do you want to refute that with some kind of facts-based assessment or benchmarks, or are you just going to call me names, accuse me of having an agenda, post irrelevant stuff about Microsoft, and keep flip-flopping on your previous statements?
Maybe you could just point out where I’m actually wrong, rather than accuse me of ‘unnecessary pessimism’, being a ‘cranky sourpuss’, something about lunch money, and having a barrow to push.
Look, if I’m wrong, tell me how I’m wrong. What’s my other choice for solid 3D on Linux, today or in the next month, other than Nvidia?
By that I mean I run Blender, write 3D apps based on OGRE, and watch movies that require playback at anything up to 1080p. My specific application requires OpenGL 2, FBOs, GLSL shaders and the ability to provide 60fps+ framerates when dealing with texture-heavy (~512MB of textures) scenes. My Blender work is polygon-heavy but actually doesn’t require much except stable, glitch-free OpenGL 1.x support. Having compositing desktop effects would be a bonus.
Currently I use an 8800GT, under Linux and OS X.
Please, cut the crap and tell me where I am wrong when I say that the only choice for solid 3D on Linux is Nvidia.
AMD’s binary drivers actually aren’t much slower than NVidia’s. And they work well enough on the enterprise distros, you just run into bugs on Ubuntu and other more up to date distros. OSS drivers are still a ways from being ready for you, but give it time. They’re making it work well for the average user who just wants Compiz working first, and then the more complicated features and performance will come later.
Cut the crap yourself.
It is very simple. To get Nvidia 3D on Linux, one has to install a proprietary binary blob driver from nvidia.
There are at least 3 entirely sound reasons why that is a terrible choice:
(1) The kernel developers don’t like this at all; they see it as Nvidia sponging off their work and not respecting their licence (which is the GPL). On more than one occasion, kernel developers have come very close to cutting off the mechanism in the kernel that allows Nvidia to load a closed source binary blob driver (loading one is known as “tainting” the kernel). So as the owner of a machine, one could very easily find one day that one’s video card’s binary blob driver would no longer load and run.
(2) Nvidia, for reasons of their own (perhaps, for example, with obsolescence in mind, to force people to re-purchase a newer video card), could at any point withdraw support for Linux. Or, more subtly, Nvidia could simply refuse to fix or even acknowledge a bug in their driver. This too has already happened more than once.
(3) Finally, a binary executable blob is an excellent place in which to “hide” functionality, or restrictions on functionality, that serve the interests of the software supplier rather than those of the owner of the machine. If, for example, one (as a corporate software supplier) wanted to (or was paid to) “degrade” video that had come to a video card from a source such as a Blu-ray disc via an “untrusted” path (or OS), then one could easily build such a degradation right into the binary blob video driver, and the machine’s owner would not be able to do anything about it, even if such a degradation was against his or her wishes.
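As an aside on point (1): the “tainting” referred to is visible from userspace on any Linux system via /proc. A small sketch (Python; the /proc path is the real kernel interface, but the helper function name is my own):

```python
from pathlib import Path

def kernel_taint_flags(path="/proc/sys/kernel/tainted"):
    # The kernel exports its taint state as a bitmask; bit 0 (value 1) is
    # set as soon as a proprietary module such as nvidia.ko is loaded.
    value = int(Path(path).read_text().strip())
    return {
        "tainted": value != 0,
        "proprietary_module": bool(value & 1),
    }

print(kernel_taint_flags())  # e.g. {'tainted': False, 'proprietary_module': False}
```

A non-zero value simply records that the kernel developers can no longer vouch for the state of the running kernel, which is exactly why they are reluctant to debug tainted systems.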
Happily, from the end of this year forward, there is now going to be a way where all of those bad things can be avoided, and yet one can still have top class (in terms of hardware cost/performance) 3D graphics hardware and driver working on Linux.
That alone means that Nvidia is NOT the only choice for 3D on Linux.
In fact, from the user’s perspective, Nvidia is now suddenly a distinctly second-rate choice on Linux.
Edited 2009-09-12 05:12 UTC
Actually, no. NVIDIA is still the way to go. I made the mistake on trying ATI again, and they are still crap under Linux. At least with Nvidia I actually had drivers that worked. ATI has always been slow under Linux, and the open source drivers prefer to crash X instead of recovering gracefully, requiring a hard shutdown of the machine. Yep, that is definitely end user friendly.
Look, I love Open Source, but it is not the solution to every problem. Graphics is one of them.
Actually, no, Nvidia will no longer be the way to go.
ATI’s drivers for Linux have indeed historically been bad, but now an alternative driver for ATI cards is coming which is not from ATI. Because the cards themselves are fine, as seen on Windows, any historical anecdotes you may wish to recite about ATI drivers on Linux being poor are no longer relevant. This is an entirely new driver; it is NOT fglrx.
However, the points I make about binary blob drivers have always been relevant, and they are still applicable to nvidia cards.
Edited 2009-09-12 05:23 UTC
Come back and tell us when it is here and works well. Enough of this “Just wait! You’ll see!” hand-waving.
ATI helped the OSS community out with documentation back in the old Radeon 7000 days… and the drivers were still incomplete crap for years and years and years.
For whatever reason, despite all the claims and promises from the true believers, FOSS and video drivers have historically been an embarrassing disaster, even with proper documentation. So like I say, come back when you have something concrete to compare to NVidia’s drivers. I certainly don’t like the fact that they are proprietary. But they have always been of sterling quality, despite all the FUD that gets thrown at them by the Bible thumpers.
Sterling quality? No. Just no. For over two years 2D hardware graphics acceleration had abysmal performance in Nvidia binary Linux drivers. It was in fact a decelerator … software rendering was faster.
Earlier this year, Nvidia were for a couple of months releasing new drivers more often than once a week to try to overcome bug after bug after bug. In one particular week there were three releases, I think.
As for this comment, the new drivers are in the kernel staging tree.
http://git.kernel.org/?p=linux/kernel/git/airlied/drm-2.6.git;a=sho…
They exist, and they basically work, although some areas such as power management and KMS are still being worked on.
http://www.phoronix.com/scan.php?page=news_item&px=NzUzMA
http://www.phoronix.com/scan.php?page=news_item&px=NzUyOA
Being in the kernel staging area means they will be delivered with the next release of the kernel, although there is still a lot of testing and polish to be applied to them between now and then.
However, this is not “hand waving” of any kind, these new drivers do exist. You can pull them from git and compile and run them yourself right now if you wish to lend a hand with the testing and polishing effort.
If you are not prepared to help out and contribute something positive with respect to these new drivers, then the very least you can do would be to cease and desist from spreading false negative FUD about them, thank you very much.
Edited 2009-09-12 06:27 UTC
Yawn. More hand-waving. More incomplete drivers which aren’t even out yet unless you want to jump through unsupported hoops.
Like I say, let us know when your Great Pumpkin rises out of the pumpkin patch. I’m sure that lots of people would be interested if and when it really happens.
It’s certainly not NVidia’s fault that a certain FOSS DE project made a *really* unwise design decision.
Edited 2009-09-12 06:24 UTC
Do you want them to be tested or not?
They are written, they compile and run without crashing (so far), but they are not yet tested. That takes people and time. This is the state of play, and all that I am doing is reporting it. I’m not trying like some to undermine it.
http://git.kernel.org/?p=linux/kernel/git/airlied/drm-2.6.git;a=sho…
It is really happening, dude. Right there, right now. None of your filibuster can stop it.
Do you mean KDE4?
The Xrender specification was published in 2000.
http://en.wikipedia.org/wiki/Xrender
It has been claimed to be supported by all video card GPUs and drivers for over eight years now.
It isn’t the fault of any FOSS DE project that Nvidia’s drivers were totally unable to get this to work for YEARS (when everyone else had no problems whatsoever). This was a problem ONLY in Nvidia’s binary Linux driver (for some models of card); it works fine on the same Nvidia cards in Windows, and it works fine with the reverse-engineered FOSS drivers for the same Nvidia cards.
Edited 2009-09-12 06:44 UTC
kwin compositing works fine in current binary drivers as well.
Also, there seems to be a bad communication impedance mismatch here. You keep thinking that people are opposed to ATI drivers in principle (which would be somewhat absurd), whereas the problem here-and-now is that the new and glorious ATI driver is not really there yet. *Right now*, nvidia is the one you’d rather use on Linux.
ATI might have the upper hand when .32 kernel is deployed. We’ll wait and see (and root for ATI).
But it didn’t for such a loooooong time.
Not at all. *Right now* there is already a new open source ATI driver which works for 3D. The situation *can’t* get any worse than that … since it is open source. If needs be we can always revert back to the source we have now.
It is like that with open source … once it works, even once, partially, an error by the OEM (intentional or otherwise) can’t screw you. You can always revert to a previous version. You can always fork the code.
Fair enough. For me it is no contest … Linux using ATI graphics cards can’t be held up any longer by recalcitrant or incompetent OEMs. The functionality will come, it can’t be stopped. The self-interest of an estimated 1.5 million Linux developers will see it continuously improve. Any PC manufacturer wanting to ship Linux systems, from this point on, knows that they cannot be forced into having to ship a system with crippled graphics … just install an ATI card and it will be good.
Edited 2009-09-12 12:47 UTC
I want them to be tested. And I want them to work for real professionals in the field who require high quality 3D drivers. You are asserting *way* before the evidence is in that they will test out and fit that bill. And I’m saying that we’ve been here before with ATI FOSS drivers and the FOSS community failed miserably. For some unknown reason, 3D video drivers and FOSS have historically gone together like peanut butter and tuna fish.
In your zealous advocacy of FOSS, and your resultant over-optimism, I think you are jumping to unjustified conclusions.
Well … it does work. There is a driver. FOSS drivers for older ATI cards are stable and perform well. Drivers for these newer ATI cards covering everything other than 3D have been working and stable for quite some while now. ATI did supply both specifications and example code. There is a software rendering reference library (Mesa3D).
http://www.mesa3d.org/
The new ATI driver is in the kernel staging area. There is considerable work going on in this area right now. There are a lot of people trying to test and polish it. Given the source code, it can be debugged. There are examples of how it is supposed to work. It can easily be benchmarked and profiled. Parts of the code that aren’t tied specifically to the hardware can be re-used from other projects. Literally hundreds of drivers have gone through this staging process before with far less tools and resources in support.
I’m sorry, but I can’t really see any grounds for unnecessary pessimism.
Edited 2009-09-12 15:20 UTC
Lemur2, for years you have cherry-picked *one* esoteric feature of the NVidia drivers that happens to cause problems with one DE, which happens to be your favorite one. And for years you have declared them crap because of that one thing. But in the case of this new FOSS ATI driver, the mere fact that it is about to be released and got through some tests without crashing too much or performing too badly is enough for you to declare victory before it is even out the gate, errrm… out of “staging”, even though you admit it is still lacking features… as the old Radeon 7000 driver did for years and years despite ATI providing documentation to the FOSS developers at the time.
Now… maybe the new driver will surprise me and become complete and stable in the next year or so. And maybe not. I hope it does. But based on the historical evidence, I’m not holding my breath.
Wait to count your chickens until *after* they are hatched is what I’m saying, Lemur.
And if you truly are so certain that they all really will hatch, then you should have no problem exercising a little patience.
Edited 2009-09-12 15:34 UTC
I have given sound and valid reasons why closed-source drivers are utterly undesirable:
http://www.osnews.com/permalink?383631
These reasons are not necessarily related to any particular DE, nor any particular vendor for that matter.
I have given the circumstances around this particular driver, noting that it has been stable and fast for everything except 3D for quite some time now, and also noting that all of the information required to add that 3D functionality was supplied to the developers, and the process of adding it went very smoothly and surprisingly quickly.
A number of people have already tried it, and they report it to be fast and stable.
The kernel people have committed to having it in the next kernel immediately after it was posted to the kernel staging area.
These are all exceptionally encouraging signs. I think you have no idea whatsoever about the “historical evidence” for this code.
The best you can do in reply is cry “don’t count your chickens”?
Oh dear.
The eggs are already hatched and the chickens are healthy and running all over the barnyard. They just need a little more fattening up, and they will be ready for consumption …
Edited 2009-09-12 15:55 UTC
Which is completely irrelevant if we don’t have a working alternative.
It might be relevant once we have the open ATI drivers available, but we don’t at the moment. And NO, some code in a git repository isn’t something I consider available.
I’ve an older ATI card, which is more or less useless under Linux. I ended up buying an nvidia card, even though I didn’t want to infect my system with a binary driver, because I had to be able to get some work done.
I wish I could tell you the chip, but for some reason MSI didn’t think it was important to print the model on the board.
At least the Nvidia driver works for the most part (and I never had the problem with Nvidia and KDE4 on any of my machines). I have hardware full-HD H.264 decoding and working 3D acceleration on a card I paid about $40 for.
What ATI card with which currently available drivers, can do that? I would love to ditch the closed source driver so I once again can have a fully open source system.
I’ll be looking forward to 2.6.32, but until then I will stick with something I know works.
The only issue is the time it is going to take. The 2.6.32 kernel will not be out for a while yet. I will, however, be glad to see it. Until that kernel is released, whether the driver is in the branch or not, it remains to be seen whether it will support all the features of the card.
As for binary blobs, I am not an OSS purist. The Nvidia drivers always worked for me, and I never once ran across the KDE bug, even though I use KDE. I do know it exists; I just never saw it.
This is fair enough. The only comment I would make is that one does not have to be an OSS purist, nor hold to any of the ideals of FOSS at all, in order to appreciate the benefits of having control over one’s own machine, and not being subject to the whims of a single software vendor company, who may or may not support your needs.
“…the benefits of having control over one’s own machine, and not being subject to the whims of a single software vendor company, who may or may not support your needs.”
If you can’t fix any issues with the code or do some coding yourself, then you’re still subject to the whims of whoever is developing the driver, OSS or not. So you’re still not in control of your own machine.
If FOSS is about scratching itches, then the area of FOSS 3D drivers is like rolling naked in poison ivy. You can scratch all you want. But it never does any good, and often makes things worse. And relief is typically a long, long, time in coming.
Edited 2009-09-13 02:00 UTC
There were two or three different FOSS projects developing a driver for ATI graphics cards, given the documentation from ATI.
There was the radeonhd project at Novell (I think) and then this project for which the driver was called radeon. I think there may have been one other.
This project ended up being a merge of a few different codebases, AFAIK. There were a couple of different approaches taken.
So no, with FOSS, you are often not dependent on the whims of one developer. The best effort often gets there first, amongst a couple of different efforts or sometimes a handful.
It is a meritocracy. The best solution (as determined by what works best and what answers the most needs) wins. Other solutions either drop by the wayside, or go on to become an alternative choice.
Edited 2009-09-13 08:15 UTC
Where do you get this from? I fail to see how these recently committed ATI drivers, which are buggy, slow and incomplete, get you to this position. These drivers do not expose ‘top class’ functionality; they provide the basics – that is, minimal acceleration and enough stability for testing purposes.
I also see no claims made by the developers of these drivers that the end of the year will see the performance targets you claim. Are you just making this up?
You make totally unfounded assertions. The Intel drivers are open source, but their performance is awful, and the architectural changes in Xorg, the Intel drivers and the kernel have made the performance even worse.
Why buy hardware that is supported by Linux, only to have it arbitrarily broken by X.org and kernel devs? The supposed superiority of the open source model in this case goes right out the window.
You’re just grasping at straws, unable to come up with any evidence whatsoever that anything other than an Nvidia card will provide solid 3D under Linux.
Some people might prefer an open driver for philosophical reasons, but that doesn’t address functionality at all. I’ve yet to see any open source 3D driver come close to NVidia’s proprietary driver in terms of speed and functionality on Linux.
Clearly you would rather buy hardware that has an open driver that doesn’t work very well than hardware with a closed driver that works really well. That’s OK, but stop making out like it’s a real alternative for people who need to get 3D work done.
I’m still waiting for some kind of facts-based assessment or benchmark that shows there is any alternative to NVidia for solid 3D on Linux.
Edited 2009-09-13 20:51 UTC
I think the real disconnect everyone is having here is based on the word “solid”.
Does solid mean, “working out of the box without any configuration or installing 3rd party software that might break with any kernel upgrade”?
Or does it mean, OpenGL 3.2 support that’s extremely fast?
I think the vast majority of Linux users would be quite happy with the 1st definition, as long as they can get all their Compiz effects working well. But clearly you need to stick with the binary drivers for now if you need definition #2.
That is the definition I am working with. Absolutely fine for over 90% of use cases.
That level of support will come a bit later.
http://www.phoronix.com/scan.php?page=news_item&px=NzUzMw
http://www.phoronix.com/scan.php?page=news_item&px=NzUzMA
http://www.phoronix.com/scan.php?page=news_item&px=NzUyOA
http://www.phoronix.com/scan.php?page=news_item&px=NzUwOA
http://www.phoronix.com/scan.php?page=news_item&px=NzUxNw
A fair amount of this stuff isn’t specific to the r6xx/r7xx driver at all.
I think the real disconnect comes from this statement by lemur2:
Does anybody but lemur2 believe this to be true?
What is wrong with the statement?
Individuals can have a complete working 3D desktop by buying a (blank) hard disk, some memory, a motherboard and case, and an ATI graphics card. The more memory, the better the CPU and the better the graphics card, the better the performance.
Install a “performance” version of Linux from a LiveCD (it costs $5 via the post from a Linux CD distributor, and takes less than 10 minutes of your time to install), perhaps Arch Linux 64-bit with KDEmod, and such a system will then have all the very latest 3D desktop “bling” (very pretty), a completely functional, fast and secure OS, and a complete set of desktop applications, including an office suite, for almost no extra cost (in time or money) beyond the hardware.
The software included would cover 90% of use cases. It would perfectly suit most people’s needs for a desktop system.
You won’t be able to get any other desktop systems that perform nearly as well and cover the same functionality for anything like the price. In my country such a system might cost half the price of anything you could get from a store with the equivalent power and functionality.
As I said … “the best-performing bang-for-buck desktop systems on the planet”.
“Bang-for-buck” is a metric that very much includes price.
Because systems with Nvidia cards and drivers outperform the ones with ATI cards and drivers by a large margin?
Meaning that an NVidia system offers users far more graphics performance and bang for the buck than an ATI card that isn’t properly supported under Linux?
Where is your evidence for that?
I think you are very, very confused here. When one purchases the $5 LiveCD via the post, it won’t have anything other than a barely functional driver for an Nvidia card. One might have to start initially in VESA mode. Eventually, if you know what you are doing, with patience you may be able to get a proprietary Nvidia driver installed, and finally get 3D working (as long as you don’t update your kernel after that), but that will cost you a fair amount of your time, if not money.
Even today, the driver that comes with the LiveCD for an ATI card, although it won’t support 3D, will be far better supported and have great 2D performance (which is mostly what you need for desktop use). Towards the end of this year, or maybe early next year, if you did this, the Nvidia card would still be barely supported but the ATI card would work great AND include 3D functionality.
Edited 2009-09-14 02:37 UTC
Yeah, yeah.. so now we’re down to:
If you don’t care about 3D performance, and don’t run any apps that use OpenGL, or don’t care that they run slowly, then sure, open source ATI drivers might work for you at some point in the future. They will probably mean things work out of the box, though they will certainly not offer the performance of proprietary drivers for some years.
I doubt these open drivers will be production-ready and rolled into distro liveCDs by the end of this year, but if you want to believe that, then whatever.
Where is your evidence for this? You have no evidence whatsoever that the ATI cards would perform slowly.
And remember, we are talking about performance in terms of bang-for-buck. I’m not necessarily talking about the absolute best performance available, but rather the best performance per dollar expended. That would probably turn out to be the very cheapest ATI and Nvidia cards, not the high-end ones.
http://www.kernel.org/pub/linux/kernel/v2.6/
Four releases already since December last year.
These drivers are due out in kernel 2.6.32.
Like I said previously, I’ll believe it when I see it.
But take a look at some of the latest testing of these radeon drivers:
https://fedoraproject.org/wiki/Test_Day:2009-09-09_Radeon
Look down the ‘GLX’ test column and see how many FAILs and N/As (due to the driver failing even to get X running) there are.
This driver is a long way from providing solid 3D.
Note that there are actually four packages that all have to work together to provide the 3D acceleration. The kernel is one, and that code will almost certainly be released as part of 2.6.32 before the end of the year, although it’s in the staging area because it’s not yet considered completely stable.
The others are libdrm, mesa, and xf86-video-ati, which all support 3D in their master branches now. I wouldn’t be surprised to see them all push out a release for Christmas, but it’s possible you may still need to use unreleased code for a while there. I’m sure it will depend on how well the more widespread use of the driver goes until then.
Edited 2009-09-14 04:41 UTC
“Production ready” is relative, but it’s going to be in Fedora 12, released in November. The notice that started this whole thing says that he’s adding it to Rawhide: http://airlied.livejournal.com/68097.html (Dave Airlie works for Red Hat).
It will almost certainly be buggy, but it’s not useless. It’s already working well with: desktop effects, Google Earth, Open Arena, Nexuiz, SuperTuxKart, Half Life, and Deus Ex (in WINE).
Anyway, even if you aren’t sold on these drivers ever getting as good as nvidia’s binary drivers, you should at least recognize that this is good news for all the platforms that they don’t support. Linux PPC/ARM, or Haiku, for example.
I think it’s great news. I just disagree with the hyperbole lemur2 used to describe these developments, I think the proposed timeframes for seeing a performant and stable driver are wildly optimistic, and I think that if you want high-performance 3D on Linux, NVidia is by far the best solution currently.
Agreed, although I’m not quite as pessimistic as you are. I don’t think the OSS drivers will ever match AMD/NVIDIA’s binary drivers 100%, but I think we could have something by next Christmas that is reasonably full-featured and fast.
I don’t think you can necessarily blame the fact that the drivers are OSS for why they have come along so slowly. I mean the NVidia drivers work pretty well, but the ATI ones were horrible for years. And the Intel drivers on Windows were very bad when I tried using them once. I don’t think others were much better – I heard terrible things about Matrox’s drivers as well. It just takes a lot of time and effort to really optimize your drivers fully for hardware as complicated as a modern graphics card.
http://www.phoronix.com/forums/showpost.php?p=91985&postcount=96
http://www.phoronix.com/forums/showpost.php?p=92070&postcount=98
Yes, I saw that myself. Those 2 cases are, in order: a relatively easy task that doesn’t require very advanced drivers, and a bug in the fglrx driver (or wine) that allows this new one to work at a similar (DX7) low level that again doesn’t require very advanced 3D.
You can definitely pick out specific cases where the new drivers work as well as the binary ones, for example I think the 2D acceleration and general desktop feel are probably superior. But I stand by my original statement, which was that for the most part I doubt the “difficult” things like 100% performance equivalence will ever be met. That would require things like program-specific optimizations like the binary drivers have that I don’t think the OSS drivers have enough manpower to implement. And that’s not just my opinion, the developers working on the drivers are the ones saying it, that they think 80% speed is a reasonable target.
My point was best “bang-for-buck” performance, not best high performance.
Perhaps your misreading of what I said was why you are flying off the handle on this topic … but I’m still inclined to think you have a nvidia barrow of some kind to push, and if not that then you probably have a “proprietary is better” barrow to push.
Edited 2009-09-14 05:45 UTC
That’s not what you said. You went back and edited the comment to erase what you said.
You said:
Among other statements which give a very optimistic timeframe and overstate the current capabilities of these drivers.
If you’re gonna spread misinformation, and then make a pathetic effort to cover it up, expect to be called on it. Expect to be asked to back those statements up with facts, and expect to get pushed back when you push others.
You’ve also accused Phoronix (whom I notice you quote a lot to back up your positions) of shilling, accused Keith Packard of not being up to the job, and basically been a totally aggressive ass about this whole thing, and then you make out like it’s everyone else with the problem.
And just because I don’t share your ‘proprietary drivers are EVIL EVIL EVIL, we must banish them even if it means accepting terrible performance’ stance doesn’t mean I have a ‘barrow to push’, or that anybody stole my lunch money, or that I’m a cranky sourpuss.
There’s the reality of the situation with these new drivers, and there’s the fantasy world you’re portraying. Perhaps we can meet somewhere in the middle?
And perhaps you could try to be honest, rather than revising your posts after you make them to fit your new claims?
Highly doubtful. Lemur does more than just portray fantasy worlds. He lives in them. And he does so because he *wants* to live in them. It’s very hard to imagine a meeting in the middle under those circumstances.
And I assume that you realize by now that you are wasting your time.
It reads as what I meant to say: best performance for bang-for-buck. It is right there in the text you quoted. I have no memory of whether I edited what I first posted, but often enough I do, because when I re-read what I first wrote, it is not quite what I meant. There is only a short “window” of about 20 minutes in which one can edit posts, so I may have posted something, read it, decided that it was not quite what I meant or wasn’t quite clear, and corrected it for clarity.
Anyway, that is undoubtedly what these drivers will deliver.
It is also what most people want … value for money performance, rather than top performance at any cost.
Note that I say MOST people … by no means do I mean all people.
Pfffft.
I have no idea why you have decided to misinterpret my meaning, but my meaning is there.
If I have “called you names”, it is only because you have inexplicably decided to try to attack me with a series of nasty posts such as the one this is a response to. I make no apology for retaliation. Don’t attack people in the first place would be my advice to you.
Edited 2009-09-14 23:21 UTC
What the heck? The Intel graphics drivers for Linux are written by none other than Keith Packard, the project leader of xorg. All the new stuff like DRI2 and KMS was also developed by Keith and co.
Please get your facts straight.
OK, fair enough. How is it then that the Intel drivers for Linux have gotten themselves into such a horrible tangle recently? Performance regressions and dropped functionality all over the place.
http://www.phoronix.com/vr.php?view=14082
Perhaps xorg needs a new project leader.
Edited 2009-09-11 00:53 UTC
“OK, fair enough. How is it then that the Intel drivers for Linux have gotten themselves into such a horrible tangle recently? Performance regressions and dropped functionality all over the place.
http://www.phoronix.com/vr.php?view=14082
Perhaps xorg needs a new project leader.”
No, the problem seems to be one of either ignoring the older hardware or not having it around to test things on.
As usual, we have sites like Phoronix to thank for these kinds of messes.
If they weren’t so busy shilling for people to run out and buy the latest and greatest overpriced ATI and Nvidia cards, I doubt that there would be the problems there are.
Oh yeah.
1. Phoronix stops ‘shilling’.
2. …
3. 100% working Intel drivers!
That’s even more ludicrously unhinged than lemur2’s overenthusiastic ranting.
How on earth is my saying effectively to “wait for the results of benchmark testing” supposed to be ranting?
Why all this irrational attack and vitriol? Did someone steal your lunch money?
Edited 2009-09-11 03:10 UTC
Cos you’re not saying ‘let’s wait for the benchmarks’; you’re writing post after post saying this is some kind of enormously significant milestone and that we’ll all have top-notch ATI performance in the near future, without a shred of evidence to back it up.
And you’re calling for Keith Packard to step down because of the Intel graphics situation.
You trumpet the supposed superiority of the OSS driver development model, and in the next breath rail against its outcomes and insult one of the most important contributors to the effort.
That’s unhinged ranting, plain and simple.
… and dozens of missed xorg milestone release dates.
Edited 2009-09-11 03:25 UTC
Microsoft’s marketing presentations are the “unhinged ranting”.
http://www.desktoplinux.com/news/NS3377584759.html?kc=rss
The good news is that, buried in amongst all the attempted marketing FUD soundbites, at least the good side has got the OpenGL patents back.
I have no answer for that. You are the one that defends open source graphic drivers. You should provide the answer, instead of asking the question.
I have a nomination:
http://www.phoronix.com/scan.php?page=news_item&px=NzUyMg
I’m looking forward to Xorg 7.5 and the open source ATI drivers coming out, perhaps by the end of the year. Then maybe what has recently seemed almost like sabotage of X might finally be coming to an end.
To everyone asking when the code will actually be available (and not happy with git repositories), this code will be backported into Fedora 12. I’m not sure about the other fall distros, but I think it could potentially be in some others as well.
I think F12 will probably enable it by default, although I’m not sure about that either.