“In continuation of our previous piece entitled ‘ATI: A Year in Review 2005’, where we looked at ATI’s features implemented this year into their Linux drivers as well as thoroughly examining the frame-rate performance, today we have turned the tables yet again and are taking another look at NVIDIA’s gains this year. In addition, due to popular request, and keeping with the standards set by the previous ATI article, we will also be comparing our results against that of the latest NVIDIA ForceWare Windows display drivers.”
NVIDIA Linux drivers simply rock!!!
I never had a problem with NVIDIA Linux Drivers …
Well, good for you!
I’ve had problems with…
…the driver not coping with Composite very well.
…being unable to correctly detect the pixel clock for my WUXGA DFP.
…stability problems where the kernel would panic upon driver loading (related to the above).
…general jerkiness of the UI.
…requests for support being met with: “can’t support non-open source software, go ask nvidia”.
Just wanted to add some contrast. But sure, gaming is great 😉
This pair of articles was really good, and investigated something that needed investigating. My only complaint is that it would have been nice if SPECViewPerf had been tested, to see how workstation-level performance compares between the two driver sets.
I have some nvidia graphics cards, and had no problems with running them under linux.
What I don’t understand is why nvidia doesn’t release specs for older “legacy” gpu’s that aren’t even in the standard driver releases anymore (riva’s, geforce 1 & 2, see list here http://download.nvidia.com/XFree86/Linux-x86/1.0-8178/README/append… ), so the community could at least ship those drivers by default with the X server and update them when needed for things like Xgl and EXA… That’s pretty much the last thing holding me back from buying and recommending more nvidia cards, although I pretty much have no other (sane) choice right now if I want a recent gpu and no problems setting it up.
After having an ATI 9600 Pro in my last notebook, I’ve got an Nvidia 6800 Go in this one and couldn’t be happier (plus the laptop (I9300) is upgradeable to the Nvidia 7800).
I’ve noticed a trend though. All the high end notebooks now have Nvidia cards as the high end selection and ATI as the bargain selection.
Here in the ATI thread I rip them for spending five pages on near uniform benchmarks on the ATI without touching on nVidia… guess I was just a little too impatient.
Of course, comparing the two articles still shows that level of bias: right off, the ATI test was done on a 1.86GHz P4-M with an X300 and a notoriously slow Toshiba slim 100MB drive, while the nVidia test was done on an A64 3000+ (which they report as 2.25GHz… news to me, my Venice is only 1.86GHz) with a 200MB SATA Seagate.
That they use the Ge6600 (which I mentioned in my rant on the other thread because it’s what’s in one of my machines) against the X300 shows either extreme bias or just plain lacking the finances to do a SERIOUS review. If you are gonna do a meaningful review on the driver status, it would HELP to have video cards that are at least CLOSE in performance… like say comparing a X300 to a Ge5800 or the Ge6600 to a X800 (standard).
Otherwise both articles are fairly meaningless, apart from showing that under both vendors 3D graphics are slower under Linux than Windows… The article treats 1–2% speed gains in the drivers as a big deal when the final benchmark difference between Linux and Windows is closer to 20–25%.
GAC
The point of the article wasn’t to compare ATI to NVIDIA, but to compare each vendor’s Linux drivers to their Windows drivers, and to compare the performance increase of their drivers over time. Thus, it doesn’t matter which ATI and which NVIDIA card they use, as long as they use the same one in all the driver tests.
The delta between Linux and Windows is a very difficult number to gauge properly. The problem is, these games aren’t running the same code on the two platforms. Doom III doesn’t use SSE in Linux, for example, while it does in Windows. The Enemy Territory benchmark is probably the best one to look at, since out of the three engines represented, the Quake III engine is the one best optimized for Linux and Windows. That’s also the reason I wanted to see SPECViewPerf results — in that case, at least, the exact same code is running on the various platforms.
Quote: Doom III doesn’t use SSE in Linux, for example, while it does in Windows. The Enemy Territory benchmark is probably the best one to look at, since out of the three engines represented, the Quake III engine is the one best optimized for Linux and Windows. That’s also the reason I wanted to see SPECViewPerf results — in that case, at least, the exact same code is running on the various platforms.
Which is why I’d at least like to see the same CPU across the board… Since the speedups seen on the nvidia side with drivers COULD be present on the ATI side if you have an A64 chip present like the nVidia testbed did. The difference COULD simply be the different CPU. (It probably isn’t, but it’s something that SHOULD have been tested)
and why the different target audience GPU CAN have an effect. Just because the driver came in the same tarball doesn’t mean it’s the best one to compare. If you think ATI is going to put ANY effort into optimizing for the low end bargain basement entry-level X300 (which is BARELY equal to the 9200 if benchmarks are to be believed) you’re nutters. X800 or X850 would be more meaningful. Frankly that they even TRIED Doom3 on the X300 makes those numbers a total joke… there’s so little gpu time available the processing cutoff could be causing a flat result.
Oh, and the Q3 engine at this point uses so little of a modern video card’s capabilities that I would hardly consider it a good benchmark. HL2 and Doom3 at minimum… Besides, when I can run W:ET at 1600×1200 with 4xAA and 8xAF at greater than 100fps in BOTH Linux and XP on a Radeon 9800 Pro, who gives a @#$% about the benchmarks for that.
The recent .8xxx series of drivers has really improved over the previous versions. For the first time since .6111, everything works on my machine without a hitch. I’ve always had problems with hardware acceleration locking the computer up, not being able to use glx and another random feature at the same time, udev problems, and most notably not being able to switch to a VT that is set to use a framebuffer. But they seem to finally have all their ducks in a row with the latest series.
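For anyone chasing this kind of driver-series problem, a quick sanity check of what is actually loaded can save a lot of guesswork. This is only a minimal sketch, assuming the stock NVIDIA kernel module (which exposes /proc/driver/nvidia/version) and the common glxinfo utility; neither is guaranteed to be present on every setup.

```shell
#!/bin/sh
# Report which NVIDIA kernel driver version is loaded, if any.
# The NVIDIA kernel module creates /proc/driver/nvidia/version.
if [ -r /proc/driver/nvidia/version ]; then
    cat /proc/driver/nvidia/version
else
    echo "nvidia kernel module not loaded"
fi

# If glxinfo is available and X is running, confirm GLX is using the
# NVIDIA driver rather than falling back to software rendering.
if command -v glxinfo >/dev/null 2>&1; then
    glxinfo 2>/dev/null | grep -E "direct rendering|OpenGL renderer" || true
fi
```

Comparing the reported version string before and after a driver upgrade also makes it easy to confirm the installer actually replaced the old module.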