Linked by lopisaur on Fri 25th Jun 2010 22:21 UTC
Ubuntu, Kubuntu, Xubuntu

Based on a recent email to the X.Org developers' mailing list, Canonical is nearing one of its goals for Ubuntu 10.10: a rootless X Server, i.e. the ability to run the X.Org Server without root privileges.
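A quick way to see what "rootless" means in practice is to check which user owns the running X server process. This is only a sketch; it assumes a procps-style `ps` (with `-C` selection) and an X server process named `Xorg`:

```shell
#!/bin/sh
# Sketch (assumes procps `ps` and a process named "Xorg"): report which
# user owns the running X server. On a traditional setup this is root;
# with the rootless work described above it would be an ordinary user.
xuser=$(ps -o user= -C Xorg 2>/dev/null | head -n 1)
echo "X server user: ${xuser:-none (Xorg not running)}"
```

On a conventional system this would typically print `root`; with the rootless server in place it should print a regular user name instead.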
Thread beginning with comment 431710
RE[7]: Big deal...
by UltraZelda64 on Sun 27th Jun 2010 08:43 UTC in reply to "RE[6]: Big deal..."

"If I remember correctly, these window managers only work with 3D acceleration turned on--what happens if you use 2D-only drivers?

If you don't want to use proper drivers, you don't really have the right to complain that things don't work well for you.
"

What if it's running on decade-plus-old hardware without a GPU worth using for 3D? Maybe the vendor no longer even supports that particular chip, which is highly likely; nVidia seems to have gone through three or four different, incompatible driver generations in the time I've owned this nearly decade-old machine (though its original GeForce2 Ultra has long since been replaced). The original GF2 Ultra is now in another machine, currently running Windows and not in my possession; if I were to install Linux or BSD on it, I would be at least three driver generations behind, with no telling whether those drivers would even *work* on modern systems and kernels.

Or what if using "proper drivers" goes against your wishes, such as avoiding third-party kernel blobs? Hell, for that matter, what IS a proper driver: the crap nVidia and ATI put out, or something better designed to fit into the system as a whole, both in design and in philosophy (open source)? Or just the "appropriate driver to get the job done," which, depending on how you look at it, could be either? And if the binary drivers the manufacturers put out are the "proper drivers," wouldn't that make Windows the "proper OS"?

Really, it sounds like you're saying something along the lines of: "If you don't like the way things are and how the GPU companies restrict the use of their hardware through drivers, even if you run (or wish to run) a fully open system, you have no right to complain. Either use the blobs if possible, or shut up. Or upgrade to a newer graphics card. And if those blobs are incompatible with your particular hardware and/or OS, well... tough luck. Enjoy the glitches."

Edited 2010-06-27 08:48 UTC

RE[8]: Big deal...
by vivainio on Sun 27th Jun 2010 20:17 in reply to "RE[7]: Big deal..."

"Really, it sounds like you're saying something along the lines of: "If you don't like the way things are and how the GPU companies restrict the use of their hardware through drivers, even if you run (or wish to run) a fully open system, you have no right to complain. Either use the blobs if possible, or shut up. Or upgrade to a newer graphics card. And if those blobs are incompatible with your particular hardware and/or OS, well... tough luck. Enjoy the glitches."
"

Yeah, that's what I'm saying. If you insist on taking the less supported path, adjust your expectations. OTOH, I've never been seriously bothered by these artifacts, so the issue seems overblown.

As for ethical reasoning: choose your battles. NVIDIA has provided the best drivers there are for Linux and has good reasons for keeping them closed; I have no problem supporting them for that. I specifically chose my laptop for its NVIDIA card.

RE[9]: Big deal...
by UltraZelda64 on Sun 27th Jun 2010 21:42 in reply to "RE[8]: Big deal..."

"Yeah, that's what I'm saying. If you insist on taking the less supported path, adjust your expectations. OTOH, I've never been seriously bothered by these artifacts, so the issue seems overblown.

As for ethical reasoning: choose your battles. NVIDIA has provided the best drivers there are for Linux and has good reasons for keeping them closed; I have no problem supporting them for that. I specifically chose my laptop for its NVIDIA card.
"

Well, at least you're upfront about it... that's all I'll add.

RE[8]: Big deal...
by Delgarde on Sun 27th Jun 2010 21:26 in reply to "RE[7]: Big deal..."

"What if it's running on decade-plus-old hardware without a GPU worth using for 3D?
"

If you're running on decade-old hardware, are you surprised to get 1990s performance? Surprised that you can't get the benefit of new work being done in 2010?

RE[9]: Big deal...
by UltraZelda64 on Sun 27th Jun 2010 22:03 in reply to "RE[8]: Big deal..."

"What if it's running on decade-plus old hardware that doesn't have a worthy GPU for processing 3D?


If you're running on decade-old hardware, are you surprised to get 1990's performance? Surprised that you can't get the benefit of new work being done in 2010?
"
My machine from 2001 (not quite the '90s, but close) is still perfectly usable (aside from slowdowns caused by swapping, which having only 256MB of RAM will do), and would in fact be an excellent machine given more memory. Ironically, it still runs much better than it did under the Windows Me that came preinstalled on it, or under Windows XP, which I ran up through SP2 after installing it to get rid of that abomination of an OS. Memory was scarce and expensive back then, and in the case of this computer, it still is.

Since when do graphical glitches in the video driver get in the way of "real work," other than as annoyances (much like the fancy 3D effects produced by modern compositing window managers and GPUs)? And I'm not talking about some bug that completely garbles the screen (I don't even remember when I last saw one); I'm talking about what the original poster was talking about: minor graphical glitches. Brought up to a "modern" spec of 512MB, or better yet 1GB of memory (or even maxed out at 2GB), this machine would certainly be enough to keep going for at least another six years, and likely more. The aging GPU, though, is already "obsolete" by at least one driver generation if using the binary nVidia drivers.
