It was great to finally hear Tess, the Linux chick, say something about the subject. The level of the debate is starting to rise now.
I think the possibility that making X more robust against driver flaws could have a large efficiency cost was barely discussed. It would also have been great to hear more about what actually happens when a driver “crashes”. What goes wrong in these programs? How could it be avoided? Do we need some more advanced driver API with garbage collection or whatever?
And how easy is it for a “normal person” to start writing some sort of driver for Windows, so we can really test the safety mechanisms that supposedly exist there today? (For example, we could run some performance tests of this garbage collection or whatever…) We can’t just simply trust them; we must look at the guts of this Windows driver firewall somehow.
It’s very hard to compare the Windows and Linux driver worlds. How can we be sure that it is really the newer Micros~1 systems that are more robust, and not the drivers that improved? Are they still collecting that data, and does it log when a “near-crash” happens?
*** [From here on, just a long train of standard thoughts about free software that I wrote while thinking about something better to do. Apologies for the long post.]
Now regarding the “human” aspects of the controversy. I think most of the flamewar around Thom’s statements comes from his insistence on keeping a “customer” position, complaining about free software as if it were just another program he is using. I understand that for the end user it makes no difference: he just wants to see things working. But it’s just not fair to complain the same way.
Actually, I think he (and many others too) feels even more entitled to criticize free projects than closed ones, maybe because of how much closer they are to their user communities. But just as some people will feel more comfortable throwing stones at the free “train wrecks”, so will defenders be more passionate. You just won’t ever hear proprietary software users, and especially developers, show up to a hot debate like this. Companies prefer to write polite and confusing letters, when they do anything at all, while their users will always just wait to see what the company will do, while saying “their stuff Just Works for me, so shut up”.
The best analogy to make here, to analyze the fairness of complaints, is to think about food (instead of cars). If you go to McDonald’s and they sell you a bad Big Mac, or a rotten McApple, you stomp your feet, scream at the manager, and then you get new food (hopefully in good shape) plus some vanilla ice cream. Or maybe your money back. There are guarantees, because it is a service some company is providing you. You can also make all these complaints about how things should be done, because it is your money that is supporting the company’s existence, after all.
Free software, on the other hand, is that soup at the church, or maybe an apple pie your neighbor made and gave you a piece of. You just shouldn’t complain loudly to the neighbor who gave you a free piece of pie if it so happens that they cooked it using some technique you didn’t like. It could even be something serious: suppose you find out this neighbor of yours doesn’t wash his hands, or the apples, before cooking. This may crash your system for good. What do you do? Eat his pie while screaming at him that he should be cooking properly? No, you don’t actually eat the damned thing, you just say (lie) “oh, thanks, but no, I just ate something else”. He might even be offended if you accept and complain, and take the pie back from you. (This last act is not possible with free software.)
Of course you are right to complain about food that has been badly made, but you just don’t talk the same way as you would in a restaurant. It was charity, or “voluntary work” if you will, and you should always feel a bit humble. Free software will always have this characteristic, even when it is developed by companies or backed by humane millionaires.
So, please, do not complain about Linux and X.org or whatever like some dissatisfied customer who thinks he deserved better. The most you can say is “oh, it was too good to be true”. I am sure you could have used your blunt instrument in a more constructive way.
BTW, I don’t think open source inherently implies better or more secure software, so don’t throw me in that bag, please. The freedom of free software is by far the characteristic I care about the most. The second one is that FLOSS usually works in ways I prefer, but this is more or less a coincidence.
I would keep using Linux even if it were less stable. I have done so in the past, actually, going through suffering similar to what was described in your first article… But I never felt much that way, because I never considered that people wanted me to be using Linux, the way companies want you to choose them. It was me who wanted to use it, and I always felt privileged to be subjected to all those crashes and data losses. I wrote my master’s dissertation on a Debian machine, but fortunately had few system problems at that time. I had the machine for some time, and avoided major upgrades until I finished the job.
I stopped trying to convince people to use FLOSS some time ago, because using and developing free software are somewhat acts of heroism. And you don’t ask people to perform acts of heroism; you just explain the situation to them and wait to see what they decide. If a soldier goes on a suicide mission, he surely won’t hear his sergeant screaming at him. This is the same politeness you should generally have when criticizing free software. Or else you will meet those numerous angry people who are usually dismissed as “zealots”.
***
Linux is not a puny competitor to other mighty proprietary operating systems, trying hard to emulate their quality or to offer their latest features. It is a project that exists in another dimension. To switch from one platform to the other is a big step, much different from moving, for example (or for lack of an alternative), from Apple to Microsoft.
You can freely say “wouldn’t it be great if X were somehow more robust against video driver failures?” That is an interesting thought, and I liked hearing all about this subject. But I wish we could have skipped the whole “I am giving up Linux and moving back to Windows” testimonial, because it touches many subjects other than the technicalities of how the programs work. This is much more of a wetware problem and less of a {hard,soft,firm}ware one.
Next time, please, just avoid allowing people to call you “Traitor!! Reactionary!!”
An extremely lovely written comment, and I love your analogy about food and that which is given freely (or as you put it—by charity).
That can be an overly romantic view of the free software world, though. Linux is backed by major corporations that contribute full-time developers.
I’ll give you another analogy: homemade jam bought from supermarkets. It has a picture of a thatched-roof cottage on the label, it has some scrumpy name like “Ma’s Traditional Homemade Preserves Co.”, and the lid is in a gingham pattern to resemble the old days of stretching fabric over the top. The reality is that it is made in a highly organised factory with paid workers, and ‘Ma’ is actually the strict, bullwhip-carrying West German factory manager.
Linux and its various components power 60% of the world’s webservers. It powers countless routers and networking devices. Millions of dollars are pumped into it annually.
I think that, with all that investment, it should and can be held accountable, even if you don’t have to pay up front for the product.
Back to the food analogy—instead, this time imagine that you are a McDonald’s shareholder, or a franchise owner. You have invested your money in McDonald’s, and you see your store, or any store, underperforming. This is a threat to your investment and you have every right to hold them accountable.
In the world of big business, open source can’t just shrug off the responsibility when it feels like it; the industry has changed forever.
Big business can put up money to get the work it requires to be done, done.
No, the problem is home users, and there the neighbor’s apple-pie analogy is right. The reason this works at all is that a large enough percentage of users are programmers who have the time, the skills, and the interest to fix bugs that come up and fill in the feature holes they can. This is complemented by what big business is doing, and all the shades of gray between the two extremes. The GPL’s stickiness holds it all together.
There are many things that could speed up development that only home users care about. A big one, I think, is hardware companies playing ball: providing specs, if not actual engineering time. Binary blobs are a fudge; it’s not really taking part, it’s being at the party but sitting in the corner talking to no one and being antisocial. They have pie, but no one else is allowed any; it’s theirs only. It’s why projects like Nouveau are so important: it’s the process of kicking out those who don’t want to be there anyway.
“Linux and its various components power 60% of the world’s webservers. It powers countless routers and networking devices. Millions of dollars are pumped into it annually. I think that, with all that investment, it should and can be held accountable, even if you don’t have to pay up front for the product.”
I think you answered your own question. Servers and routers don’t need a desktop so the money doesn’t go there.
I love desktop Linux. I love its spirit, its quirks, and its uneven quality. I love its universe of choices. I also love the fact that it is an unpopular niche curiosity.
I am not a zealot, though. The Amiga burned up my zealotry. The day that Linux breaks 10% market share on the desktop is the day I move on to BSD or OpenSolaris or something else. I say that because in order for Linux to achieve that kind of popularity, it would have to become a slicked-up, focus-grouped monolith with its universe of choices left on the cutting room floor.
Making X crash-safe is solely a matter between the X server and the X libraries on the application side.
All the library has to do is cache the necessary information about windows and views (or whatever X uses), restart the X server when it’s down, hand over the cached information, and do a full redraw of the window.
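To make the mechanics concrete, here is a minimal C sketch of that reconnect idea (not anyone’s actual implementation): Xlib’s default IO error handler exits the process, and a custom handler must not return, so the usual trick is to longjmp back to a recovery point. A real implementation would also have to recreate every pixmap, GC and font from its cached state, and can only leak the dead Display.

    #include <setjmp.h>
    #include <stdio.h>
    #include <unistd.h>
    #include <X11/Xlib.h>

    static jmp_buf reconnect_point;

    /* Called by Xlib when the server connection dies. It must not return
     * (Xlib would exit the process), so jump to the recovery point instead. */
    static int io_error_handler(Display *dpy)
    {
        longjmp(reconnect_point, 1);
    }

    int main(void)
    {
        XSetIOErrorHandler(io_error_handler);

        if (setjmp(reconnect_point)) {
            fprintf(stderr, "X server died, reconnecting...\n");
            sleep(1);             /* give a respawned server time to come up */
        }

        Display *dpy = XOpenDisplay(NULL);
        if (!dpy)
            return 1;

        /* Recreate the window from our own cached description (geometry,
         * title, contents) and repaint from scratch: the "full redraw". */
        Window w = XCreateSimpleWindow(dpy, DefaultRootWindow(dpy),
                                       0, 0, 400, 300, 0, 0,
                                       WhitePixel(dpy, DefaultScreen(dpy)));
        XMapWindow(dpy, w);
        XFlush(dpy);

        for (;;) {                /* normal event loop */
            XEvent ev;
            XNextEvent(dpy, &ev); /* the IO error handler fires in here
                                   * if the server goes away */
        }
    }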
Hi, during the conversation about X stability it seemed to be taken as a given that Xorg is unstable for advanced graphics.
I’d like to say that this isn’t my experience, since I found hardware that works well with Linux and made careful decisions with upgrades.
I often use Blender3D on Linux for weeks without a reboot with an NVidia 8800 (I normally reboot only for kernel updates).
I have also set up a studio with ~14 or so PCs with NVidia cards.
My dad and my girlfriend also use Ubuntu, and I don’t hear them complaining about crashes.
Our studio
http://www.bigbuckbunny.org/index.php/just-another-manic-friday/
One thing I make sure of is never to use ATI cards/drivers. Of course Xorg should support them, but just because Xorg isn’t stable with every hardware configuration doesn’t mean it’s unstable for everyone.
So while I’m not saying Thom’s opinion was somehow wrong, it’s also worth keeping in mind that with well-supported hardware, Linux+Xorg can be very stable too.
PS, the Linux chick is awesome, don’t scare her away.
The BSDs use Xorg too; if you look at Xorg’s site there is no reference to Linux specifically being the target.
Regarding OpenGL and DirectX being fairly new: it depends on what you consider new, but OpenGL has been fairly standardized since ~1992, so Xorg and friends have had enough time to support it (though it’s a moving target, of course).
There is no explicit platform stated on their website, but the code itself speaks volumes; it is GCC-bound and Linux-focused, and worse, there is a growing reliance on HAL when HAL should already be getting put out of its misery and replaced with something better.
The lack of focus is the problem – anyone who dares to bring focus to the project is instantly lynched and kicked out by ‘geek rage’. This same ‘geek rage’ then turns around and whines because users like me, my friends or my parents aren’t willing to give Linux a ‘fair shake’.
Why the heck should anyone give Linux a ‘fair shake’ when the developers themselves aren’t willing to listen to constructive feedback and criticism? I’ve already talked about my experiences in the past – they replicate what others have found.
That is the role of Mesa. There seems to be a habit in the OSS world that when something moves beyond an exact target and the requirements become exceptionally broad, all hell breaks loose. Take distributions – the applications by themselves are great, but when there is an attempt to bring it all together in a coherent manner, it always seems to turn out crap.
Re: BSDs using Xorg – even if it’s biased towards Linux (with HAL), the BSDs are using Xorg, and AFAIK they are not planning their own implementation.
Re: Mesa.
I wouldn’t say it’s only Mesa; it’s more a problem of everything working together: Xorg having interfaces for OpenGL to take advantage of (GLX, DRI, etc.), and whoever writes the drivers making proper use of these facilities.
“There is no explicit platform stated on their website, but the code itself speaks volumes; it is GCC-bound and Linux-focused, and worse, there is a growing reliance on HAL when HAL should already be getting put out of its misery and replaced with something better.”
I think things are improving; for example, there is now an OpenBSD developer on the X.org board.
Maybe the applications use HAL or the new *kit stuff, but X.org doesn’t. I guess you could say it’s just the freedesktop stuff that relies on HAL and the *kits.
The more I think about it, there have been a lot of improvements. It’s impressive. Ever since X.org was created a lot of things started to improve, and they aren’t done yet; in a few years I think things will look very different.
I liked this podcast a lot better, it’s a lot more balanced. 🙂 The Free Software Magazine articles also help to explain a lot.
Once again you nailed it, kaiwai.
I would perhaps even dare to say that too high a reliance on Linux and its quirks is one core part of X’s problems — in the sense of “trees, forest, and seeing”.
When XOrg crashes, the applications are not required to terminate. The reason they exit (or crash) is XLib. XLib, having been designed more than 20 years ago, assumed that an application would never outlive the server. XCB fixes this, as well as other issues from XLib.
Additionally, it isn’t the job of the X server to manage running applications. It provides a session management extension and a library, libSM, but ultimately this is the job of the window manager or desktop.
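For illustration, a minimal C sketch (assuming XCB) of what that difference looks like in practice: a dead server shows up as an error code the application can inspect, rather than a forced exit inside the library.

    #include <stdio.h>
    #include <stdlib.h>
    #include <xcb/xcb.h>

    int main(void)
    {
        xcb_connection_t *c = xcb_connect(NULL, NULL);

        while (!xcb_connection_has_error(c)) {
            /* Returns NULL on IO error instead of killing the process. */
            xcb_generic_event_t *ev = xcb_wait_for_event(c);
            if (!ev)
                break;
            /* ... handle the event ... */
            free(ev);
        }

        /* Control reaches here when the server dies; the application can
         * save state, reconnect, or exit gracefully on its own terms. */
        fprintf(stderr, "lost X connection, shutting down cleanly\n");
        xcb_disconnect(c);
        return 0;
    }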
Exactly. X is and has been improving ever since X.org split from XFree86 to improve the X11 server and bring it up to speed with modern desktop OSes. X11 at its heart is good, and replacing it would be difficult and unnecessary.
It’s mentioned, but let’s be clear: X is just an interface standard. What we are really talking about here is XOrg’s implementation of X.
Anything that replaces XOrg will need to provide an X interface. Wayland is an example of that approach being taken.
The podcast doesn’t talk about some of the relevant changes in XOrg. The new mode-setting and memory management are the start of this, but Gallium3D is also a potentially important development: not only does it make writing graphics drivers easier, it provides an abstraction that removes hardware drivers from XOrg.
“Another thing that didn’t get a lot of attention is Alan’s xf86-video-modesetting driver. It’s a dummy X11 driver that uses DRM for modesetting and Gallium3D for graphics acceleration. Because of that it’s hardware independent, meaning that all hardware that has a DRM driver and Gallium3D driver automatically works and is accelerated under X11. Very neat stuff.”
http://zrusin.blogspot.com/2009/02/latest-changes.html
All this said, when XOrg crashes, it should restore the GUI applications that were running to where they were when XOrg crashed. Once XOrg is trimmed down by moving hardware work into the kernel, it can be beefed up with this kind of thing: some kind of XOrg panic mode that temporarily swaps in a framebuffer fallback while the X server crashes and restarts.
Not sure this was an issue; the Linux chick does mention this – “X11 is a standard you can get a big book on”.
Agree some framebuffer fallback would be good to have, though as mentioned, Xorg crashing isn’t something I have a problem with.
My point is there is no talk of the future already in play. Wayland and Gallium3D are very interesting developments. A complete Nouveau driver will be interesting to watch the impact of, including NVidia’s response: will they fight it or chip in? All this will increase the stability of XOrg as well as trim the code base down, making new features easier to add. All very interesting, but not mentioned on a podcast about X. Both podcasts would lead you to believe XOrg is stagnant.
Totally agree, but this isn’t something the OSNews guys know much about, is it? It would be great if they could interview someone who is involved in XOrg development.
+1
I don’t understand why people portray X.org as some stagnant chunk of code suffering from bit-rot, developed back in 1984 and still hanging around for the sake of nostalgia.
Ever since X.org forked from XFree86, they’ve been doing a complete overhaul that is only about 3/4 done. So yeah, there are a lot of old deprecated subsystems still in it, but the new developments are fantastic.
I don’t think the new Gallium3D library gets nearly enough press: it moves the 3D acceleration code out of the drivers and into a common library.
@ideasman42 : who let you out of your cage??? Get back to work on the Blender code!
“I don’t understand why people portray X.org as some stagnant chunk of code suffering from bit-rot, developed back in 1984 and still hanging around for the sake of nostalgia.”
Completely agree; things have been changing rapidly, impressively even.
Probably the impression that Xorg is decrepit comes from Xlib, which IS kind of old and crusty.
It was funny: I was doing some Blender/Xlib stuff (Blender uses Xlib directly, no Qt/SDL/GTK),
and I asked the guys in #Xorg if Xlib was a low-level lib that was really powerful, in a similar way to how some low-level languages can be fast.
And they were like: “No, it’s just verbose and crap”
– so it doesn’t seem like there’s much benefit to Xlib, aside from your apps being able to run over a network.
I thought XCB was meant to replace Xlib and give you everything Xlib does whilst being better. I also thought there was a bit of bad blood because people hadn’t all dropped Xlib but kept using wrappers over the top of it.
“I don’t think the new Gallium3D library gets nearly enough press”
It has been promised for years and years. It will only get press when it starts being used by people who aren’t developers.
Things are moving:
http://www.phoronix.com/scan.php?page=search&q=Gallium3D
Plus Fedora 11 uses Nouveau (but I bet the 3D stuff is turned off, for now).
I use Gallium3D with the open-source Radeon drivers. It is being used.
Her name is Tess.
Something many people ignore with 32-bit vs 64-bit OSes:
even though the OS may be able to use more than 2 GB of memory (2–8 GB or so),
each application is still limited to far less, since each pointer is 32-bit: a 32-bit pointer can only address 4 GB, and after the kernel reserves its share a process typically gets 2–3 GB.
I have run into this a number of times when dealing with large 3D datasets and compositing high-resolution HDR images on Linux, before I used a 64-bit system.
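A tiny C illustration of that ceiling (the exact usable limit varies with the OS and its kernel/user address split):

    #include <stdio.h>
    #include <stdlib.h>

    int main(void)
    {
        printf("pointer size: %zu bits\n", sizeof(void *) * 8);

        /* A 32-bit pointer can address at most 4 GB, and the kernel reserves
         * part of that, so this allocation usually fails on a 32-bit build,
         * while a 64-bit build satisfies it trivially (given enough RAM/swap). */
        size_t want = (size_t)3 * 1024 * 1024 * 1024;   /* 3 GB */
        void *big = malloc(want);
        printf("3 GB allocation: %s\n", big ? "ok" : "failed");
        free(big);
        return 0;
    }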
I use Linux at work, at home, on my notebook, and even on my cell phone (Android). No other OS, except Haiku and Syllable (under VMware).
But I think there are 3 things that should be fixed to make it work 100% (in my opinion).
1- GRUB. It’s just weird. I can’t install Debian on a thin client (I’ve been trying for months) because GRUB refuses to work properly… It may be better than LILO, but I think it could be even better: it should just work! And sometimes, it doesn’t.
2- The sound system is too complex. I’m a computer scientist, and every time I try to record something it never works the way I want the first time. The system is really powerful and I was able to do some really cool stuff, but basic recording never worked without some tweaks.
3- Graphics cards. Ubuntu 9.04 destroyed my whole system…
I know most of this can be solved one way or another, but anyway, those are 3 things that should be more user-friendly.
Of course, for Apple it’s a lot easier to make stuff work; they have really few different types of hardware to deal with… Maybe we should encourage people to buy computers and notebooks that already come with Linux installed (and a good distro that truly supports the hardware). Most of the problems should go away.
yay Tess is back! Interesting side-comments she had on Apple.
A good follow-up. This episode is much better than ep. 21 in technical content and competency.
Still, one thing: does OSNews do polls? It would be good to see the stats on the people who use X and actually experience crashing. Right now there seem to be too many assumptions.
Additionally, is it possible to show the runtime of the podcasts? Some players don’t download first and play, but stream instead; I’m listening with no idea when it will end.
PLEASE don’t do polls. Unsolicited polls mean nothing, and the internet variety means even less.
Why not? It would be good to laugh my ass off at Thom and the other three guys who can’t configure their desktops and want to blame some unrelated party for it.
http://www.thedailyshow.com/watch/mon-august-17-2009/poll-bearers
These guys commenting about how bad X is have no idea what they are talking about.
This has been my problem too. X is the whipping boy of Linux. HAL, NetworkManager, the toolkits, the ancient System V service management, desktop session management, the Linux audio disaster… these are the real problems. But they are scattered, and it’s hard to put all the blame on just one component. X has some issues, so everyone dogpiles on X, missing the fact that it is only a small part of the problem with Linux on the desktop. If X stayed as it is and those other bits were cleaned up and better integrated, nobody would care that 3D acceleration wasn’t quite up to par with Windows, or that the X server occasionally crashed.
I can save you guys 56 minutes of your life and summarize the conclusion of the one speaker in the broadcast who spoke on behalf of X/Xorg:
It’s hard and it’s difficult and it’s complicated.
There it is. Now mind your own business and be patient, you people who expect it all to work and your apps not to crash when X crashes. We have lots of excuses and no (standard) solution in the Xorg world!
From a user’s perspective, I don’t care that it’s a challenge. The OS web site claims we will all reach Utopian nirvana with Linux, and that’s why we install it, not to muck around with xorg.conf and guess modelines.
Lately I’ve been trying to make sure that the desktop will be visible if I plug in only my LCD monitor and not the LCD+CRT. It’s not easy, because Xorg sends too high a refresh rate to my LCD without a CRT plugged into the other video port on the video card. When I installed F11 I had two monitors plugged in, but now I want only one plugged in, which has Xorg confused. Simple things like this ruin the experience. I know for certain there is a fix, but F11 isn’t sophisticated enough to take a voice command of “Fix it” just yet.
(edit: and I’m certain the LCD is not telling Xorg/F11 that it can display that refresh rate)
I’ve been using home computers since 1984, have supported thousands of users on their computers, and have supported large and small businesses. I will summarize my feelings about X and Linux:
Linux will not succeed in a big way on the desktop, so don’t worry about the X/Xorg issues. It’s a moot point.
From a normal user’s perspective, these are the problems modern distributors have to fix. They may not fix Xorg alone, but they can fix the annoying problems around it. I think Bulletproof-X has shown what’s possible. It is a nice improvement.
The so-called “audio disaster” is very bad, not only for users but also for Linux’s image. I still don’t know how, for example, Canonical could ever include PulseAudio in Ubuntu 8.04. I hope that PulseAudio will be usable in Ubuntu 9.10. Btw, there are still Linux distributions not using PulseAudio, e.g. Xubuntu, ZevenOS, …
I will stick with ALSA and will test PulseAudio when Ubuntu 9.10 reaches beta and we have to consider which audio system we will use for ZevenOS 2.0.
And we will take a deep look into it, testing applications like Skype, Flash, and all the other applications that a normal desktop user would use (also audio recording, of course).
I think that is one thing Ubuntu failed to do in the last releases (e.g. the xserver-xorg-video-intel and kernel incompatibility which resulted in an unusably slow graphics system).
I hope that this will improve in the next versions, and that we can improve it as well.
jjmckay – the problem is that, yes, XOrg is complicated. But the reason XOrg is doing so well these days is that they *don’t* tell people to mind their own business. I could talk about the bad old days of XFree86, but that’s getting on to another topic.
You shouldn’t have to muck around with xorg.conf now, and you REALLY shouldn’t need to mess with modelines. Every monitor made in the last 10 years has EDID and will tell X what it supports. Unfortunately, there is still a lot of old information on the internet. If you modified your xorg.conf, X will take that as the word of god, so if you removed a monitor, it will still act as if it is there.
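As an illustration, a stale monitor section of roughly this shape (the values here are hypothetical) left in /etc/X11/xorg.conf will override whatever the attached monitor’s EDID reports; deleting or renaming the file lets the server autodetect again:

    Section "Monitor"
        Identifier   "Monitor0"
        HorizSync    30.0 - 107.0     # ranges copied from the old CRT
        VertRefresh  50.0 - 160.0     # too fast for the remaining LCD
    EndSection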
The best thing users can do is make sure there are more developers on X. XOrg is getting by with such a skeleton team of devs, it is unreal.
How I would suggest getting more developers: point people who know how to write code and want a challenge at XOrg; make sure Google SoC money goes to X; make sure distros that have money to spend know to spend it on X; donate money; get other users together and fund payouts for developers doing work on X.
Thanks for the thoughtful reply. Well, that’s the thing: when I look at Firefox and how well it’s doing, there’s no big need for users to try to get developers involved, because Firefox seems to inspire developers to code for it without much advocacy. X/Xorg, on the other hand, doesn’t seem to be inspiring people to code for it, for whatever reason. Maybe graphics work is top-dollar, high-paying work, and a massive graphics-system overhaul is not something that’s going to happen for free and by itself, unlike a new ls command.
I care, yes. But the basic model of how Unix/Linux does GUIs seems antiquated from a local desktop perspective. In the 1970s we had the mainframe mentality and that was fine, but technology evolved. Sure, many admins still use remote X sessions, and it’s efficient and elegant for that. (I love ssh2’d X sessions, hehe)
I have a basic idea of how Xorg needs to evolve or be replaced. The idea is similar to what happens with Direct3D on Windows, where instead of evolving old ideas to accommodate new tech, they just walk around the old tech and develop something new.
More specifically, let’s say KDE could use an alternate rendering path; call it xorg3d v1. Then any app that assumes it’s using X is actually tunneled by the KDE renderer (or compositor) through the new path (system calls, etc.) that uses a local framebuffer or whatever is needed. Yeah, my knowledge is limited, but I think the idea might point to what needs to happen. For all I know this has already been done, but I’ve never heard of GNOME or KDE being able to use any display route other than X/Xorg, except perhaps Compiz, and I think even that still hangs on to legacy stuff from X. I don’t know.
Just as the KDE guys have redone KDE from scratch, so too does Unix/Linux land need to start over, for the sake of desktop users. I don’t see why it can’t be done in a way that is compatible with and invisible to apps that assume X/Xorg.
Note:
I had no xorg.conf when Xorg broke after removing the powered-off second display (the CRT), so I’m sure I didn’t break Xorg myself; by default F11 doesn’t have an xorg.conf, and I have yet to copy a skeleton xorg.conf into the right directory.
Yes, your knowledge is limited. There are alternatives to X (or have been), some of them still around, like DirectFB. Many GTK+ applications can supposedly run directly on the framebuffer. But you know what? X is still the best solution around. It actually works pretty well, so much so that no alternative has ever shown up as more than a blip on the radar. Even now that X is basically opening up video mode setting and direct rendering to the rest of the world with KMS and related changes, I see no big projects making use of those to build a new, more efficient rendering system. If a new rendering system were so self-evident and so very much needed, then surely enough people would be working on it. I mean, other forks and replacement projects have proceeded just fine (GNOME came about to deal with the licensing problems of Qt, and now it is arguably the stronger of the two desktops).
So the reality is likely that X really is fine in most regards. It needs some polish, and a few more old bits need to go away; no doubt about it. But as a rendering system it does the job just fine. And, as I surmise, the real problem is in the toolkits and the DEs, which fail to provide an integrated and stable platform for applications.
@jabjoe, XCB’s the one; I didn’t recall the name. But on the other hand, how can you convince projects that are already using XLib to switch? It breaks compatibility with older Unixes for a nicer API (and still requires rewriting parts of the app that currently work). A shame, really.
@jjmckay, agreed that this is more complicated than people give it credit for. While I never did Xorg development myself, my experience is that making something simple for the user can end up being fairly complicated for the developer. When hardware compatibility is involved, it’s worse.
@diego, maybe they have no idea, but theirs is more of a user perception; nothing wrong with this, but when outsiders (non-Xorg core devs in this case) speculate on a project’s direction, I find it almost laughable.
Maybe the problem is the X11 API, maybe it’s the drivers, maybe it’s sloppy code? Speculating on this without some understanding of the Xorg internals is stupid; better to interview someone who knows what they’re talking about.
“maybe they have no idea, but theirs is more of a user perception; nothing wrong with this”
That’s what I mean: nothing wrong with constructive criticism; in fact, that’s good.
“when outsiders (non-Xorg core devs in this case) speculate on a project’s direction, I find it almost laughable”
Agreed, that’s what I tried to say.
“Speculating on this without some understanding of the Xorg internals is stupid; better to interview someone who knows what they’re talking about.”
Exactly.
I thought it was compatible, but on checking, it’s not. But that’s not as big a deal as it could be:
“Xlib/XCB provides application binary interface compatibility with both Xlib and XCB, providing an incremental porting path. Xlib/XCB uses the protocol layer of Xlib, but replaces the Xlib transport layer with XCB, and provides access to the underlying XCB connection for direct use of XCB. Most distributions nowadays use Xlib/XCB for their libX11, because by opening a single connection to the X server this allows to mix usage of XCB and Xlib in an application and the libraries it uses.”
http://en.wikipedia.org/wiki/Xcb
Most applications aren’t linked directly against XLib anyway, but against Qt/GTK; those could/should be changed to take advantage of XCB, especially since Xlib/XCB is underneath anyway.
From what I’ve skimmed, other Unixes also have XCB.
In some ways it’s bad that XCB has neither completely succeeded nor completely failed. But with Xlib/XCB as libX11 and everyone using widget kits anyway, it’s not too bad, as long as the widget kits get the most out of what is below.
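A minimal C sketch of the hybrid arrangement the quote describes, assuming a libX11 built on XCB (link with -lX11 -lX11-xcb -lxcb): legacy Xlib code keeps working while new code pulls the underlying XCB connection out of the same socket.

    #include <stdio.h>
    #include <X11/Xlib.h>
    #include <X11/Xlib-xcb.h>    /* XGetXCBConnection() */
    #include <xcb/xcb.h>

    int main(void)
    {
        Display *dpy = XOpenDisplay(NULL);      /* the Xlib world */
        if (!dpy)
            return 1;

        /* The same socket, seen from the XCB side. */
        xcb_connection_t *c = XGetXCBConnection(dpy);

        /* Old Xlib calls and new XCB calls now share one connection. */
        printf("Xlib sees %d screen(s); XCB setup lists %d root(s)\n",
               ScreenCount(dpy),
               xcb_setup_roots_length(xcb_get_setup(c)));

        XCloseDisplay(dpy);   /* closes the shared connection for both */
        return 0;
    }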
But maybe I’m missing some important facts.
Thanks for looking into this,
For Blender3D we use X11 directly, but only to set up the OpenGL context, pass on events, and do copy/paste; there are only about ~2000 lines of Xlib code in total.
We also have some crazy guys who still support IRIX (and Solaris, but at least that’s not EOL’d yet); not sure if this is supported without static linking, would need to look into that too.
While Xlib is a total PITA to work with, it’s rare that this code needs updating, so there’s not much incentive to switch.
The only thing that would make me interested in doing this is if it were faster or somehow gave a better user experience.
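For readers wondering what those ~2000 lines boil down to, here is a bare-bones sketch of the core job (choosing a GL-capable visual, creating a window for it, and binding a GLX context). This is a generic illustration, not Blender’s actual code, and all error handling and event plumbing (the bulk of the real work) is omitted.

    #include <X11/Xlib.h>
    #include <GL/glx.h>

    int main(void)
    {
        Display *dpy = XOpenDisplay(NULL);

        /* Ask GLX for a double-buffered RGBA visual with a depth buffer. */
        int attribs[] = { GLX_RGBA, GLX_DOUBLEBUFFER, GLX_DEPTH_SIZE, 24, None };
        XVisualInfo *vi = glXChooseVisual(dpy, DefaultScreen(dpy), attribs);

        /* The GL visual may differ from the root's, so supply a colormap. */
        XSetWindowAttributes swa = {0};
        swa.colormap = XCreateColormap(dpy, DefaultRootWindow(dpy),
                                       vi->visual, AllocNone);
        Window win = XCreateWindow(dpy, DefaultRootWindow(dpy), 0, 0, 640, 480,
                                   0, vi->depth, InputOutput, vi->visual,
                                   CWColormap, &swa);
        XMapWindow(dpy, win);

        /* Bind an OpenGL context; GL calls are legal from here on. */
        GLXContext ctx = glXCreateContext(dpy, vi, NULL, True);
        glXMakeCurrent(dpy, win, ctx);

        /* ... render, glXSwapBuffers(dpy, win), process XNextEvent() ... */
        return 0;
    }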
After reading:
http://xcb.freedesktop.org/opengl/
It does sound like there isn’t much point for you, bar the nicer API. Maybe I’m missing some win about XCB events, but I’m not seeing it.
The point of XCB is that it doesn’t do all sorts of behind-the-scenes caching and round-tripping like Xlib does. It gives you something a bit more on the raw side as far as accessing the underlying protocol goes. This can be used to reduce latency and unnecessary round-trips. The API isn’t needlessly complex, and it is not intended to be an application-writer’s API. Rather, it is intended to be used by toolkits and other mid-level libraries and frameworks, which then present a nice API to application developers.
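A small C sketch of that latency win: with Xlib, each XInternAtom() blocks for its own server round-trip, while with XCB the requests are all fired first and the replies collected afterwards, overlapping the waits.

    #include <stdint.h>
    #include <stdio.h>
    #include <stdlib.h>
    #include <string.h>
    #include <xcb/xcb.h>

    int main(void)
    {
        xcb_connection_t *c = xcb_connect(NULL, NULL);
        const char *names[] = { "WM_PROTOCOLS", "WM_DELETE_WINDOW",
                                "_NET_WM_NAME" };
        xcb_intern_atom_cookie_t cookies[3];

        /* Phase 1: queue all three requests without waiting. */
        for (int i = 0; i < 3; i++)
            cookies[i] = xcb_intern_atom(c, 0, (uint16_t)strlen(names[i]),
                                         names[i]);

        /* Phase 2: collect the replies; roughly one round-trip in total. */
        for (int i = 0; i < 3; i++) {
            xcb_intern_atom_reply_t *r =
                xcb_intern_atom_reply(c, cookies[i], NULL);
            if (r) {
                printf("%s = atom %u\n", names[i], r->atom);
                free(r);
            }
        }
        xcb_disconnect(c);
        return 0;
    }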
It was great to finally hear Tess, the Linux chick, say something about the subject. The level of the debate is starting to increase now.
I think the possibility that making X more robust to driver flaws could carry a large efficiency cost was discussed very little. It would also have been great to hear more about what actually happens when a driver “crashes”. What goes wrong in these programs? How could it be avoided? Do we need some more advanced driver API, with garbage collection or whatever?
And how easy is it for a “normal person” to start writing some sort of driver for Windows, so we can really test the safety mechanisms that supposedly exist there today? (For example, we could run some performance tests on that garbage collection or whatever…) We can’t simply trust them; we have to look at the guts of this Windows driver firewall somehow.
It’s very hard to compare the Windows and Linux driver worlds. How can we be sure it’s really the newer Micros~1 systems that are more robust, and not the drivers that improved? Are they still collecting that data, and does it log when a “near-crash” happens?
*** [From here on, just a long train of standard thoughts about free software I wrote while I was thinking about something better to do. Apologies for the long posting.]
Now, regarding the “human” aspects of the controversy: I think most of the flamewar around Thom’s statements comes from his insistence on keeping a “customer” position, complaining about free software as if it were just another program he is using. I understand that for the end user it makes no difference: he just wants to see things working. But it’s not fair to complain in the same way.
Actually, I think he (and many others) feels even more entitled to criticize free projects than closed ones, maybe because they are much closer to their user communities. But just as some people feel more comfortable throwing stones at the free “train wrecks”, so the defenders are more passionate. You won’t ever hear proprietary software users, and especially developers, step into a hot debate like this. Companies prefer to write polite and confusing letters, when they do anything at all, while their users just wait to see what the company will do, saying “their stuff Just Works for me, so shut up”.
The best analogy to make here, to analyze the fairness of complaints, is to think about food (instead of cars). If you go to McDonald’s and they sell you a bad Big Mac or a rotten McApple, you stomp your feet, scream at the manager, and then you get replacement food (hopefully in good shape) plus some vanilla ice cream. Or maybe your money back. There are guarantees, because it is a service a company is providing you. You can also make all those complaints about how things should be done, because it is your money that supports the company’s existence, after all.
Free software, on the other hand, is the soup at the church, or the apple pie your neighbor made and gave you a piece of. You just shouldn’t complain loudly to the neighbor who gave you a free piece of pie if it happens that they cooked it using some technique you didn’t like. It could even be something serious: suppose you find out this neighbor of yours doesn’t wash his hands, or the apples, before cooking. This may crash your system for good. What do you do? Eat the pie while screaming at him that he should cook properly? No, you don’t actually eat the damned thing; you just say (lie) “oh, thanks, but no, I just ate something else”. He might even be offended if you accept and then complain, and take the pie back from you. (This last act is not possible with free software.)
Of course you are right to complain about food that has been badly prepared, but you just don’t talk the same way you would in a restaurant. It was charity, or “voluntary work” if you will, and you should always feel a bit humble about it. Free software will always have this characteristic, even when it is developed by companies or backed by humane millionaires.
So, please, do not complain about Linux and X.org or whatever like some dissatisfied customer who thinks he deserved better. The most you can say is “oh, it was too good to be true”. I am sure you could have used your blunt instrument in a more constructive way.
BTW, I don’t think open source inherently implies better or more secure software, so don’t throw me in that bag, please. The freedom of free software is by far the characteristic I care about most. The second is that FLOSS usually works in ways I prefer, but that is more or less a coincidence.
I would keep using Linux even if it were less stable. I have done so in the past, actually, going through suffering similar to what you described in your first article… But I never felt much aggrieved, because I never considered that people wanted me to be using Linux the way companies want you to choose them. It was me who wanted to use it, and I always felt privileged to be subjected to all those crashes and data losses. I wrote my master’s dissertation on a Debian machine, and fortunately had few system problems at the time; I kept the machine for a while and avoided major upgrades until I finished the job.
I stopped trying to convince people to use FLOSS some time ago, because using and developing free software are, in a way, acts of heroism. And you don’t ask people to perform acts of heroism; you just explain the situation and wait to see what they decide. If a soldier goes on a suicide mission, it surely isn’t because his sergeant screamed at him. This is the same politeness you should generally show when criticizing free software. Or else you will see those numerous angry people who are usually dismissed as “zealots”.
***
Linux is not a puny competitor to other mighty proprietary operating systems, trying hard to emulate their quality or offer their latest features. It is a project that exists in another dimension. Switching from one platform to the other is a big step, and it is very different from moving, for example (or for lack of an alternative), from Apple to Microsoft.
You can freely say “wouldn’t it be great if X were somehow more robust to video driver failures?” That is an interesting thought, and I liked hearing all about this subject. But I wish we could have skipped the whole “I am giving up Linux and moving back to Windows” testimonial, because it touches many subjects other than the technicalities of how the programs work. This is much more of a wetware problem and much less of a {hard,soft,firm}ware one.
Next time, please, just avoid letting people call you “Traitor!! Reactionary!!”
An extremely well-written comment, and I love your analogy about food and that which is given free (or, as you put it, by charity).
That can be an overly romantic view of the free software world, though. Linux is backed by major corporations that contribute full-time developers.
I’ll give you another analogy: homemade jam bought from supermarkets. It has a picture of a thatched-roofed cottage on the label, it has some scrumpy name like “Ma’s Traditional Homemade Preserves Co.”, and the lid is in a gingham pattern to resemble the old days of stretching fabric over the top. The reality is that it is made in a highly organised factory with paid workers, and ’Ma is actually the strict bullwhip-carrying West German factory manager.
Linux and its various components power 60% of the world’s webservers. It powers countless routers and networking devices. Millions of dollars are pumped into it annually.
I think that with all that investment, it should and can be held accountable, even if you don’t have to pay up front for the product.
Back to the food analogy: instead, this time imagine that you are a McDonald’s shareholder, or a franchise owner. You have invested your money in McDonald’s, and you see your store, or any store, underperforming. This is a threat to your investment, and you have every right to hold them accountable.
In the world of big business, open source can’t just shrug off responsibility when it feels like it; the industry has changed forever.
Big business can put up money to get the work it needs done, done.
No, the problem is home users, and there the neighbor’s apple pie analogy is right. The reason this works at all is that a large enough percentage of users are programmers who have the time, the skills, and the interest to fix the bugs that come up and fill in the feature holes they can. This is complemented by what big business is doing, and by all the shades of grey between the two extremes. The GPL’s stickiness holds it all together.
There are many things that could speed up development that only home users care about. A big one, I think, is hardware companies playing ball: providing specs, if not actual engineering time. Binary blobs are a fudge; it’s not really taking part, it’s being at the party but sitting in the corner talking to no one, being antisocial. They have pie, but no one else is allowed any; it’s theirs alone. That’s why projects like Nouveau are so important: they are the process of kicking out those who didn’t want to be there anyway.
I think you answered your own question. Servers and routers don’t need a desktop, so the money doesn’t go there.
I love desktop Linux. I love its spirit, its quirks, and its uneven quality. I love its universe of choices. I also love the fact that it is an unpopular niche curiosity.
I’m not a zealot, though. The Amiga burned up my zealotry. The day Linux breaks 10% market share on the desktop is the day I move on to BSD or OpenSolaris or something else. I say that because in order for Linux to achieve that kind of popularity, it would have to become a slicked-up, focus-grouped monolith, with its universe of choices left on the cutting-room floor.
This time I have to disagree with you.
Making X crash-safe is purely a matter between the X server and the X libraries on the application side.
All the library has to do is cache the necessary information about windows and views (or whatever X uses), restart the X server when it’s down, hand over the cached information, and do a full redraw of the window; roughly along the lines of the sketch below.
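To be clear about how much hand-waving that involves, here is a heavily hedged sketch. XSetIOErrorHandler and XOpenDisplay are real Xlib calls, and the I/O error handler must never return (Xlib would exit()), hence the longjmp; recreate_windows_from_cache and run_event_loop are hypothetical stand-ins for the caching and redrawing described above. Note that all server-side resources (windows, pixmaps, GCs, atoms) die with the server, so the client-side cache would have to hold everything needed to rebuild them.

/* Rough sketch of client-side recovery from an X server crash,
 * assuming the application caches all its window state.
 * Compile with -lX11. */
#include <X11/Xlib.h>
#include <setjmp.h>
#include <stdio.h>
#include <unistd.h>

static jmp_buf reconnect;

/* Xlib calls this when the server connection dies. It must not
 * return (Xlib would exit the process), so jump back out instead. */
static int io_error(Display *dpy) {
    (void)dpy;
    longjmp(reconnect, 1);
    return 0; /* never reached */
}

/* Hypothetical: rebuild windows, contexts, etc. from the cache. */
static void recreate_windows_from_cache(Display *dpy) { (void)dpy; }
/* Hypothetical: the application's normal event/redraw loop. */
static void run_event_loop(Display *dpy) { (void)dpy; }

int main(void) {
    XSetIOErrorHandler(io_error);
    for (;;) {
        if (setjmp(reconnect))
            fprintf(stderr, "X server gone, reconnecting...\n");
        Display *dpy = XOpenDisplay(NULL);
        if (!dpy) { sleep(2); continue; } /* wait for the server restart */
        recreate_windows_from_cache(dpy); /* hand over the cached state */
        run_event_loop(dpy);              /* full redraw happens in here */
    }
}

Whether a toolkit could really do this transparently is exactly the open question; the sketch only shows that the reconnect half is the easy part, and the state cache is the hard one.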