Linked by Howard Fosdick on Sat 7th Jun 2014 00:53 UTC
Over the past several years, mobile devices have greatly influenced user interfaces. That's great for handheld users but leaves those of us who rely on laptops and desktops in the lurch. Windows 8, Ubuntu Unity, and GNOME have all radically changed in ways that leave personal computer users scratching their heads.

One user interface completely avoided this controversy: Xfce. This review takes a quick look at Xfce today. Who is this product for? Who should pass it by?
Permalink for comment 590494
demetrioussharpe
Member since:
2009-01-09

Another reason that Linux must churn is that most hardware is badly implemented, so the subsystems must change once in a while because the accumulated hardware-specific workarounds start working against each other.


Sounds like the solution is pretty obvious: stop piling workarounds on top of each other. It's like finding a victim with a shotgun wound to his chest & trying to patch him up with lots of little band-aids instead of applying a proper dressing. This isn't a technical matter; it's a social & managerial matter.

if (HerdingCats(sLinuxDevelopers)) {
    StopAllDevelopment();
    RemoveUnorganizedDevelopers();
    AddFreshDevelopers();
    ActuallyEngineerSolution();
    ImplementSolution();
    ScrapAllOldSolutions();

    if (SolutionWorks()) {
        CommitSolution();
    }
}

It's not rocket science.


Designing things to be perfect from the beginning only works if nothing ever changes. Like I said before, Linux churn is a result of previous Linux churn. BSDs don't experience this because they don't have previous churn to force them. There's no positive feedback loop.


Now you're just making excuses. Churn can always be stopped. Churn could've been stopped when development started on each major version of the Linux kernel, but no one bothered to actually do it. Every project has a beginning, so saying that the BSDs had no original churn is no excuse. Linux could've started without original churn, but it didn't. Linux could've transitioned to a system with less churn, but it didn't & it won't. Like I said earlier, this is a social & managerial matter, not a technical one. By the way, you're right, Linux churn doesn't exist in a positive feedback loop.

So yes, it is somewhat a matter of funding and resources. Change creates more change, and Linux has so many more sources of change from outside that it becomes a juggernaut. Linux very much can't say "stop giving us code" for long.


If a project can't reject badly designed & badly engineered code, then that project has serious issues. What's funny is that you really believe that bs. I'm looking at what you wrote & matching it up with how many years came & went that were supposed to be the year of Linux on the desktop. If your desktop can't stabilize because the kernel is churning faster than a dairy farm produces butter, is it any wonder that the year of Linux on the desktop never arrived? This discombobulated approach to development has forced Linux to play catch-up to TWO OSes, where it originally had to play catch-up to just one. I've been around for the development of both Linux & the BSDs. As trashy as Windows can be, there's no doubt that it's still more seamless than Linux...when it's not crashing.

Reply Parent Score: 2