Linked by Howard Fosdick on Sat 7th Jun 2014 00:53 UTC
Over the past several years, mobile devices have greatly influenced user interfaces. That's great for handheld users but leaves those of us who rely on laptops and desktops in the lurch. Windows 8, Ubuntu Unity, and GNOME have all radically changed in ways that leave personal computer users scratching their heads.

One user interface completely avoided this controversy: Xfce. This review takes a quick look at Xfce today. Who is this product for? Who should pass it by?
Thread beginning with comment 590330
RE: Not so light under the hood
by Doc Pain on Sat 7th Jun 2014 10:06 UTC in reply to "Not so light under the hood"

While the XFCE components are lightweight and modular, they have some not-so-light dependencies. The power manager depends on consolekit, polkit, udisks and upower; polkit in turn depends on spidermonkey, a JavaScript interpreter! Similarly, the Thunar file manager automount plugin depends on consolekit, polkit and udisks.


And sadly, those dependencies make it less portable with regard to non-Linux operating systems. :-(
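If you want to see these dependency chains for yourself, the package tools can print them. A minimal sketch, assuming the usual package and .pc file names (they vary between distributions and ports trees):

```shell
# Distribution-specific ways to list what xfce4-power-manager pulls in
# (package names are assumptions; adjust for your repo or ports tree):
#   FreeBSD:       pkg info -d xfce4-power-manager
#   Debian/Ubuntu: apt-cache depends xfce4-power-manager
#   Arch:          pactree xfce4-power-manager
# A portable spot check via pkg-config: what do polkit's development
# files require? (Falls back to a message if the .pc file is missing.)
deps=$(pkg-config --print-requires polkit-gobject-1 2>/dev/null \
    || echo "polkit-gobject-1.pc not installed")
echo "$deps"
```

On a system with polkit development files installed this typically lists the glib/gio requirements; elsewhere the fallback message is printed instead.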

I've been a big fan of XFCE (v3) and Xfce (v4), but in my limited experience it has become less usable on FreeBSD (my primary OS). While it works as a whole, some functionality (especially power- and disk-related features) requires specific tweaking outside of Xfce to "make it work by different means", and those means are quite unpleasant. A few months ago I tried to get "everything" running on FreeBSD 10, but I ended up moving to Gnome because Xfce no longer delivered the expected results, while requiring many system services and using more resources than I had expected.

So it's not just about the number of dependencies it will install, but also about the services it requires at runtime. That might be no problem when running Linux on a recent PC, but for older computers (non-multicore, less than 1 GB RAM, no 3D graphics card, and so on) it's definitely not so good, especially when not running a tailored Linux (no mainstream "big ass" distribution which includes, installs, and runs everything plus the various kitchen sinks).

Furthermore, I must admit that I miss the simple yet "powerful enough" interface of XFCE (v3), which was a configurable CDE lookalike. Sadly, it is no longer maintained: it requires Gtk 1, has no Unicode support, and does not integrate with "system services" the way Xfce (v4) does. Still, it was very fast, had "hooks" to make things work (like xfmount for dealing with disks), and didn't require much learning. In this "traditional" sense it was a perfect replacement for users coming from a Sun Solaris background (with CDE), but also easy to adopt for people coming from "Windows" land. (And I still have a P2 system running it on top of FreeBSD 5, including office, multimedia, graphics, and programming applications; it works perfectly.)

Earlier versions of Xfce (v4) were also on the FreeSBIE live system CD, and it's still a good GUI environment for systems run from optical media: to try out an OS, to use as an emergency system, or simply for testing purposes.


tylerdurden:

Honestly, that's more an issue of the BSD folks IMO; the source is there.


Doc Pain:

Honestly, that's more an issue of the BSD folks IMO; the source is there.


I'm not sure it is that easy. Sure, the source of the Linux kernel (which is a primary dependency) is there, but the BSD kernels are different. Maybe it's even impossible to implement certain things that are too Linux-specific (cgroups, u{dev,disk,power,whatnot}). Remember that it's not just about porting or patching simple things: Xfce depends on many kernel facilities and system services that do not exist in BSD. And imagine the fun if systemd becomes a required dependency... ;-)

Maybe it's also about "wasted work". Take HAL, for example. BSD was lagging behind in HAL support when HAL became a major dependency of KDE, Gnome, and X itself, often together with D-Bus. By the time it started working reliably, it had already been obsoleted on Linux, which moved on to the "u* framework". Still, HAL stuff is stuck in many components of the system, which have to be kept working or a massive loss of functionality would follow. There probably has to be some reasonable judgment about whether it's worth the trouble. It could also happen that a fork is created, free of the "Linuxisms", probably lacking certain functions for some time until they are re-implemented in a BSD-specific or even generally portable manner.

But those are just my individual assumptions. If you are interested in details, you should contact the BSD folks directly.


coreyography:

Perhaps all these DEs should just quit claiming compatibility with "Unix-like OSes" and just say "Linux".

The BSDs have several issues:

1. Less manpower than Linux working on this stuff, especially when you're not talking about FreeBSD;

2. The rapid changing of Linux's interfaces (the hal/*kit/u*/systemd saga someone referred to is an example), especially when some in the BSD community feel this is the result of not enough thought and proper engineering up front, causing a lot of "scrap and start over" later on.

At least one BSD distribution (PC-BSD) felt that the Sisyphean task of implementing frequently-changing Linuxisms just to make DEs work is not worth the effort, and is building its own DE (Lumina).


bassbeast:

The problem with keeping older computers is this: pretty much anything made during the so-called "MHz Wars" from '93 to '06 had zero engineering time devoted to power saving, so if you sit down and do the math on the amount of useful work you get for the amount of power you use, it's just not worth keeping.

Let's say that older PC you are talking about is a late-model P3, say a 1 GHz part. According to CPU-World, a Coppermine 1 GHz P3 uses 25 W, and of course that 25 W is constant, since there are no energy-saving features in these chips. To give you something to compare it to: an AMD Sempron quad in socket AM1 uses 25 W WHILE giving you full HD (the Sempron is an APU) AND four cores AND an extra 400 MHz per core AND full surround sound AND GB Ethernet AND... see the problem?
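To put that in numbers, here is a rough sketch of the annual energy use, taking the quoted 25 W figures at face value and assuming a hypothetical 5 W idle draw for the modern APU (the idle number is my assumption, not from the comment above):

```python
# Rough annual energy math for the P3-vs-Sempron comparison.
# Assumption: the P3 draws its rated 25 W around the clock (no power
# management); the modern APU draws 25 W under load and ~5 W at idle.
HOURS_PER_YEAR = 24 * 365

def annual_kwh(watts, hours=HOURS_PER_YEAR):
    """Energy in kWh for a constant draw of `watts` over `hours`."""
    return watts * hours / 1000

p3_always_on = annual_kwh(25)  # 25 W, 24/7
# Modern APU: 8 hours/day under load, 16 hours/day idle.
sempron_mixed = annual_kwh(25, 8 * 365) + annual_kwh(5, 16 * 365)

print(f"P3, always on:          {p3_always_on:.0f} kWh/year")   # 219 kWh/year
print(f"Sempron, 8 h load/day:  {sempron_mixed:.0f} kWh/year")  # 102 kWh/year
```

So even at identical rated wattage, the lack of any idle power saving roughly doubles the old machine's yearly consumption, while it delivers a fraction of the work.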

Frankly, the older systems made before the advent of the Core series on the Intel side and pre-AM2 on the AMD side are really not worth keeping; the amount of power you use versus the amount of useful work you get just doesn't add up.


Doc Pain:

The problem with keeping older computers is this: pretty much anything made during the so-called "MHz Wars" from '93 to '06 had zero engineering time devoted to power saving, so if you sit down and do the math on the amount of useful work you get for the amount of power you use, it's just not worth keeping.


Yes, this is correct most of the time. Still, if you have to work with what you've got, you can turn a (quite low-power) Pentium 1 machine into a usable file server. Of course you get much better results if you're willing to invest money, for example in low-power mainboards (usually ARM-based) and "eco disks" or SSDs. On the other hand, wasting a 2 GHz computer with a fat GPU and an 800 W power supply just for browsing "Facebook" doesn't sound that appealing either. :-)

Older PCs are still found in many places, and there are still people who want to use them for something instead of participating in the annual "throw away, buy new" dance that keeps the industry happy. Those people used to be happy installing Linux and Xfce on such systems, and it was no problem to use them, because they were sufficiently fast and secure (unlike, for example, when people try to install pirated copies of outdated "Windows XP" on them). And if resources were too low to run "mainstream Linux", those people would simply switch to a different OS like FreeBSD or OpenBSD and still use Xfce for its lightweight but powerful features.

With Xfce no longer able to deliver portability and efficiency, other, more lightweight desktop environments (and maybe even preconfigured, tailored window managers) could become more interesting as a base for a fully-featured system consisting of OS, desktop, and application software. But as soon as you add "too fat" applications to the mix, you're back at the initial problem. :-(

There are also non-profit organisations in the business of avoiding the huge pile of office waste (computers and printers) by installing such machines with Linux and donating them. This is especially interesting for people who want to learn about computers and gain experience but simply cannot afford to buy a new one, even though computers get cheaper and cheaper. But with the continuous "renewal" especially of smartphones (buy a new one every year, throw the old one into the garbage can), tablets, and laptops, maybe "component-based" PCs will also become less and less relevant to the general public. And when people don't see the waste they're creating, they don't care. (Maybe I just grew up with the wrong mentality; I don't feel comfortable throwing away something that fully works just because industry tells me it's "old".)

Frankly, the older systems made before the advent of the Core series on the Intel side and pre-AM2 on the AMD side are really not worth keeping; the amount of power you use versus the amount of useful work you get just doesn't add up.


Basically, I agree with this, but allow me to add:

In today's PCs, the ratio is still better, and given the assumption that you hardly use 10 % of the machine's resources, the waste of energy is smaller (not in relative, but in absolute terms). The problem often isn't that the hardware stops working, but that the software demands more and more resources to perform basically the same tasks (from a user's point of view), which is compensated for by buying better hardware, creating toxic waste as a side effect, just to keep the same "average usage speed". On the other hand, there are many features "hidden" from the user that depend on the availability of 4 GB of RAM, a 3D-enabled GPU, or multiple CPU cores; without the general (and increasingly cheap) availability of those, development would not go in that direction. Maybe that's also the reason why there is so much bloatware: nobody notices when something is inefficiently programmed, because it's cheaper to simply buy faster computers than to perform an efficiency-oriented code rewrite. (You usually find this mentality in business software.)

For example, I recently saw a top-of-the-line PC running "Windows 8", and I was surprised how terribly slow everything was. The person in question had owned many high-end PCs over the years, all equipped with the then-current "Windows", and he told me that he never actually noticed anything becoming faster, even though he always bought the newest and fastest PCs; instead, he felt like the software became slower with every release. And people somehow accept this as "normal"... now imagine what you could achieve with such hardware if you just added the proper software!
