A reader asks: Why is Linux still not as user friendly as the two other main OSes with all the people developing for Linux? Is it because it is mainly developed by geeks?
My initial feeling when reading this question was that it was kind of a throwaway, a slam disguised as a genuine question. But the more I thought about it, the more intrigued I felt. There truly is a large number of resources being dedicated to the development of Linux and its operating system halo (DEs, drivers, apps, etc.). Some of these resources come from large companies (IBM, Red Hat, Novell). Why isn’t Linux more user-friendly? Is this an inherent limitation of open source software?
I won’t pretend to give an authoritative answer to this question. All I hope to do with this article is posit a few possibilities and open the topic for discussion. First, I think we should try to clarify the question by defining user-friendliness. Often, user-friendliness is conflated with beginner-friendliness, and this is a grave error. Just because the use of an object isn’t immediately obvious to a new user doesn’t mean that it’s not well-designed or even that it’s not easy to use. I have a nail-pulling tool that’s unquestionably the most effective nail puller you can buy. But whenever I hand it to someone who’s never used one, even an experienced carpenter, they give me a blank look, and it usually takes them five minutes or so to figure out how it works. There are nail pullers that are much easier to understand, but they don’t work as well: they take longer and damage the wood more. The thing I like about the slide-action pullers is that even though they take a few minutes to learn, once you know the trick, they save you several minutes with each nail. Over a lifetime, a tool like this might save a professional carpenter several entire workdays. You could say the same for computer users.
Point-and-click interfaces are much easier for computer neophytes to understand. You can click and hunt through the menus to figure out what to do, and once you have the basic concept of moving the mouse and navigating menus and buttons figured out, you can muddle through most tasks. Most importantly, you don’t have to memorize any arcane commands. But for all-day computer users, mousing through menus wastes a lot of time and makes tasks hard that could be easy. Learning a few keyboard shortcuts for the things you do every day (opening windows, saving, etc.) and learning to use the command line, or command line-like tools (such as Quicksilver or Colibri), will give you a major productivity boost. In short, the very attribute that makes a tool easy to learn often makes that tool less useful to an expert.
In some cases, “easy-to-use” tools aren’t better even for newbies. Training wheels for bicycles don’t really help kids learn to ride a bike properly, and in fact can easily delay learning. A CLI can be superior for neophytes in some cases because it can be much easier to complete a complex procedure by copying and pasting command-line strings from a tutorial or forum posting than by following a circuitous, screenshot-laden walk-through of the same procedure in a GUI.
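To make the copy-paste point concrete, here is a small sketch (the task is my own illustration, not from the article): finding the largest files under a directory takes a long, screenshot-heavy walk-through in most file managers, but it is a single pasteable line in a shell:

```shell
# Find the ten largest files or directories under the current directory.
# A newcomer can paste this whole line from a forum post and run it,
# with no menu-hunting required.
du -ah . 2>/dev/null | sort -rh | head -n 10
```

The newcomer doesn’t need to understand `du`, `sort`, or pipes for this to work; the command is self-contained in a way that a sequence of clicks never is.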
Conventional wisdom would dictate that the menu-driven GUI of a Mac or Windows machine is the major determinant of its user-friendliness, but I disagree. I actually think that the time-worn “just works” measuring stick is a much bigger factor. Even if you have to learn a few conventions or [gasp] memorize some basic commands, if your computer operates in a reliable and consistent manner, it’s easy to use. I would hasten to add that installing new hardware or software and configuring your preferences should also be as simple as possible and should not break the computer in the process. Using a computer becomes most challenging for any user when something goes wrong, so the easiest computer to use is most certainly the one that never fails for any reason. But since we don’t live in fantasyland, a computer should minimize opportunities for either spontaneous or user-caused failure, and should fail in a way that minimizes collateral damage and guides the user through the proper steps to bring the machine back to proper operation.
This is where we can start to look at the relative user-friendliness of Mac, Windows, and Linux. Traditionally, Macs have been seen as the most user-friendly, and have adhered to the “just works” philosophy. In some cases over the history of the Mac, this has been taken to the extreme, where simplicity has actually stood in the way of power users’ productivity. But “just works” comes at a price, and that is the vertical integration of the Mac marketplace and the dearth of hardware options. In its current incarnation, the Mac OS is reliable, fails gracefully when it fails, makes installing hardware and software easy, and makes configuring features such as WiFi and Bluetooth quick and painless. This user experience exists in large part because of the high priority Apple places on user-friendliness and the tight control Apple maintains over its hardware ecosystem. But it’s not all Apple. Ironically, one of the most important aspects of Mac OS X’s user-friendliness, its reliability, comes from its BSD-based underpinnings. The pre-OS X versions of Mac OS, particularly those after version 7.6 in the late nineties, were not very speedy or reliable, despite being easy to use in other ways. It was only by harnessing some of the inherent advantages of the open source development model that Apple brought its OS to where it needed to be.
Windows is generally thought to be pretty user-friendly, but even a quick survey of the average computer user’s experience with Windows will show you that most people are not at all comfortable with it, especially once something goes wrong. Using Windows for everyday tasks is quite easy, and seems to get better with each version. In fact, I’d say that in some ways Windows has equaled or even surpassed Mac OS X when it comes to basic usability like application launching and file management. Installing software and hardware is not as elegant as on the Mac, but doesn’t pose a problem in most cases. And a clean Windows install on high-quality hardware will run extremely reliably by the standards of the nineties. The problems arise when poor-quality software or hardware is added to the mix, because troubleshooting stability or performance problems in Windows can be very difficult. Also, doing advanced configuration, such as networking, can really be a brain-breaker, even for experienced users.
And thus we come to Linux. Once you get over the initial learning curve, Linux makes a very powerful tool, and its flexibility and stability can even make it a superior one. But Linux suffers from the same problems as Windows when it comes to obscure or poorly-built hardware or software, and troubleshooting and advanced configuration can likewise be difficult for non-expert users. Linux has the advantage of superior reliability, but this advantage gap has narrowed substantially over the past decade as both Windows and Mac systems have made major strides in this area. But where Linux really falls down is in “operating in a reliable and consistent manner.” Getting new hardware or software to work in Linux can be easy, or it can be hard. The methods you might use to make something new work in Linux are varied and, in some cases, quite drawn out. You might need to install frameworks or libraries, and at each step there might be more than one way to do the thing, with no clear indication of which method is best or why. The hardware you want to use might not be supported, or might be supported only by way of complicated hacks. If you want to change a configuration, there may or may not be a user-friendly graphical config tool, and if there is one, it might not work the way it should, depending on other factors. I won’t even start into the whole X server and DE conversation.
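As a sketch of that inconsistency (the list of tools below is illustrative; which of them exists depends entirely on the distribution), even discovering how software is supposed to be installed differs from system to system:

```shell
# Probe for common package managers. Each one implies a different install
# workflow (apt-get install, dnf install, pacman -S, ...), and a given
# tutorial typically assumes exactly one of them.
for pm in apt-get dnf yum zypper pacman emerge; do
    if command -v "$pm" >/dev/null 2>&1; then
        echo "found: $pm"
    fi
done
```

A Windows or Mac user never has to run anything like this; on Linux, the answer to “how do I install X?” starts with “well, which distro are you on?”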
It seems that the most user-friendly OS is also the most tightly controlled. Microsoft brings a lot of trouble on itself by providing so much backward compatibility and by making it easy for its developers to be lazy with cheap hacks and old APIs, but a Windows developer has more freedom than a Mac developer does. Linux developers, of course, have even more freedom, and that’s one reason Linux is all about options. There are at least two ways to do everything. In large part, this freedom contributes to Linux being difficult for non-experts.
But that’s not the whole story. There’s also the decentralized nature of Linux development to blame. Both Apple and Microsoft can rely on a central authority for user interface conventions, and of course can focus their efforts on a single desktop environment. The Linux community is riven by fundamental philosophical disagreements on how to do things, GUI-wise, and even where there’s agreement, the various players are still largely acting independently, or at best are organized into a loose confederation of interested parties.
Then we have the “geeks” angle, as posited by our questioner. Because so much Linux development is driven by advanced users’ imperative to scratch a particular itch, the areas of Linux development that receive the most and best attention are the areas that are of particular concern to its most highly-skilled users. That’s why stability and performance are Linux’s strong suits. Most Linux alpha-geeks are quite content with the everyday usability of their particular setups, and their advanced skills make things like troubleshooting and configuration pretty easy for them. So yes, the fact that “geeks” are developing Linux for themselves is a major contributor to its user-friendliness deficit.
But what about the big companies working on Linux? Don’t Novell or Red Hat have some interest in improving Linux’s user-friendliness? Well, yes, but not really. First, no company working on Linux has nearly as many engineers working on GUI issues as either Apple or Microsoft. Second, they never will, because the Linux companies aren’t very interested in Linux as a desktop or workstation OS. The big money in Linux is in servers and in “enterprise” applications. As a server, Linux is easily as easy to use as Windows or the Mac. Both Microsoft and Apple make great server OSes, but in some ways their adherence to desktop conventions puts them at a disadvantage against Linux and all of its UNIX server heritage. So Linux companies are putting a lot of effort into making Linux more user-friendly, but more user-friendly as a server. Companies like IBM, Red Hat, and Novell do some business setting up big networks of workstations or even desktop PCs, but those tend to be relatively locked-down, centrally-managed systems where professional sysadmins wouldn’t want the users mucking around too much anyway, and the available DEs and apps for Linux really do get the job done in that setting. Or they’re workstations for “geeks,” so the previous point applies. So I wouldn’t expect much action there.
I think the best hope for a sea change in Linux usability would have to be an initiative like Google OS, where Linux is chosen as the underpinning of a new, user-friendly OS by a large company that will unilaterally undertake to create an elegant user interface on top of Linux, much as Apple did with BSD. And just as a Mac user can launch the terminal and get all UNIXy, such a project would still be Linux underneath, and would hopefully still accommodate those people who treasure having half a dozen ways of doing everything.
So this question of Linux user-friendliness can be attacked at its roots, by debating what really constitutes user-friendly, or it can be addressed head-on, by examining what could be done to improve Linux’s accessibility by new users. I’m sure there’s a lot more to say. What do you think, OSNews readers?