posted by Adam S on Fri 17th Sep 2004 10:27 UTC
IconYesterday, a co-worker of mine and I had a lengthy discussion about this article posted on OSNews a while back. My past writing about Linux has centered on general usability and sensible defaults, but his contention was that Linux is the Linux kernel and that anything beyond that is the responsibility of the distribution. The conversation took an interesting turn. Read on for more.

Certainly, the word "Linux" refers to a kernel. I believe, much like the old confederate flag, insistence on what it really is means little when most people perceive it as something else. "Linux," that is, what it means to most of the world, is a complete, rapidly developing, open source operating system. Each distribution provides a mostly unique spin on it, and generally, most people's perception of Linux is actually the spice added by their selected distribution. Some people think Gnome is Linux, some KDE, still others XFCE or Windowmaker. As a result, unsurprisingly, the choices a distribution maintainer makes affect not only their users, but the perception of Linux as a whole.

If you ask me, I'll tell you straight out: I'm all for removal of choice from Linux distributions. Choice is generally good, but too much choice, and worse, uninformed choice, is bad. It's no different from politics: choosing a text editor in Linux without any knowledge of how the editors work is like voting without researching where the candidates stand on the issues. Fine. So let's talk about choice.

As I said in the past, I believe a distribution should choose sensible default options and applications and leave the rest out. A user who wants specific applications should either a) choose the distribution which best suits him by including them or b) download and compile those applications himself. Yet, everywhere I go and everyone I speak with tells me otherwise - Linux is about choice, and removing the choice is akin to a personal insult. It's a strong voice that argues with me: "Choice is good." But it's not consistent.

Why? Because there's another debate going on, unrelated, but behind the same four walls. That debate is about standards. Microsoft's Internet Explorer has some amazing capabilities. Sure, I'll give you that ActiveX and proprietary javascript elements can be terrible and/or security holes, but this is my point: the same community that argues for choice seems to stand behind consolidation and standards. I mean, in the last few years, we've seen the rise of open file formats, XML (and XHTML/XSL/RSS/Atom), and most recently, the new release of the Linux Standard Base (LSB 2.0). Sure, standards make it easier for application developers to provide a consistent and clean experience, but at the heart of it, aren't they also a kind of "removal of choice"? If guidelines tell you where you must store user data, or how your code must be written, doesn't that limit the philosophical choice to do whatever you want? The same way people tell me, "If you don't like emacs, don't use it," why don't people say, "If you don't like IE's behavior, don't code for it"?

I don't pretend to be a supporter of non-standardized applications, protocols, and organizations, but I do wonder why there is such heated debate and inflexibility. There are keywords -- FUD, troll, zealot -- words that people call each other that incite the deepest sense of insult and argument. There is allegiance and faith in hardware, software, programming languages, and interfaces seemingly comparable only to religion in starting all-out "flame wars." Where, I ask, does this come from?!

We accept choice when it suits our needs and reject it when it doesn't.

