The founder of the Open Graphics Project writes: “Good design and usability are very important. I haven’t paid enough attention to the discussions between Linus and GNOME developers, so I can’t address it directly. But what I can say is that a learning curve is not a bad thing. While it’s good to think about the total novice, it’s even more important to have consistent and logical mechanisms. This way, if someone has to learn something new to use the computer, they have to learn it only once. This is why I think it’s good that Apple and Microsoft have UI development guides that encourage developers to make their apps act consistently with other apps in areas where their functionalities conceptually overlap. And this is where I start to get disappointed with GNU/X11/Linux systems.”
So if Microsoft and Apple have these UI development guides, why is it that every release of iTunes looks different from the rest of OS X? Office XP had its own menus, then 2003 went all blue. And Windows Media Player always looked different – then when they invented a new flashy UI, they gave WMP11 a different one again!
Yeah, okay, Linux is a long way from perfect in that regard, but if it’s going to be criticised for its inconsistency, can we at least find something to compare it to that is consistent itself?
Could it be that iTunes is cross platform and the Aqua look doesn’t go well with Windows?
iTunes looked different long before it was cross platform
It may have been cross platform internally at Apple from the beginning.
Besides, iTunes doesn’t look or behave that differently from other apps on OS X. iTunes and iPhoto, for instance, have very similar layouts with essentially cosmetic differences.
If you want to complain about an app that IS different from the rest of OS X, why not target Garageband? That is a truly different application.
I didn’t choose iTunes; whoever I was replying to did. Anyway, both Garageband and Shake are good candidates, and you know what, I don’t mind in the slightest. I find that people place far too much weight on the importance of UI guidelines (I’m not saying that they are useless, however).
Shake, for example, looks and feels totally different from just about any other app out there, yet I find it by far the easiest compositing tool to use, despite the fact that it ignores just about every Apple GUI guideline out there.
I haven’t used Garageband much, but from the little I have used it, the non-standard GUI didn’t bother me.
hell, what about the worst of them all, IE7?
Apple bought it and made some small changes
“So if Microsoft and Apple have these UI development guides, why is it that every release of iTunes looks different from the rest of OS X? Office XP had its own menus, then 2003 went all blue. And Windows Media Player always looked different – then when they invented a new flashy UI, they gave WMP11 a different one again!
Yeah, okay, Linux is a long way from perfect in that regard, but if it’s going to be criticised for its inconsistency, can we at least find something to compare it to that is consistent itself?”
You make a good point at the start. While they do have standards to adhere to, even they don’t do it perfectly.
But the important thing here is that you notice when they go wrong — the Office apps using off the wall window colors, WMP looking crazy, etc. Standards matter, and sticking to standards matters. That’s the message to take home from all this.
I guess the author of this article is unaware of the GNOME Human Interface Guidelines (HIG), which must be followed in order for an application to make it into the default GNOME desktop suite:
http://developer.gnome.org/projects/gup/hig/2.0/
KDE has a similar project that’s a major part of the development process for KDE4:
http://wiki.openusability.org/guidelines/index.php/Main_Page
Free software desktops have a great advantage in the area of UI consistency because we have the notion of a cooperative community. In proprietary land, it’s all about branding, marketing, and differentiation. There’s absolutely no motivation for consistency. In free software, following the HIG can help expand your userbase significantly.
The Hypocrite paradox.
People and groups are not perfect; they stretch and break their own rules. It doesn’t mean their rules are bad or that we shouldn’t follow them. You pointed out a couple of apps that broke the rules while the bulk of the system follows them – vs. Linux, whose concept of interface consistency is that there is some way to close the window that you opened (most of the time).
“You pointed out a couple of apps that broke the rules while the bulk of the system follows them – vs. Linux, whose concept of interface consistency is that there is some way to close the window that you opened (most of the time).”
Linux apps are increasingly consistent, while Windows apps are increasingly *less* consistent.
Not that it really matters. Consistency is overrated.
If you’ve used iTunes before, you instantly understand how to interact with the next version, despite the visual differences.
This proves the quality of the interface: consistency is more than skin deep and actually has little to do with how an application is skinned.
I agree – if you look at the GNOME applications, for example, the GUI consistency is amazing, compared to the Windows world, where there is a mishmash of hundreds of different types of UIs – you have Windows Media Player, Office 2007, etc.
The worst part is when you hear their excuses; I can put up with the differences, but not with the half-assed excuses – “oh, it’s branding!”, “we’re trying to make our applications stand out” – that Microsoft and Apple developers use.
Sorry, I don’t *need* that, give me a media player that is made up using standard widgets, I don’t care about skinning, I don’t care about ‘standing out’ – give me a good quality application, and I’ll use it.
What happened to the good old days when companies used to compete based on making the ‘better application’ rather than creating crappy obscure interfaces which are pig ugly and do nothing to improve the usability of their application.
‘What happened to the good old days…’
– You yourself dumped them when you stopped using the command line and started using Gnome. Of course, you seem not to have noticed when it’s your own ‘religion’.
‘creating crappy obscure interfaces which are pig ugly and do nothing to improve the usability of their application.’
– Funny, reading the comments that OsNews has from time to time, one could easily see this comment being made in regard of an article about Gnome.
I was thinking the same. No UI is more INconsistent than Windows. Every app has its own flashy GUI, looking entirely different from the rest of the system – and even Microsoft’s own apps don’t follow the same guidelines. When I compare this to my Gnome desktop, where EVERYTHING except Skype looks the same and behaves the same – well, OK, on Linux, if you really want consistency you should stick with either Gnome or KDE, since mixing apps breaks consistency, though it’s still not as bad as on Windows.
OpenLOOK and Motif???
People must develop them further and stop moaning because both provide an excellent foundation.
And please stop that “hey man… they are dead projects”.
They are not.
They really need to die….fast.
Why on earth does Motif have to die?
It is quite old but still offers a fairly nice look and feel (if you know how to set it up properly). And it isn’t as resource hungry as some of the more “modern” solutions…
It’s old. It lacks a large number of widgets that developers readily use today (can you say tree view). It’s not pretty. It has no community. People aren’t developing new applications in it. We have better toolkits, probably in every regard (I bet there’s even a toolkit that uses less resources). It’s not going to get updated for modern hardware.
Enough reasons for why Motif is dying? It’s not that it has to, it’s just that it is.
I agree that Motif is clearly an old and somewhat “legacy” toolkit. It, however, does have some good points still…
0. Surely age does not imply it needs to die? Vi and Emacs are quite old, you know…
1. Stability. Stability. Stability. Stability. (or stagnation, if you will…) This can also be a benefit. Most Motif apps Just Work(tm) when I compile them from source. Have you ever tried to compile e.g. the latest version of Inkscape on a RHEL4 machine? You also have to compile glib, cairo, pango, atk, gtk and gtkmm.
2. Not enough widgets? That one may be true. However, for simple apps you don’t need so many advanced widgets…
3. The non-existence of a community is a real problem. I agree.
4. Good looks are in the eye of the beholder… And by tweaking your .Xdefaults you can configure Motif to look tolerable…
*enableEtchedinMenu: True
*enableThinThickness: True
*enableToggleVisual: True
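(An aside, in case anyone wants to try those settings: assuming a stock xrdb setup, you can merge them into the running X resource database without restarting X:)

xrdb -merge ~/.Xdefaults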
So I basically agree that Motif is dying, but I don’t agree that it should. Maybe Gtk and Qt are not ready to replace it quite yet… What Linux needs right now is a stable API (and ABI) for GUI programming and programmers who (instead of running after the latest and greatest libs) use the stable API. This would simplify system administration, software installation and software packaging tasks enormously.
And as for this “application files should be in one place” thing, I have one question: Why?
In Linux, the package manager will look after them for you. I can’t remember the last time I went trawling through /usr looking for some obscure app file – why would I need to?
In Windows, you have to be able to get at them, because Windows totally fails at providing any sort of install/uninstall framework, so stuff gets left all through your system that you have to clean up yourself. Or something decides to save its data under its Program Files folder, so you need to find it to back it up.
This also leaves apps to decide where they put things like executables. So when you get the “Open With” dialog, and the app you want isn’t on the list, you have to go searching through Program Files. Wouldn’t be so bad, but last time I did this I was trying to open a VS7 project in VS8. Where does VS8 keep its main executable? Obviously it’s under its Program Files folder _somewhere_, but nowhere totally obvious. In the end I abandoned it and opened the file from inside VS, but there was still this niggling doubt saying “on my linux box I wouldn’t have had to know, it would have been in the path. easy.”
And if your app of choice is not available in your distribution’s native format?
“And if your app of choice is not available in your distribution’s native format?”
You could still use the classical way (./configure && make install); the Makefile will tell you where files have been installed to (see target “install”). It would work nearly everywhere.
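A minimal sketch of that classic flow, assuming a typical autotools tarball (the name foo-1.0 is just an example); “make -n install” is a dry run that prints the destinations without copying anything:

tar xzf foo-1.0.tar.gz
cd foo-1.0
./configure --prefix=/usr/local   # choose where everything will land
make
make -n install                   # shows which files go where, changes nothing
make install                      # the real thing, usually needs root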
That defeats the point of this article. You have to manually get the libraries/dependencies/development files, and that’s not easy to use, and a reason it lags behind.
“That defeats the point of this article. You have to manually get the libraries/dependencies/development files, and that’s not easy to use, and a reason it lags behind.”
You’re right, of course. I didn’t talk about how complicated it is and / or might get, I did talk about it being possible nearly everywhere. It works fine for small programs, but if you get into “dependency hell”, you’re nearly lost. So you will have to be educated enough to know how to solve these problems and how to read a Makefile. So this solution is mostly designed for experienced users who don’t mind getting their hands dirty. It’s no solution for Joe Q. Sixpack and Jane Foobar. 🙂
And how is that different from OS X or Windows or any other OS? If I want to run an open source app on Windows and no one has made an installer, I have to compile from source; same with OS X and every other OS.
At least Linux tries to make this annoying experience marginally less painful.
Or I just find a binary somewhere, unzip it somewhere, ctrl-shift-drag a shortcut onto my Start menu, and be done with it.
I find that usually SOMEBODY will be building Windows binaries for things, even if the project only supplies Windows source.
That part is automatically done by the package manager. Besides, the one-place thing is completely crap – how can it manage dependencies? Every app on Windows just ends up having a copy of all the libs it uses.
Internal file structures for apps should be hidden from the user and managed by the system (and not the apps/setup). No user should copy/move/remove apps by himself, or even do any stupid click/interaction during installation/uninstallation.
checkinstall
http://asic-linux.com.mx/~izto/checkinstall/
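If I understand it right, it simply takes the place of the “make install” step and builds a package your package manager can track and cleanly remove (the package type depends on your distro):

./configure && make
checkinstall    # run as root; produces and installs a .deb/.rpm/.tgz instead of loose files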
The standard way to install files that are not part of your distro is in /opt, and under there, everything should be basically organized like c:\Program Files\ in Windows.
“And if your app of choice is not available in your distribution’s native format?”
Then, for the novice user, it simply doesn’t exist, and they’ve probably never heard of it anyway. On Windows, if there’s an application that doesn’t have an installer, then it’s pretty inaccessible to all but the most experienced users. On Linux, if an application doesn’t have a package for your distro, then there are usually instructions that allow moderately experienced users to install it manually.
More pertinent to the topic of TFA is that Linux systems have unified interfaces for installing software. The process of installing and updating software is the same no matter what package you’re installing. While Windows installers are undoubtedly simple to use, there are as many flavors of installers as there are major Linux distros. The uninstallers often don’t work correctly, and there is no simple way of receiving automatic updates to application software.
The issue of package availability gets raised here often (frequently by you, Thom), and I think the fundamental response is that this isn’t a technical issue, but rather one of economics. It’s far, far easier to package software for a Linux distribution than it is to package software for Windows, and the installation method is arguably far, far more convenient. From a technical standpoint, the Linux distributions are beating the pants off of Windows in software packaging, more or less out of necessity. The amount of software available for Linux distributions is staggering given the market share story. For a niche market, the Linux desktop early-adopter crowd has been remarkably well-served by the Linux distribution projects. If the burden was on the upstream developer to ensure platform binary compatibility and desktop integration, then we wouldn’t be where we are today.
“Then, for the novice user, it simply doesn’t exist, and they’ve probably never heard of it anyway.”
There’s no such thing as a novice user. Users don’t simply fall into either the category of “experts”, who know how to fiddle with the system to get something not packaged running, or the category of “novices”, who are too afraid to fiddle with the system and hence won’t install an app which is not pre-packaged. Even “experts” can hose their system.
The price of the package management convenience is a centralized system and dependency on it (sometimes I’m really astonished that Linux users who preach freedom as important went that way).
IMHO, package management was only the second-best answer to the problem that arose when *nixoid systems came to the desktop. Instead of having a well-trained full-time sysadmin who cares for installation of new software and is able to keep the system clean, you now have individuals who use *nixoid systems at home for various things and who are users, not admins, in the first place. For them, the traditional *nixoid FS layout is probably not the right thing, and software that splatters files around the FS is a problem. Hence you either have to make things easier for desktop *nixoid users (e. g. change the FS layout and the way software gets installed, like OSX does) or you need a helper tool that hides the complexity of software de-/installation and acts as an abstraction layer to the FS.
My main concern about package managers is that they break the golden KISS rule. Instead of making the system (FS layout, installation) easier, you’re now dependent on an abstraction. Abstractions tend to break at some point. Package management software becomes a single point of failure which can destroy system integrity or functionality because of bugs (every software has bugs) or “expert” users breaking basic assumptions built into the package management software (every software has built-in assumptions about its environment, which – if they are not valid anymore – can lead to all sorts of unpredictable side effects).
Then you should go with FreeBSD; you will like the roughly 17,000 ports, ready to install and with dependencies solved.
Apart from that, look at Mac OS X; it hasn’t got the vast variety of applications that the Windows world has. Are Mac OS users doomed now?
>Here we are exposing one of the fundamental problems of Linux systems. It’s all about ego.
Isn’t this a very bewildering thing to say coming from a Mac user? omg
Apple sells dreams, dreams with no connection to reality – you have to have faith, faith in Jobs. So if you’re a professional you have to go with Windows or real Unices.
Screwdrivers vs. Couture
This is the real problem; the latter is Apple.
The problem is that the package managers DO NOT always look after them for you. Things break, and the graphical tools to manage them don’t always do the right thing. And what about apps that you can’t get via apt-get or yum, which you have to install manually? What then? Besides, the thrust is about keeping application files and config data organized better. Linux has some standard on where to keep the main binary and where to keep the main config file. But as for other application data, that is done haphazardly. And of course, the config file thing is a MUCH bigger hassle than where the binaries are kept, because I have to agree that the package managers at least USUALLY handle the app binaries right.
I’ve never ever had a problem with applications not finding their associated data, which is usually installed in /usr/share or /usr/lib. The only problems I ever have with package managers are with superficial library conflicts. For example, an application is looking for libexif.so.10, but I have libexif.so.12. So I do ln -s libexif.so.12 libexif.so.10. Problem solved. I’ve thought about writing an enhancement to the Linux loader that attempts to make symlinks like this if an exec() fails on a missing shared library object. When I get the time, I might investigate this further.
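Spelled out, the workaround is just this (library names as in the example above; as the reply below notes, it only works if the ABI didn’t really change between those versions):

cd /usr/lib
ln -s libexif.so.12 libexif.so.10   # as root
ldconfig                            # refresh the dynamic linker cache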
“I’ve never ever had a problem with applications not finding their associated data, which is usually installed in /usr/share or /usr/lib. The only problems I ever have with package managers are with superficial library conflicts. For example, an application is looking for libexif.so.10, but I have libexif.so.12. So I do ln -s libexif.so.12 libexif.so.10. Problem solved. I’ve thought about writing an enhancement to the Linux loader that attempts to make symlinks like this if an exec() fails on a missing shared library object. When I get the time, I might investigate this further.”
Umh, haven’t you ever thought about why that version is actually there? Might it have something to do with ABI?
> And as for this “application files should be in one place” thing, I have one question: Why?
> In Linux, the package manager will look after them for you.
I think package managers exist mainly because of this limitation, not as a feature. As an application developer, do you know how difficult it is to get your application packaged for every distro and OS? People and newcomers are encouraged so much to use packages, and then we developers have no control over which distros will provide packages and need to rely on the help of others. It is also impossible to release beta software for testing, because a) if it isn’t finished yet, it won’t be packaged, and b) if it’s a beta/unstable version, the package will override the stable version (it’s extremely difficult to install both an unstable version with more features and a stable but older version of the software at the same time).
So come on, packages are not really better than the Windows/OSX/BeOS approach.
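(The best workaround I know for the parallel-versions problem is a private prefix – a sketch, assuming an autotools-based project; the name myapp is hypothetical – but that’s exactly the kind of thing you can’t ask testers to do:)

./configure --prefix=$HOME/opt/myapp-beta   # keep the beta in its own tree
make && make install                        # no root needed, nothing outside the prefix is touched
$HOME/opt/myapp-beta/bin/myapp              # run the beta explicitly; the packaged stable version stays on $PATH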
klik
http://klik.atekon.de/
It might not be the be-all-end-all solution to Linux package management woes, but it’s perfect for releasing beta versions and brand new stuff.
Is this an excuse to scatter files over /usr, /usr/local, /usr/local/etc, /opt, /var/opt, /etc directories?
“Is this an excuse to scatter files over /usr, /usr/local, /usr/local/etc, /opt, /var/opt, /etc directories?”
To clarify, just read
% man hier
Installed applications that do not belong to the OS itself are installed in /usr/local. The directory /usr/X11R6 should be obsoleted; I heard a rumor about this dir getting obsoleted soon. Linux additionally had /opt, nearly the same, but I think it’s obsoleted, too.
/etc does hold configuration files for the system, /usr/local/etc does hold configuration files for installed programs. So you can easily see: While /usr/local/-foo- is -foo- for installed applications, -foo- outside /usr/local has the same purpose, but just for the system itself.
It’s not as confusing as it might look to you.
I’ve been using computers since I was 5… in 1978. I design chips for a living. I do AI research as a Ph.D. student. The way Linux does it STILL confuses me.
And it’s all because Linux requires knowledge in the head instead of putting knowledge into the world.
“And it’s all because Linux requires knowledge in the head instead of putting knowledge into the world.”
I’m not sure I do understand, would you try to explain it a bit?
Knowledge in the head is what you think it is: Something you have to learn.
Knowledge in the world involves using natural cues or creating them so that a person doesn’t have to learn something in order to use a thing.
For instance, you can build a door that is symmetrical on the left and right, so you can’t tell if you want to pull the handle on the left or the right, or push or pull to open it. Architects do this all of the time for aesthetic reasons. In order to learn to use this door, you use trial and error, and then eventually, after using it many times, you MIGHT remember how to use it. Knowledge in the head.
Conversely, you could form the door handle in such a way that it is plainly obvious whether you should push or pull and on which side of the door. That puts knowledge in the world so that the user doesn’t have to learn anything.
First, thank you for your quick reply.
“Knowledge in the head is what you think it is: Something you have to learn.
Knowledge in the world involves using natural cues or creating them so that a person doesn’t have to learn something in order to use a thing.”
Ah, I think I see now. You’re referring to formal education and factual (explicit) knowledge versus intuitive (implicit) knowledge. The latter has to be learned, too, but it works implicitly. The learning content is transmitted by social contacts, society and cultural traditions and contexts. Natural cues (understanding spoken or written language, producing it, recognizing its content and its meaning in context, reading figures, symbols etc.) have to be adopted, to be learned, too, but as I tried to explain, this works implicitly, not like the frontal education as it is in schools.
“For instance, you can build a door that is symmetrical on the left and right, so you can’t tell if you want to pull the handle on the left or the right, or push or pull to open it. Architects do this all of the time for aesthetic reasons. In order to learn to use this door, you use trial and error, and then eventually, after using it many times, you MIGHT remember how to use it.”
Trial & error is not a programming concept. 🙂
Most OSes (or to be more concrete, GUIs) are able to be handled by trial & error. In most cases, the error does not affect the system. “I just deleted myself, how do I undo this?” 🙂
While trial & error does not harm you (in case of error) in most mechanical and computer contexts, in other contexts an error might cost your life.
“Conversely, you could form the door handle in such a way that it is plainly obvious whether you should push or pull and on which side of the door. That puts knowledge in the world so that the user doesn’t have to learn anything.”
In regard to GUIs, this would imply making things obvious. Because GUIs use graphical metaphors and symbols, they have to be interpreted as a kind of pictorial language. I may refer to the things I introduced first: if the GUI design is good, implicit knowledge will be enough. Exempli gratia, you know how a knob, a dial, a lever or a digital display looks and how to use it; you’ve learned it in your everyday life and see it for the first time on your computer monitor. You don’t need to read a manual that explains strangely designed triggers, “rotate the green triangles to adjust the volume of your speakers” or “pull the tail of the dancing elephant to shut down your computer”. 🙂
The trick is: GUI designers have to use pictorial elements most users are familiar with, founded in their individual (but common) experiences. They have to make them look familiar and work familiar.
As an aside, does anyone else really, really, really hate it when building designers put PULL handles on PUSH doors, and PUSH handles on PULL doors?
Would it kill them to put the right handle on the door for the direction of movement? It would certainly save them a lot of repair time (people walking into the door), cleaning time (for all the face/arm smudges and toe prints), and money (for all the unneeded “pull”/”push” signs).
Something so simple to fix … and yet it is so prevalent.
Have you read a book on usability called The Psychology of Everyday Things? There’s a very interesting section in it where the author analyzes various designs of doors from the standpoint of how immediately-obvious their use is.
Yeah. For a while, I listed that book as my favorite in my email signature. The latest edition is titled “The Design of Everyday Things”, because the old title got it stuck in the wrong kind of section in book stores. So, I guess I should have pointed out that I got the idea from Norman, although in the original article, I do mention Norman.
Ah, now I must go RTFA. There seem to be lots of comment posts on here about usability – it’s nice to see someone who’s actually read academic material on it for a change.
Hear hear…
>> Conversely, you could form the door handle in such
>> a way that it is plainly obvious whether you should
>> push or pull and on which side of the door. That
>> puts knowledge in the world so that the user
>> doesn’t have to learn anything.
I love this analogy, I really do. It reminds me of the high school my hometown built back in 1987. They had the same style door handles on both sides of the main entrance – and hilarity ensues… ESPECIALLY since most of the outer doors and all the inner doors opened INWARDS. Of course, being the Commiewealth of Taxachusetts they just paid off the fire inspectors.
In general, the comparison of design vs. engineering is quite apt… goes back to something I said in another topic about form over function – something we’ve been seeing WAY too much of in recent projects.
Again, back to that school. They had this grandiose entranceway that an architect pulled out all the stops for. It would have been a dark little corridor because the whole thing was basically poured concrete, but they put 1.5″ hex-shaped holes in the ceiling 1.5″ apart through to the 2nd floor to let light in from above (both from the electric lights and the glass panes in the roof). On the 2nd floor that same hallway with the holes in the floor was the entrance to the library.
… and hilarity ensues. Try going across that on a cane or on crutches, or in winter (we’re talking New England here) having all the crap on people’s shoes fall on the people down below. So they plugged them with glass – which is when girls started complaining about the fact guys were staring up their skirts, so they made the glass concave on the bottom to make it hard to see through, which made them act like magnifying glasses at mid-day, heating that hallway to in excess of 110 degrees, so they went over them with paint on the bottom that flaked off in the heat and got on everybody, so they covered them on the underside with a suspended rubber mat – resulting in a hallway so dark it was unnavigable and everyone ended up using the back entrance – since with concrete ceiling and concrete walls, you couldn’t just run electric lights in there after it was built!
You with me?
We see these types of decisions in computer design, both hardware and software, all the time. Let’s go down the list of some real winners:
Hiding file extensions – Microsoft thought that file extensions were too difficult for users, again underestimating the intelligence of their user base. The result? AnnaKournikova.jpg.vbs – you don’t see the .vbs, well… bad things could happen. I WANT TO KNOW if a file is a .png, .jpg, .gif, NOT that it’s a ‘photoshop image’. I WANT TO KNOW if a file is a MSI, BAT, EXE, COM. THANKFULLY, this being Windows, you can turn that little feature ‘off’.
No tracking of a file type that is obvious to a user OR programs – TRADITIONAL *nix is a real winner for this one. You can have data files in fifty different formats, and have no clue which are which from running ls. BRILLIANT. Thank goodness that CP/M-style file extensions and/or MIME types (or other equivalents) actually caught on (even if CP/M itself went the way of the dodo).
Spatial navigation with just the name of the local directory in the title bar… WHAT THE HELL? Am I in /boot/home/config/lib or /boot/beos/config/lib or /var/lib or E:/shareaza/downloads/sexychick/lib… In many spatial navigation systems, like say… BeOS, there are NO visual clues to that – so you better be REALLY paying attention to the five windows you had to open (and later have to close) just to get to that point… and even windows defaults to that behavior for the title bar (and corresponding taskbar entry) though at least it provides the address bar (and lets you turn that behavior off) (usually followed by my screaming at the display ‘JUST SHOW ME A ****** TREE!!!’)
You look at these things and go “Who the hell thought this was a good idea?”
But these are NOTHING compared to the current crop, as in most cases it was an attempt to make a functionality change, NOT goofy eye candy at the COST of functionality.
Which is where we are with a LOT of software today. Menu and window transition effects that take time for no increase in functionality (me, I click on something, I like it to just happen, not have to sit there watching some STUPID animation)… transparencies that make the text in title bars and even the windows themselves nigh impossible to read… graphical task switching which is next to useless when you’ve got four copies of gedit and three terminal sessions running… Shall I go on?
We end up with piles upon piles of code to do all sorts of goofy skinning and other eye candy bullshit when FreeType, OpenOffice and everything else font related STILL kerns text like a sweetly retarded epileptic crack addict. You want to know why OOo doesn’t catch on? Because half the people I’ve tried it on go “Why does all the text look like crap?” I t’s a spacin g i ssu e.
Or as a friend put it after seeing how OOo handled spacing of characters (aka kerning): “Yo, it be spacin’ G!”
I’m getting sick of seeing the focus on this stupid graphical bullshit instead of hordes of REAL underlying issues in operating systems and applications. Let’s completely redesign the UI when we cannot even secure the underlying OS. Let’s load it down with eye candy when half the stuff we expect a normal user to load from our package manager doesn’t even show up on their program menu…
ENOUGH ALREADY!!!
>>For instance, you can build a door that is symmetrical on the left and right, so you can’t tell if you want to pull the handle on the left or the right, or push or pull to open it.
Funny, our university library doors are just like that. By looking at them, no one knows from which side (left or right) you have to push/pull. I saw lots of perplexed faces. Eventually, over time, my brain registered from which side I have to push/pull the door.
Now the things you said in your article make sense. I got a MacBook a few months back. It took me just a couple of days to figure out how to get stuff done on it. But it took me years on Linux. For each new app, there is something new to be learnt. Just as you said in your article, when I started using Gentoo, initially it was fun trying to edit config files. But with each upgrade, I had to edit them to get GNOME running. Even the primary OS on my desktop, OpenSuse, gives me nightmares sometimes. Maybe it’s OK for software developers to spend so much time learning those things. But as a student working towards a PhD in physics, it’s a waste of time. When I used to discuss the issues I had on Linux with my CS friends, they used to ask me, “What are you working on, Linux or Physics?” Then I felt it was a waste of time. To give an example of consistency on Mac OS, I blindly type cmd+, to go to the preferences pane of any app.
“I’ve been using computers since I was 5… in 1978. I design chips for a living. I do AI research as a Ph.D. student. The way Linux does it STILL confuses me.”
You know, I do believe a person who designs chips and does AI research as a Ph.D. student does not necessarily have to be very clever.
Well, I also have been using computers since 1978 (though I was 8 at the time)…and I don’t understand how you can find the Linux way confusing.
In any case, if you stick to using apps in the repository, you’ll get them in your menu. Ubuntu (to name one) has a *huge* number of apps in its repositories.
If you compile apps that follow the freedesktop.org standards (and most recent apps do) you’ll also get menu items.
I think some of your criticism is valid, however I also think that some is outdated and/or exaggerated. There *are* efforts to standardize. There *is* communication between developers. Usability standards *are* being used by Desktop Environments. As for consistency, as others have pointed out it does not really exist in the Windows and Mac worlds either (Windows being the worst offender). KDE apps, for example, are *very* consistent. Again, freedesktop.org is making both main DEs converge.
The biggest reason Linux – and Mac OSX – have similarly low market shares is inertia and games. People prefer the devil they know to the devil they don’t, and games have always been one of the driving factors of home computing.
“I’ve been using computers since I was 5… in 1978. I design chips for a living. I do AI research as a Ph.D. student. The way Linux does it STILL confuses me.
And it’s all because Linux requires knowledge in the head instead of putting knowledge into the world.”
Your argument is lame. I couldn’t design a chip or have anything to do with AI research even if you held a gun to my head, yet Linux hasn’t been a problem for me. It was only a matter of months after I first installed it that I was more adept with Linux than most Windows users are with Windows.
In Linux, many programs put their configuration in /etc. Some programs also install themselves into /usr. And they introduce /opt, which is not mentioned in “> man hier”. Ah, sorry, typing that command in a FreeBSD box.
“In Linux, many programs put their configuration in /etc. Some programs also install themselves into /usr. And they introduce /opt, which is not mentioned in “> man hier”.”
Yes, I know this effect as being a Linux problem. /etc is for system belongings exclusively. The same goes for /usr, except /usr/local (and /usr/X11R6). Local (additionally installed) applications have to use /usr/local/etc. /usr/X11R6/etc will be obsoleted, I hope. This is strict in FreeBSD, but in Linux there are many differences among the distributions. I do not exactly know which special purpose /opt serves… But that’s mostly what $PATH is for. 🙂
“Ah, sorry, typing that command in a FreeBSD box.”
Just type “man hier” in a FreeBSD box or have a look at the http://www.freebsd.org/cgi/man.cgi?query=hier&sektion=7 manpage.
I mean, if I type in FreeBSD
> man hier
(tcsh prompt is “>”)
there is no /opt mentioned there.
Yes, $PATH is useful for executing programs. But if I want to modify the application configuration, say Apache’s, is httpd.conf in /etc, /usr/local/etc, or /opt/etc? In FreeBSD it is always /usr/local/etc. Use locate or find? So back to my original question, is this an excuse to scatter files over /usr, /usr/local, /opt, /var/opt, /etc directories?
“I mean, if I type in FreeBSD
> man hier
(tcsh prompt is “>”)
there is no /opt mentioned there.”
Ah, I see. Yes, of course there’s no /opt. Software installed via ports or packages resides in /usr/local (and /usr/X11R6); the same goes for software that is not included in ports or packages and that the system administrator compiles and installs by himself. There is simply no need for /opt. But I’m sure you could introduce it, just by adding its bin/ subdirectory (or subdirectories) to $PATH.
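For instance (assuming /opt/bin exists):

# sh-style shells, e.g. in ~/.profile:
PATH=$PATH:/opt/bin; export PATH
# csh/tcsh, e.g. in ~/.cshrc:
set path = ($path /opt/bin)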
The standard csh prompt is %; /etc/csh.cshrc contains set promptchars = “%#” and set prompt = “%n@%m:%~%# “. 🙂
“Yes, $PATH is useful for executing programs.”
And it can be dangerous, because someone could place malicious programs in it; that’s why, for example, “.” is not part of $PATH.
“But if I want to modify the application configuration, say Apache’s, is httpd.conf in /etc, /usr/local/etc, or /opt/etc? In FreeBSD it is always /usr/local/etc.”
This is a problem typical to Linux, but I think (or at least hope) they will improve this situation so Linux becomes more appealing to server administrators who don’t want to search their configuration files.
“Use locate or find?”
No, first read the documentation of Apache or its modules and additions; they will clarify where the respective configuration files are located.
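Apache can also tell you itself where it was compiled to look, assuming the httpd binary is on your $PATH (on some Linux systems it is called apache2 instead):

httpd -V | grep -i config
# prints HTTPD_ROOT and SERVER_CONFIG_FILE; together they give the full path to httpd.conf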
“So back to my original question, is this an excuse to scatter files over /usr, /usr/local, /opt, /var/opt, /etc directories?”
To repeat it: nearly each of these directories serves a special purpose in separating applications from data. This is intended, it is useful and it is secure.
While in FreeBSD this concept is a strict ruleset, it isn’t in Linux. So it’s not a general problem.
What you’re searching for is the PBI package system introduced by PC-BSD. Here, all applications are located in /programs, sorted by name, each in its own subtree. These subtrees do not only include all the application parts themselves (include files, libraries, shared data); they also include all the dependencies an application has. So you have an application completely held in a subtree, which you can copy or erase as a whole. You “pay” for this comfort with higher HDD space consumption, but that should not be a problem today.
Remember, an application is not a single EXE file anymore. It consists of several parts that are different by nature. The OS tree layout has rules where these parts have to be located at. Would you stuff all your newspapers, books, sheets and everything that has the habit of being made of paper in a big paper box altogether? Surely you wouldn’t. 🙂
At last, the user does not have to know anything about this. Application managers do it for him. By definition, there’s no “scatter” – but as you pointed out, there is, in fact, sometimes.
That’s true on the BSDs, but not true on Linux systems. There is no separation of “system” from “user apps” in Linux. *Everything* lives under /usr/[sbin|bin|lib|…], and the only stuff under /[sbin|bin|lib|…] is what you need to recover a barely bootable system. There is no “base OS” vs “user installed apps” distinction in the land of Linux. It’s just a giant conglomeration of packages.
A good sysadmin will realise this and use /usr/local to install their manually compiled apps, in order to keep it separate from the rest of the distro.
And the File System Hierarchy (at least last time I looked at it) is so full of optionals and “this or that directory” that every distro out there can claim to be FHS-compliant.
The BSDs got it right: a clear separation of “base OS” from “user apps managed by the OS” from “user compiled apps”. And a strict filesystem hierarchy to match.
“Is this an excuse to scatter files over /usr, /usr/local, /usr/local/etc, /opt, /var/opt, /etc directories?”
My system does not have a /usr/local/etc/ or /var/opt/ directory. Additionally, my /usr/local/ is for distro-specific stuff, and there are no libraries in /usr/local/lib/. When I install something it generally goes into /usr/, unless it is a binary package, in which case it gets installed into /opt/. My configuration files are stored in /etc/. That’s not so hard or complicated now, is it?
I think the article mentions quite a few times to drop your ego. Why not try to see what benefit linux can get from adopting other approaches than simply saying “well windows screws up this way or that way…” If anything, the article is talking about Apple’s success.
No, Windows doesn’t handle install/uninstall well. However, the basic idea of a central location for an application is a good intuitive one. That’s where MS has been heading…but of course they’re held back by compatibility issues. The same goes for program settings. Apple learned from Microsoft and Linux and took the most reasonable path. Standard, but separate files.
“In Linux, the package manager will look after them for you.” Yes, and in Windows the registry will look after all your settings <sarcasm>. Sometimes things don’t work. Sometimes you install stuff outside the package manager. Sometimes I just like to explore settings and replace files. Developers like intuition too.
Intuitive UIs are absolutely needed. It’s not so much about consistency as it is about intuitiveness. For example, I recently installed Ubuntu on my laptop. I see the ‘start menu’ and I start to explore all the programs. Now I want to edit this menu. I right click an item to remove it… Nothing happens. WTH? Why is this menu not responding to my right clicks? Oh, I have to go to a special program to edit the menu. Needless to say, Kubuntu took over quite shortly.
Yes, the Windows apps are rarely consistent, but they are largely intuitive. Consider MSN Messenger. Everything is fairly intuitive, with the exception of changing your display name (why can’t you click and edit it as you can with your personal message text?).
> In Linux, the package manager will look after them for you.
In most cases, you do not even need a package manager in Mac OS X (or earlier versions of Mac OS for that matter).
Now when you create a program, you are doing so to solve a problem. That problem may include anything from “how can I entertain people,” to “how do I make system management easier.” The fact that most Linux applications need an installer while most Macintosh applications don’t need an installer implies that software installation is a problem on Linux, while it is not a problem on the Macintosh.
“The fact that most Linux applications need an installer while most Macintosh applications don’t need an installer implies that software installation is a problem on Linux”
Huh…most Linux apps do *not* require an installer, they are handled by the package manager.
If you want to use the OSX method on Linux, you can, it’s called Klik.
I would say there is potential in applying uniformity beyond applications. Have you ever tried to compile something, and configure tells you that you’re missing a library or a header? Maybe you ‘find -name’ your drive first, or maybe you just go to the internet right away and google it; but then it turns out the library was already there, but not within your $PATH? Or maybe you have two different versions of some software installed in different locations and configure does not recognize the newer version? Experienced users will know how to handle the problem, but newer users will probably be mystified.
On package managers – if you use the package manager for some things and compile others from source, the manager is usually not aware of your various outside installations. Potentially, we could hope, a simplified tree might make it easier for managers to identify installed software and react accordingly.
You mentioned your path… Novice users of Linux will not understand $PATH immediately, and thus theirs may not be as inclusive as your $PATH. A simplified structure could essentially end the need for a PATH variable, as there would only be one place to look normally (except where you have a specialized setup that necessitates a modified directory structure).
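For the record, the usual incantation when configure can’t see a library that really is installed goes something like this (the /usr/local paths are just examples) – exactly the kind of knowledge a novice won’t have in their head:

export PKG_CONFIG_PATH=/usr/local/lib/pkgconfig:$PKG_CONFIG_PATH   # for pkg-config based checks
export CPPFLAGS=-I/usr/local/include                               # for header checks
export LDFLAGS=-L/usr/local/lib                                    # for linker checks
./configure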
Macs are pleasant because they simplified those things. The directory structure may not be perfectly POSIX compliant or what have you, but that structure is more readily comprehended. Furthermore, simplified structure might make things easier for both user and developer when there are predictable locations for essential files.
“if you use the package manager for some things and compile others from source, the manager is usually not aware of your various outside installations.”
Checkinstall will solve that problem.
“In Windows, you have to be able to get at them, because Windows totally fails at providing any sort of install/uninstall framework, so stuff gets left all through your system that you have to clean up yourself. Or something decides to save its data under its Program Files folder, so you need to find it to back it up.”
Your comment is quite dated. Ever heard of Windows Installer?
A WI routine can be incredibly powerful, and can handle scenarios that neither Linux nor MacOS has yet! For example: Install on Demand, Advanced patch management (with uninstall support), per-user or per-machine installations, Distribution to specific users/groups over Active Directory, Reliable and complete uninstallations, Advertised shortcuts etc. etc.
The author explains the main problem of Linux IMHO: lack of guidelines. At least in the UI world.
And that is why Windows users (not developers) find it difficult to use and accept Linux (at least here in Brazil). A hundred ways to do the same thing could be a good thing for an experienced user, but it confuses the novice.
“The author explains the main problem of Linux IMHO: lack of guidelines.”
There are guidelines which are interesting for you (as a developer) if you want to create a new project and think about how you will realize it. So it’s up to you to use them; it depends on the desktop environment or toolkit you think is the best one for your solution. So your main problem would be: what is the best tool for this work? (The “one size fits all” tool does not exist.)
“And that is why Windows users (not developers) find it difficult to use and accept Linux (at least here in Brazil). A hundred ways to do the same thing could be a good thing for an experienced user, but it confuses the novice.”
So that’s why MICROS~1 invented so many “Windows” where all the things work differently… 🙂
Linux is about choice. You choose the solution to solve a task that fits your needs. You like the GUI solution? You use a Qt based package manager. You know what you’re doing and like the fast CLI? You just “apt-get install foo” and have your work done. You want to advise a friend to install something? You mail him a package and two commands he can copy and paste; you don’t have to describe pictures of what to click, and you don’t need to know what his desktop looks like.
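(For a mailed .deb, the two commands might be as little as the following – the package name foo is hypothetical; the second line pulls in any missing dependencies:)

dpkg -i foo.deb      # as root, or prefixed with sudo
apt-get -f install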
Especially in Linux (and UNIX), there are some standard ways of doing things. They are basic knowledge. GUIs usually are frontends and / or reimplementations of this basic stuff.
Yes, of course. Please read my post again: I said that hundreds of choices confuse the novice.
The author clearly states that the UNIX way is logical but arbitrary.
“Yes, of course. Please read my post again: I said that hundreds of choices confuse the novice.”
So the novice does not want / require / need choice; he needs ways of solving tasks to be predictable and strict… Is it impossible to assume (or require) the novice user to at least have a minimum of basic knowledge? I know, some don’t have it, just have a look at http://www.rinkworks.com/stupid/ – especially the obvious http://www.rinkworks.com/stupid/cs_obvious.shtml – and you’ll know what I’m talking about. I just don’t want to imagine the novice user being so stupid that he cannot see the obvious and recommended way to do something. For example, most Linuxes come with a package manager that is recommended to use and is obviously the correct one. Why bother someone with “./configure && make install” when the system suggests just to download and double-click, next, next, next, finish?
“The author clearly states that the UNIX way is logical but arbitrary.”
This is correct and I do not disagree. I want to add that Linux and UNIX do not restrict possibilities. For a novice user, this should not be a problem. For example, PC-BSD comes with a package installer (PBI), but you can use FreeBSD’s ports collection and precompiled packages as well; you don’t have to, though. My neighbor has used PC-BSD for over one year now; he does not even know about other software than PBI. So he could say: “I’m using PC-BSD, where applications are installed via PBI.” Logical and definite.
“[…] I said that hundreds of choices confuse the novice.”
In a usual Linux distribution, there is one package management program that comes with the distribution itself. It is usually recommended to use this program, so I would argue: there’s one possibility, so there are not hundreds of choices.
(Next to the preinstalled software manager, you are free to use applications compiled from source or from a package repository. But you don’t need to; you don’t even need to know about them.)
“The author clearly states that the UNIX way is logical but arbitrary.”
This can be seen as an advantage, as long as there is a recommended / preinstalled GUI wrapper for the basic operations so the novice user does not get confused. As I stated in another post, the arbitrary ways are “taken away” by the standard GUI tool designed for each operation.
Excellent clarification about the difference between the frontend and the system base.
I would say that Linuxes lag about 4-5 years behind if you think just about the GUI. Technically they lag only 0-2 years, depending on the subject.
The system base part is what needs clarification. I can’t help it, but OSX feels like UNIX with a better user interface. I do not need to do any of those hideous tasks with the command window, which are so beloved in the Linux/Unix community.
The problem is not the user base. They know their needs better than anybody. The problem is that the developers have no clue about user needs – unless they meet their own: the needs of a system developer.
I take the most common task as an example: the Applications folder in OSX. The installation program helps the user to put the program there. There is a sudo-like installation procedure, but even that is properly GUIed; everything important, but nothing more, is shown. All the modifications for the program are made in the program, not outside it. When you want to get rid of the program, you just drag it off to the bin. The GUI is transparent and logical. You see the file structure of course, which is all good but that is secondary(!) to the job at hand. But maybe all that is just too simple for smart Linux people and you have your reasons… To clarify those reasons, one could argue why this way of doing things is actually bad for the user and whether there are more controllable and user-friendlier ways to do, for example, this basic procedure: management of programs.
“I would say that Linuxes lag about 4-5 years behind if you think just about the GUI. Technically they lag only 0-2 years, depending on the subject.”
In some regards, there sure is a lagging behind OS X, in others, Linux is far ahead compared to anything else.
“The system base part is what needs clarification. I can’t help it, but OSX feels like UNIX with a better user interface.”
I could say (although it’s not very correct from a technical view): MacOS X is UNIX with a better user interface.
“I do not need to do any of those hideous tasks with the command window, which are so beloved in the Linux/Unix community.”
They are, because the CLI is fully programmable, and a GUI simply is not. For novice users, GUI solutions surely are the best ones. For professionals who know what they’re doing, especially when it comes to complex tasks, the CLI solves nearly every problem very fast, but you have to know how to handle it. This is special knowledge you cannot assume a novice user to have. Hey, a novice user doesn’t even know what pressing the “Eingabetaste” (Enter key) is! 🙂
“The problem is not the user base. They know their needs better than anybody. The problem is that the developers have no clue about user needs – unless they meet their own: the needs of a system developer.”
This is not correct. You cannot smash all developers together into a pot “system developer”; there are kernel and OS developers which I would consider to be system developers. Beside them, there are GUI creators, toolkit developers, application developers and even artists. A certain level of cooperation is needed to create a good program.
“I take the most common task as an example: the Applications folder in OSX. The installation program helps the user to put the program there. There is a sudo-like installation procedure, but even that is properly GUIed; everything important, but nothing more, is shown. All the modifications for the program are made in the program, not outside it.”
This is what PC-BSD’s PBI packages do. They come with their own “next, next, next, finish” installers and do not require a package manager (explicitly).
In Linux, there would need to be one unified and universal GUI and one package manager (implicitly), but this is not the case, because Linux users are very different in their needs, I think.
“When you want to get rid of the program, you just drag it off to the bin.”
And then, I empty the bin? 🙂
“The GUI is transparent and logical. You see the file structure of course, which is all good but that is secondary(!) to the job at hand.”
I agree. This is the concept of “optional complexity”, for instance: If I do not want (or do not need) to see details, they are not shown by default, but I may see them if I want to; “extension to the minimum” could be a good approach.
“But maybe all that is just too simple for smart Linux people and you have your reasons…”
No, this is not “too simple”. You need to specify what’s simple. One may argue: is it simple to have a running X server just to install a program? Is it simple to be in need of a mouse? On the contrary, is it complicated to enter “pkg_add -r xmms” on the keyboard to install an application and “pkg_delete -x xlockmore” to delete one? “Simple” always depends on the user itself.
BTW, I’m not a smart Linux people. 🙂
“To clarify those reasons, one could argue why this way of doing things is actually bad for the user and whether there are more controllable and user-friendlier ways to do, for example, this basic procedure: management of programs.”
I gave some examples. But finally, I agree again. Maybe there could be a package (to be installed on any distribution), let’s call it UAIUM (unified application installation and uninstallation management), which can be added to every Linux, Solaris or BSD… just an idea… 🙂
Most people don’t know the difference between a website and a program. Yet they seem to be able to navigate complicated web applications like Flickr, MySpace and PhotoBucket just fine.
Why then do “usability” experts keep championing this idea of consistency? Users are able to consistently figure out the web, as varied as it is. Doesn’t that provide a definitive counterargument to this repeated claim?
While on the surface many different web sites don’t seem to be consistent with one another, they ARE consistent in some very critical areas. First, you have the basic things like links and web forms, and the browser itself is a constant that guides the user and restricts the design of the web pages. But also, most good web sites, intentionally or not, do a very good job of “putting knowledge into the world.”
Two of the main principles espoused by Norman are consistency and knowledge in the world. If you have at least one, you’re in good shape, better if you have both.
I agree completely. Looking at the web sites I use regularly, on the whole they are quite consistent.
For example, most of my favourite online forums use very similar toolbars and tags, similar controls for replying and posting new messages, similar controls for displaying/collapsing threads, etc. When I meet one that’s inconsistent it’s significantly less pleasant to use.
Looking at the online stores that I return to regularly, they are fairly consistent when it comes to important things like searching for items and sorting results. Less consistent sites, at least the ones that aren’t highly intuitive, are less likely to see me as a return visitor.
As you point out, a lot of consistency is provided by the browser itself. Things like the handling of links, moving between form fields, selecting from lists, and the use of many other web page features depend on the browser, rather than the design of the particular site. Then of course there are the menus and keyboard shortcuts that remain consistent between pages, and to a large extent between browsers.
I wonder how the people who dismiss the importance of consistency would cope if their browser’s keyboard shortcuts were redefined and menus were reorganised?
I remember that so many people who tried Opera complained about having to press ctrl+n, rather than ctrl+t, to open a new page/tab, that the developers changed the default (despite the fact that users could redefine it themselves). It was an Opera ‘flaw’ that was mentioned in many reviews, at least those written by users of other tabbed browsers. Amazing how such a little inconsistency could be such a problem for a significant number of users.
I’ve noticed that when it comes to complex, multi-step processes, many less-technical computer users tend to just memorize the steps by rote.
It seems that the “holy grail” of computer usability is an interface where users never have to learn the specific quirks of an individual app – they just learn a set of basic principles once, and then apply them as needed. Of course, I don’t think that’s ever going to be entirely possible, since some types of complex tasks simply require complex interfaces (otherwise necessary functionality is sacrificed). But it’s still a good ideal to aim for.
If I understand correctly, PBI files (the PC-BSD package format) are more like what the author wants. Everything gets installed in one well-known directory. It is very easy to install/un-install/upgrade apps.
“If I understand correctly, PBI files (the PC-BSD package format) are more like what the author wants. Everything gets installed in one well-known directory. It is very easy to install/un-install/upgrade apps.”
Yes, that’s correct. You “pay” this comfort with a higher HDD space consumption, but HDDs are big enough today. Dependencies are included, but you don’t need to know where anything is located because symlinks are set properly.
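To sketch what such a self-contained layout might look like (the paths here are illustrative, not the exact PBI layout):

  /Programs/Firefox2.0/bin/firefox      # the app plus its bundled dependencies
  /Programs/Firefox2.0/lib/libpng.so    # private copy of a shared library
  /usr/local/bin/firefox -> /Programs/Firefox2.0/bin/firefox    # symlink so it's found in PATH

Uninstalling then amounts to deleting the one directory and its symlinks.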
“Yes, that’s correct. You “pay” this comfort with a higher HDD space consumption, but HDDs are big enough today. Dependencies are included, but you don’t need to know where anything is located because symlinks are set properly.”
I have Kubuntu and PC-BSD installed here. I also like to try out other Linux distributions. But I always seem to come back to PC-BSD because it is easy to install and to find things when I need to. And that comes back to the author’s point that Linux needs to make some changes if it wants to have a chance against Windows.
One of the concepts that most of us take as very basic is the concept of files (and file systems). My wife still has a very hard time separating files from the applications used to generate them. They are all the same to her. She also didn’t understand the concept of “saving the file”. Doesn’t the file know how to save itself?
My wife confused those concepts at first, I thought. Then I realized that she was disregarding an arbitrary distinction that we CS people are trained to make. What is the distinction between a document and an app? Only one thing: the number of levels of interpretation. An app is food for the CPU. A document is food for an app. And if you’re thinking that we edit docs more than apps, just talk to a programmer.
An app can also be seen as a “tool box” that only lets you use the tools on data you manage to squeeze into the box and it won’t let you bring your own tools.
Really, the whole concept of “applications” needs to just die.
That’s a good analogy.
But most of the time, we don’t really edit the app; we edit the doc (in the form of source code), which is then used by other apps (compilers, loaders, etc.) to do the final processing work. The app, as triggered by the user and consumed by the machine, is almost always a static thing.
Words are fun!
This type of confusion usually degenerates into explaining the heuristics of differing app behavior models instead of what the programs are actually doing to which files. I mean, MDI with a parent window is easy enough to explain, but even I don’t know what to expect when I close an MS Word window that tries to make you believe it’s going to act like a separate instance.
“””
My wife still has a very hard time of separating files from the applications used to generate them. They are all the same to her.
“””
And you’ve been unable to get an RMA? 😉
As someone said, if you install with a package manager it is not a problem. However, there are some 3rd-party applications that are nice to have, e.g. Eagle for electronics design.
Even if it is not a commercial application and I have the source and do configure, make, etc., this won’t put the application in my GNOME or KDE menu.
If you are the average OSNews reader, you should be able to keep track of your applications. But if you are the average computer user, it won’t happen.
Another thing is that this is a problem for companies that produce Linux applications. If Adobe were to port Photoshop, which distribution should they port to? They can’t spend the resources to make it work on all of them, and if average Joe bought Photoshop for Linux and could not find it after installation, he would be pretty mad.
Even if I like free software, there will always be a need for some commercial applications if we want Linux to be used by mainstream people.
Interesting thoughts, yet I kept wondering if there was a bit too much ‘comparing apples to oranges’ feeling in the article.
The Linux world and MS Windows (or OS X) follow somewhat different philosophies. In ESR’s cathedral-and-bazaar terminology, the Linux world follows the bazaar development model, while MS Windows and Mac OS X follow the centralized cathedral development model. Both have their pros and cons. With a centralized model it may often be possible to get more consistent results, but the user is also always more restricted by the choices made by those above, whether he agrees or not. In the bazaar model the user, a Linux distribution, a desktop environment project, etc. are free to choose their own way. Although on the surface the bazaar may look more chaotic, nothing prevents projects like GNOME from providing users a very consistent user interface throughout all the applications.
When talking about the amount of consistency in MS Windows or Linux or Mac OS X, what are we talking about exactly? The OS GUI? MS Windows is a graphical OS with a few extra apps like a web browser and such. If you want to compare it to something similar in the Linux world, take GNOME, for example (not something too broad like Linux). I would say that GNOME has as consistent a GUI as, if not more consistent than, say, Windows XP.
There are tons of various sorts of commercial and non-commercial programs available for both Linux and Windows, and the GUI guidelines vary a great deal, of course, because it is all up to the individual projects, developers, various user needs etc. If a user, a Linux distributor or a system administrator wants to have a consistent GUI, he can have it, be the OS Linux or Windows or MacOS X.
As to file system hierarchies, I think that a system like the one adopted by Gobo Linux is very interesting. Some people may like the old Unix system better, however. Again, the bazaar development model in the Linux world allows all those different models to coexist, and a user or a distribution can choose what seems best for them.
Alternatives are not bad. The Linux world gives users/distributors/developers the privilege of not being restricted by a single OS “dictator”. In the end, may the user have the choice of how and what tools to use.
True to a certain extent, but if you want a relative level of consistency equal to Windows/Mac OS X on your Linux installation, you have to heavily restrict the applications you use.
Personally I find it very difficult to find Linux applications that’ll do everything I want, without mixing and matching applications with highly inconsistent user interfaces.
Of course the consistency of applications on Windows and Mac OS X is far from perfect. Microsoft in particular sometimes completely ignores important parts of their own UI guidelines, often for no reason that I can determine. However, those flaws don’t mean that Linux consistency is automatically on a par with those two operating systems.
Linux is certainly getting better. It’s not long ago that copy and paste of data between apps was very unreliable, if not restricted to plain text, while now it generally works reasonably consistently. Unfortunately, when running a pretty typical mix of different apps, covering most common desktop computer tasks, you’ll still probably end up with something much less consistent than a typical Windows or Mac OS desktop. Maybe not when it comes to visual consistency, but certainly when it comes to consistency of functionality.
I don’t think that consistency is the only thing that’s important. A highly intuitive application can get away with some inconsistency, and sometimes it’s necessary to ignore guidelines; especially when implementing features that weren’t envisioned when those guidelines were drawn up. However, that doesn’t mean that consistency isn’t important, or that it can’t make one application more pleasant and productive to use than another.
In my opinion, greater consistency is definitely something that Linux developers should be striving for, at least if they want Linux to succeed on the desktop. I don’t think freedom necessarily has to be sacrificed for Linux to offer that consistency.
“In my opinion, greater consistency is definitely something that Linux developers should be striving for, at least if they want Linux to succeed on the desktop.”
Agreed very much. But Linux is just too broad a word to use in this context. Linux is a kernel? At least not a GUI? An OS with various desktop environments and GUI choices?
I can understand that sometimes the development of Linux as a desktop OS can seem disturbingly slow when you compare it to something like Mac OS X.
Let’s admit it: easy-to-use desktop Linux distributions meant for average Joe / your next-door neighbour are still quite a young and new thing. For many years, and not so long ago, Linux was mostly a geek OS for IT-competent people who liked to get their hands dirty configuring things. Naturally there is still a lot to be developed before everything reaches the same level as in some commercial operating systems, like Mac OS X, that were designed for easy desktop use from the start. But things are gradually getting there in Linux too, and the ease of use of something like Ubuntu Linux is already surprisingly good.
As an example, I have a friend, who had endless problems with MS Windows 98, so in the end I installed Ubuntu Linux for him so that he could choose either Win98 or Ubuntu. He is not too competent a PC user but gets along with Ubuntu quite well and almost never boots into Win98 anymore.
I disagree with this assertion. My Linux laptop runs a typical mix of apps, and they are much more consistent than what I see on my Windows PC at work.
The reason for this is that Linux developers *have* strived for more consistency over the past couple of years, while Windows developers still don’t really care about it.
Personally, I don’t think consistency of apps has *any* impact on Linux adoption. IMO it’s mostly a red herring.
Agreed; I wouldn’t have an issue with new interfaces, as in the case of Office 2007, if they were adopted by *every* application in the suite: Word, Excel, PowerPoint, Access, Outlook, Publisher, etc. If you’re going to use an interface, use it right through your product line-up; don’t just pick and choose – it ends up looking like a half-assed amateur attempt at experimenting with user interfaces.
True, the biggest impact isn’t consistency but the lack of certain applications – and normally it is only one. 99% of what people need is provided by alternative applications; it’s just the niggling one or two applications that upset the apple cart.
As for hardware support, which is normally used as a red herring: it can’t be any worse than the number of downloads I had to do after installing Windows Vista. This is not an attack on Windows Vista, but simply to point out that Windows is no walk in the park either.
Generally speaking, if you have an Intel chipset plus processor, nvidia graphics card, and the Intel HD audio card, along with the usual grab bag of stuff, it’ll all be supported out of the box by any mainstream distro like Ubuntu or Fedora, without any problems.
I wouldn’t go so far as to create this divide between consistency and freedom. In fact, rarely in the computer world is consistency enforced to the point of taking away freedom.
Take Windows. It’s far more of a bazaar than you seem to indicate. Yes, there are ‘best practices’, but in Windows you can pretty much do what you want.
The registry is there. But you can also use any settings format you want. Go ahead and make your own proprietary text format file.
Windows Explorer is there. But you’re free to replace it with your own shell.
There are standard window looks, but you are free to have your application look and behave as you choose. Go ahead, put edit commands under the File menu. Who’s going to stop you?
Choosing to be consistent does not take away any freedom.
“Alternatives are not bad.”
—————————–
Yes and no. Too many alternatives are definitely bad, because the user becomes so spoilt for choice that it is impossible for them to rationally evaluate all of the alternatives and make meaningful decisions.
Case in point is the stupendous number of Linux distributions, most of which differ only in trivially insignificant ways, and while the end user has a ridiculous amount of choice, most of those choices are really quite empty, especially since one distribution of Linux+software can often be configured to behave like another (though more often than not this requires a bit of effort and skill).
It also becomes a problem for a developer wanting to write an application that works in harmony with other applications, unless there is some kind of standard for application interaction – sometimes there is, but quite often there isn’t; the clipboard functionality (or lack of it) in Linux is a good example. The larger the number of choices becomes, the more difficult it is to support them all.
This often results in the user wanting to choose a couple of features for their OS, but being unable to because of incompatibilities between those choices (such as KDE apps not behaving or looking right under Gnome, or a distribution that has some features you like, but an app you like is not in the repositories and is a bitch to install and get working by other means).
Choice is a wonderful thing, but like all things, it is only good in moderation. Taken to excess, it transforms from something nice, to something quite deleterious (it is nice to have a bar of chocolate every once in a while, but if you eat it for breakfast every day, you have a problem).
“Linux world gives users/distributors/developers the priviledge of not being restricted by a single OS “dictator”. In the end, may the user have the choice of how and what tools to use.”
————————-
The freedom that Linux provides users is both its biggest strength and its greatest weakness, simply because like choice, freedom taken to excess goes from being a good thing to a destructive and counter-productive thing. Freedom needs limits to have any value – you can’t have the yin without the yang. Freedom needs to be balanced with control and focus to achieve a valuable end result.
The author of the article is spot on.
“freedom taken to excess goes from being a good thing to a destructive and counter-productive thing.”
Yes, a common refrain. Quite wrong. In a free society there is only as much choice as people create by exercising it. It is there because they want it. There will always be people though, whether in books, papers, elections, religion, computers, who want for some reason to restrict this choice. For our own good of course.
The argument amounts to: I do not like this much choice, therefore you should not have it. Or is it perhaps, I do not like what you are choosing. Therefore you should not be able to choose it?
“Yes, a common refrain. Quite wrong. In a free society there is only as much choice as people create by exercising it. It is there because they want it. There will always be people though, whether in books, papers, elections, religion, computers, who want for some reason to restrict this choice. For our own good of course.”
——————
Bullshit. Of course freedom and choice needs to be restricted. Should you have the freedom to drive on whichever side of the road you feel like? Should you be able to choose whether your next door neighbour lives or dies? Should you be free to release toxic waste into a city’s drinking water supply?
Of course not – these freedoms would be destructive and counter productive to society. A free & democratic society does not mean each person gets to choose whatever the hell they want, it means they are free to participate in some level of the lawmaking process which determines the limits and boundaries of individual freedom. A lot of the “Freedom is everything” brigade seem to miss this important point.
Life is about dealing with and making the most of the limitations and restrictions thrown at you.
“The argument amounts to: I do not like this much choice, therefore you should not have it. Or is it perhaps, I do not like what you are choosing. Therefore you should not be able to choose it?”
————
No, that is not the argument at all. You don’t seem to understand how an excessive range of choices renders the act of choosing nothing more than a random, empty and pointless act, because it would take a human being far longer than their waking lifetime (or far more time than is worthwhile) to gather enough information on the range of possible alternatives to make an informed decision (and by the time they did, the choice they made might well be obsolete).
Too much choice and freedom is demonstrably bad for individuals and societies. This is why the vast majority of human beings will generally ignore all but the 3 – 4 most popular options when faced with having to make a decision*, and it is lamentably the same reason why Linux only makes up a minuscule percentage of installed desktop operating systems (apart from the fact that very few people actually care what their OS is).
*where there are a vast number of equally popular choices, or options with little to distinguish them, people will either choose the cheapest option, or pick at random.
“””
Of course freedom and choice needs to be restricted. Should you have the freedom to drive on whichever side of the road you feel like?
“””
You’re not from Dallas, are you. 😉
I’m from Dallas and I agree with him!
I think there is a small difference in what we are talking about here. One part is what the user experiences when working in or on a GUI – is it somehow comprehensible or not? It’s not about the ones who are able to tweak a system or configure it via the console. I began my computer experience with Mac OS 7 and went through all of them to Mac OS X. For me it was a very pleasant desktop feeling and very logical. Windows always confused me. I have now been using Linux happily for years, and I think that the effort you put into learning to deal with an OS is the same everywhere, but what you get out of it is different. What I experience at work is that normal users are frustrated in everyday use of Windows. So they could invest their learning effort in another OS much better, and maybe with a little more positive effect.
What kind of article is this? It’s not clear. I’ve read it… and it’s a difficult read. Other than the directory structure, I’m not sure what comparisons are being made. It ends as some vague ramblings. It also lumps everything from the kernel to the desktop to the file structure in one place, and that introduction is awful. I’m a “Linux” user. The weird thing is, it talks about *standards*.
Is it saying:
1) I like applications and config files in the same place.
2) When I update a minor revision I want my config files to keep working.
The directory structure *is* standard. I know, and I’m sure the author knows, that the multi-user config files are in /etc and the user’s files are normally in the user’s home directory under .something. It’s a standard. I like it this way; it makes sense, especially when there are multiple users on one machine. By its very nature it is more structured than the win.ini / windows / program files / something-or-other, probably under a *company name*, maybe in a registry setting.
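To make that concrete (the app name is made up, but the convention is real):

  /etc/foo.conf    # system-wide settings, editable by root
  ~/.foorc         # the user's personal overrides

A program following the convention reads the /etc file first, then lets the user’s dotfile override it.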
I do not know what to say about the second point, but it seems like a complaint both against KDE and about a particular problem the author has… it should be a forum post, and I wish he had provided it as a reference. I have edited 6 config files on Gentoo in the last 2 years. They have been: xorg.conf, which has been a disgrace of X for a long time and has had a fix promised for just as long; my firewall, ipkungfu – there are alternatives with a graphical front end, but habits are hard to break, and I do think firewalling is a disgrace for a new Linux user; Samba – there are graphical interfaces, although I’ve never found one I liked; and the Gentoo portage .keywords, .use and .unmask files, simply because I like trying new stuff before it’s gone stable.
This article is a rant, and not a good one. I do think *standards* needs a proper article on here, especially with the good work done at freedesktop.org with their various specifications, from icon naming to menu entry specifications.
There is nothing “standard” about .something in the user’s home directory. For one thing, it would be more sensible to store them in their own directory so that they don’t clutter the home directory. Secondly, and more to the point, they all use different formats… there is no standard.
Consider, however, Apache configuration. You configure it in files in /etc. Where those files are located has moved around from version to version. Installing Apache and PHP together doesn’t always let Apache know that PHP is there, and every upgrade to either one causes settings to get lost.
“There is nothing “standard” about .something in the user’s home directory. For one thing, it would be more sensible to store them in their own directory so that they don’t clutter the home directory.”
Don’t you think directories like “New Folder 1”, “New Folder 2”, “New Folder 3” … “New Folder n” clutter a user’s home directory more than hidden .foo/ directories the user does not even notice?
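For anyone unfamiliar with the convention, a quick illustration (listings abbreviated):

  $ ls ~                 # dotfiles stay hidden in a normal listing
  Desktop  Documents  Music
  $ ls -a ~              # -a reveals them
  .  ..  .bashrc  .mozilla  .xinitrc  Desktop  Documents  Music

So the config files are there, but they don’t get in the way.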
There is a certain reason for this: different users may have different settings, and they can edit them on their own. (To edit them, they do not need to access the files themselves; the respective program does it for them.) With “their own directory” you’re referring to the directory the program is installed to, right? This would require the user to have write permissions for that directory. That is fine as long as the program is installed locally for the user (inside his home directory), but it creates problems for centrally installed programs, where one wrong click could destroy the whole installation. Also, at backup time, the files in ~/.* are backed up, but /usr/local/share/foo/* isn’t. Centralized individual configurations (attention: contradiction!) would be lost. So, in my opinion, this is not a good solution.
“Secondly, and more to the point, they all use different formats…. there is no standard.”
As I tried to point out, this is irrelevant. The configuration is read and written by the respective program; it knows the format. The user does not have to, nor does any other program.
“Consider, however, Apache configuration. You configure it in files in /etc. Where those files are located has moved around from version to version. Installing Apache and PHP together doesn’t always let Apache know that PHP is there, and every upgrade to either one causes settings to get lost.”
I see this problem. Maybe you encounter it in Linux with the /etc vs. /usr/local/etc mix. In a more tidy system (just to mention BSD), where all these files are located in /usr/local/etc, the problem shouldn’t exist. But I can’t tell for sure; I have never encountered such a problem myself, though I know it sometimes occurs.
“There is nothing “standard” about .something in the user’s home directory. For one thing, it would be more sensible to store them in their own directory so that they don’t clutter the home directory. Secondly, and more to the point, they all use different formats… there is no standard.”
I can’t be sure, but I’m pretty certain that the . is there to keep them hidden, so they *don’t* clutter the directory, although I see little reason not to have an etc/ directory – though I think it’s a minor point. They are in their own directories if they contain several configuration files, although I think that has more to do with common sense than with a standard.
I’m actually quite intrigued by you saying that .config files use different standards. I don’t think of simple text files as needing a standard. I can’t even think of a standard that exists. It would be nice if you could give an example, because I’m not really sure there could be a “single solution”, simply because I don’t think it’s appropriate.
“Consider, however, Apache configuration. You configure it in files in /etc. Where those files are located has moved around from version to version. Installing Apache and PHP together doesn’t always let Apache know that PHP is there, and every upgrade to either one causes settings to get lost.”
I have to be honest here: I have never configured those. Although I’m pretty certain that if I were using my computer as a web server, I would make sure that my PHP and Apache knew about each other, and damn sure that I had a copy of my original config files.
I’m not sure how your package management works. I do know how Gentoo works, and I have the option to *merge*, which will keep *my* changes. I *used* to lose fstab on Gentoo, but don’t anymore, and I once lost a file in my firewall config. But if you update a program with changes to the config file, that file *should* be overwritten – that’s correct behaviour.
I’m sure you have a point here, and I don’t understand it, because I don’t see the problem, or a possible solution to what you see as a problem. I do think you have a point; I just can’t get my head around it.
What kind of article is this.
It’s a personal blog on livejournal. The author is just writing out personal opinions, so it’s not professionally written by any stretch of the imagination.
I’m getting really tired of all these old complaints that people are still using but that don’t really apply anymore/never applied!
One example is the whole consistency thing. Neither Microsoft’s nor Apple’s programs are consistent at all, so using this as an argument for why “Linux isn’t ready for the desktop” is BS. You’re going to have a hard time finding apps that are less consistent than MS’ Office/Media Player. Just as much BS are the complaints about Linux’s folder layout. In a normal modern Linux distro you will never, under normal circumstances, need to go anywhere other than your /home/username, so who cares how the rest looks?
On a slightly different note, you have all those people who cite KDE’s use of K-names as their primary reason for not using KDE. Get real, that’s no valid reason for not using something. And besides, Apple and MS do it too, with their iNames and “Windows Appname”.
It seems that these opinions, once they have gained a foothold, never go away. As a KDE fan I often notice the same complaints come back in every KDE thread, and I’m sure it’s the same way in most other threads. Some of them are fair complaints, but a lot of them are just uttered by people who feel that way because they have heard someone else complain about it before.
Actually the article seems to be comparing the classical cathedral development model (strong centralized leadership, standards, coherence, one direction) to the bazaar development model (choice, freedom, competition and interaction of various opinions). Both development models have their pros and cons. Maybe instead of fierce confrontation, those different camps could learn something from each other?
As to GUIs, if you want to compare Apple’s Mac OS X or MS Windows (probably their latest versions) to something in the Linux world, you should compare them to, for instance, Ubuntu Linux or to Redhat’s or Novell’s newest Linux versions. Otherwise the comparison wouldn’t be on equal grounds, and would be like comparing a certain kind of apple juice to a certain kind of fruit garden or gardening method. Just saying “Linux” is too broad a concept to use in such a comparison, especially as you could just as well say BSD, since the various BSD flavors share the same desktop environments with Linux.
I agree with most of the ideas expressed by the author. Sadly, I think the ‘ego’ problem might be the hardest to solve.
Actually, the Windows registry existed in Windows NT 3.1, 2 years before Windows 95.
Windows has evolved how the registry is stored: from 1-2 huge files that stored everything, to (with Win2K3 at least; I’ve not used Vista yet, so I don’t know if there’s a change) having it distributed more evenly and localized in smaller files associated with applications, files that are typically not shown by Windows Explorer. This helps (in theory) with making it easier to move or restore settings for given applications, but there are still system-global settings outside that location that can be scrambled either by file corruption from hardware failure, or by an application writing to places it shouldn’t, which in most cases is an issue of running software as Administrator.
A huge reason for the registry to exist for everyone to read, as well as a way for things to all too easily get corrupted to the point of needing a reinstallation, is COM, which uses the registry extensively. This is something that’s not mentioned in the blog entry, but it is a significant difference between Windows and both Linux and OS X: neither of those two platforms has COM and all that it entails. Because of the heavy use of COM in versions of Windows later than NT 3.1, this is actually likely to be the most fragile and hardest portion of the system to restore to correct operation if corruption or data loss takes place for whatever reason.
As long as applications without system/Admin permissions can write to the parts of the registry that all software on the machine reads, it is still a bit fragile, despite the distribution into smaller, more local files. I haven’t investigated the exact details of how it’s implemented, but the OS X way of dealing with the issue sounds like a better method than what Linux or Windows does, at least when it comes to preventing a single point of failure.
“Actually, the Windows registry existed in Windows NT 3.1, 2 years before Windows 95.”
There was already “regedit /v” in Windows 3.1, if I remember correctly, but I’m not sure if it’s the same thing. I remember a kind of hierarchical view of keys and values… sorry I can’t be more exact, I’ve only seen it once, many years ago.
According to this link: http://www.tech-pro.net/intro_reg.html we’re apparently both slightly behind: it states the registry existed in Windows 3.0.
However, I suspect only Microsoft applications (Office) and those that worked with Office used it that early. I can’t remember using the registry editor in Windows 3.1 (not NT), as .ini files still roamed the hard drive in herds.
Linux is fine if everything works on first boot after a new install. However, I’d rather chew off my own right arm than have to tweak a Linux system, for whatever reason, that doesn’t configure itself correctly on install.
Linux does fall behind in this area IMO.
At least in my home country (and in many other places) Apple seems to be a less and less important player in the OS field, while Linux keeps getting stronger. So it seems that Apple doesn’t get everything right, while Linux does in many areas. And it is not just because of the high price of Macs in the EU (though that is a big issue too).
Apple’s Mac OS X is surely one of the finest operating systems in the world, but it may not suit everyone and every purpose. Linux, with all its various distributions, offers choice. The software is open source too, so you or your favorite distributor can actually change it to your liking, whatever the preferences. The various Linux distributions (and application developers) do just that: offer customers the choice. If you want consistency, ease of use or a Mac OS X-like GUI, it is available in the Linux world too. Just choose the right Linux (or BSD) distribution that suits you.
It’s not because Linux does things right and MacOS X does not. It’s because Linux is free and runs on stock hardware, while MacOS X needs Apple’s proprietary hardware to run.
Real-life example: in our company, we wanted to set up a development CVS server. There was a spare machine around, a Pentium 3 with 512 MB RAM and an old Trident 32 MB graphics card. We chose Linux, ran it in console mode, and it is very nice, fast and does a fine job. It cost us almost nothing.
On the other hand, if we wanted MacOS X, we would have to pay a hefty price for a Mac, and still would not have all the tools Linux has.
Gnome and KDE are really the UI level in Linux, and these are far more consistent than anything I’ve seen in Windows. Above that, Tango and the freedesktop.org standards like D-BUS have more steam than ever, and that brings a level of consistency even above those separate desktop environments. Complaining about lack of X11 consistency is silly. I suppose Motif is the area for that, if you really want to target X11 (though Java has pretty much taken that level for UNIX desktop apps).
I think that this article makes some really good points, especially regarding the Application Directory issue.
It’s about elegance and clean design. The internal design of Linux, with files strewn here and there, seems like a mish-mash, design by committee style standard, and the Windows method is also flawed.
The guys on here complaining about the article just have a different mindset, like the author mentions about the designers of Windows, they just don’t ‘grok’ what he is talking about.
But if you are not a ‘geek’ with a techie mind, and are an artist or a designer, you should understand. It’s about reducing cognitive load, making system maintenance easier for ‘normal’ users, and making things more logical and standardised. It’s about caring for and taking pleasure in elegant designs and solutions.
PS — Yes, RISC OS did have it before the Mac, AFAIK.
Blah, Apple MacOS X market share outside the USA is ridiculous. In developing countries even Linux has a bigger market share.
I agree with some critics of this article, but the fact that Windows is dominant is proof that even ugly things like the stupid and non-intuitive registry cannot affect OS adoption by users.
Linux has better potential to compete with Windows, even with some usability problems. These are being improved quickly with each version. Even the supposed MacOS X superiority in 3D effects is now matched in both Linux and Vista.
I also pretty much agree with the article; the OSX way seems to be the most intuitive way to go. Forget the Windows registry – it’s a royal pain sometimes when redoing standard app installs, while for others it’s barely okay. Ubuntu installs okay initially; I’m not looking forward to changing it.
on a side note
I also see that the FPGA graphics board has come a long way too, just a bit of sticker shock; I want one but am not likely able to justify it. This card likely has the best video output specs I have ever seen on any FPGA board with graphics: 2x DVI + analog, all at beyond 2048×1500, is awesome. Most FPGA development cards barely do VGA at 4 bits.
Yet another article about trying to pound square Mac/Windows pegs through round Linux holes.
There’s a fine line between improving usability and dumbing down a system. Can Linux improve usability? Sure. But when you sacrifice its technical edge in the process, what good is that?
Sure, you can do a file layout similar to Mac/Windows. But in the process you lose things like major security benefits, or the ability to mount specific directories over the network, or the options available with chroot, and the list goes on and on.
Just because you don’t understand something doesn’t mean it’s wrong. It means you probably didn’t major in computer science.
So if you wanted to, say, make a Linux distro launch an application by clicking on its folder, what problems might that give for security and flexibility?
Also, where exactly would you put these folders? What about system apps vs. “user-apps”? What if certain apps need more rights?
One thing I do agree with, is that for applications installed in a home folder, they shouldn’t automatically put everything in the /home/user folder… Right now I’ve got a Picasa folder, and a logs folder for Konversation etc… next time I’ll try and install all my apps and “app folders” in ~/bin , that’s for sure!
I do hate applications cluttering up my home folder like that. And even though I’m not new to Linux, I don’t have a clue how to remedy that, short of creating hidden files that are symbolic links to the new location (if that’s possible).
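It is possible, at least for most apps: a dotfile can be a symlink, and programs follow it transparently. A rough sketch, assuming the app keeps its data in ~/.picasa (test with something non-critical first, since a few programs recreate the original path on startup):

  mkdir -p ~/appdata
  mv ~/.picasa ~/appdata/picasa
  ln -s ~/appdata/picasa ~/.picasa    # the app still finds its "dotfile"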
But mostly apps.
UPS-Worldship, Quickbooks, and many others work on both Windows and Mac, but not Linux. Same with wireless cards and lots of other hardware.
Think about what an operating system is for: it allows your computer to work with your apps, and your hardware. If the OS doesn’t do that, what good is it? What your gui looks like doesn’t mean a damn thing, if you don’t have apps and drivers.
It’s a shame that Linux developers don’t get that. They always want to make another dozen GUIs and another hundred distros. But the developers just don’t understand what end users really need.
“It’s a shame that linux developers don’t get that. They always want to make another dozen GUIs, and another hundred distros.”
This is not true. There are lots of GUIs, at least toolkits and the two major desktop environments, KDE and Gnome. They are developing, maturing. Especially in the last years, almost no new GUI has been created, but lots of GUIs have been improved.
Just to repeat it (I imagine you don’t like to hear it): Linux is about choice. You select what you want, you decide what you use, what fits your individual needs. Therefore, you have to know what you want, or use trial & error.
“But, the developers just don’t understand what end users really need.”
Please distinguish carefully: what the user wants is usually different from what he needs, and different from what he thinks he needs. In most cases, the (novice) user does not know what he needs.
In the everyday CS world, you mostly have a three-stage model:
1. What do I want to have in the result?
2. How do I get this result?
3. What tools do I need to do this?
No. 3 requires application programs to be used. They need to have a GUI that Joe Q. Sixpack is able to handle. It may be neither too dumb nor too complex. So a good GUI is very important for an application. Both KDE and Gnome try to reach this goal.
What is the first thing a potential user looks at? Yes, it is the screen with the pretty pictures on it. Imagine you have a system that supports every piece of hardware in the world, and every possible piece of software is available for free – but the system communicates via the command line only. Would anyone use it? Obviously not.
Especially novice users get confused seeing an 80×25 text mode screen. “Eh, this is DOS!” is a usual reaction, even if it’s not DOS, but Solaris, BSD or Linux.
The first impression is the most important one. Well, that’s not fair, but that’s the way it is. Just read a few introduction papers on perception and cognition psychology and you’ll find this fact illustrated.
People like OSes and GUIs with many colours, shiny squeaking buttons, dancing elephants and music, even if it crashes all the time and needs patches twice a day. 🙂
About drivers: Blame the hardware vendors for not using existing standards and for not giving sufficient information about how their devices can be interfaced. Don’t blame the OS developers, it’s not their fault if some hardware does not work.
About applications: Blame the software developers. For Linux, all the specs are free and (more or less) well documented. The same goes for other UNIXes. If developers want to satisfy customers who want a certain solution for Linux, they will develop one. Don’t blame the OS developers. It’s not their fault if some software is not available.
At last, be sure to know which developers you’re talking about: system developers, UI toolkit developers, framework developers, interface developers, application developers… there are many places in the software business a developer can work in. He usually is not responsible for another field. Let’s say someone develops a financial management system based upon KDE and Qt. He does not even have to know how Qt works internally. So he’s not responsible if someone does not like the colour combinations in KDE or the way Konqueror looks.
What your gui looks like doesn’t mean a damn thing, if you don’t have apps … It’s a shame that linux developers don’t get that.
Huh? Linux has a ton of apps – too many, some people say. This is because Linux devs saw a need and created them. If a proprietary app that’s not under Linux devs’ control doesn’t work with Linux, how on earth can that be blamed on Linux devs? Even then they do their best (with Wine and whatnot). The blame lies elsewhere.
The end result about usability for someone who needs a certain program is the same, but blaming it on the Linux devs is ludicrous.
“UPS-Worldship, Quickbooks, and many others, work on both windows, and mac, but not linux. Same with wireless cards, and lots of other hardware.”
I’m sorry, but your argument is flawed. There are lots of Windows apps that aren’t available for Mac, and Linux is compatible with *more* wireless cards and hardware than OS X is.
Hardware compatibility is a very bad example. Linux supports much more hardware than Windows, even if you restrict the comparison to the few hardware platforms Windows supports (namely Intel x86 compatibles). It’s true that there are devices without Linux support on x86, but the converse is also true: there are many pieces of hardware for the x86 architecture supported under Linux without any Windows support.
If you look at EM64T, for which there actually is some Windows support, Linux has far more extensive hardware support than Windows Vista 64 or XP 64. I doubt XP 64 will ever have any reasonable level of hardware support.
I agree with you and think that Linux has its place, and yes, it may work for simple tasks, but let’s be honest: it is not a drop-in solution for the desktop. Both Redhat and Novell even admit that.
OS X’s main problem, if you consider it a problem, is that it is tied to hardware. The truth is that you can drop it in and run the applications that users are familiar with. I contribute to Linux and recommend it for solutions; however, a lot of people forget that it is not about alternatives but sometimes about being able to use the applications that they are familiar with or will use elsewhere (Quicken, Dreamweaver, Photoshop, Quickbooks, Quicktime, Office – yes, I know OpenOffice works).
Also, upgrades on Linux, with all the dependencies, are not really there for enterprise deployment. Yes, you can write scripts and run cron jobs to upgrade; yes, you can use ZENworks. But from an admin standpoint this can be a problem when you have different versions of software.
I like Linux on my servers, but I have been appreciating OSX Server as well. The main issue I have is that Linux and open source are two separate things; that is what people forget. You can love OS X and still develop open source apps. We keep talking about freedom as if OSX does not contribute to that school of thought. Mixed source is actually a business model. I know some may not agree and think that everything has to be free, but mixed source, again, is a model. I think we get so caught up in the benefits and ideology behind Linux that we can’t see the benefits in anything else.
OK, you are free to try to throw me to the wolves now.
It’s nice to talk about these things and compare them across platforms, while other features of those platforms rarely get compared to each other:
1. Solaris fastest, linux/osx fast, windows (especially vista) slow.
2. Solaris safest, linux/osx safe, windows unsafe
3. OSX easiest, windows easy, linux/solaris difficult
4. Windows the most common, osx quite common, linux/solaris the least common
5. Solaris the best CLI OS, Linux/osx v good CLI, windows without powershell the worst CLI OS
6. Solaris the most stable, linux/osx very stable, windows stable only if server/well configured
7. Linux/solaris the cheapest (free), windows acceptable/questionable sometimes (server), osx very expensive(due to hardware tie).
8. ….
happy computing
Are you kidding me? Solaris is not safe. It had a freakin’ telnet vulnerability that went worm. What kind of modern OS ships with telnet enabled, in the first place?
Secondly, it ships with sendmail as an open relay. Solaris only approaches “safe” after hours of work.
Linux/OSX/Vista ship fairly safe out of the box with no additional work.
Why should users need to know where applications are installed? If they really need to know, Linux has package managers that are far better than the MacOS-X folder system at telling you where files are located, what they are used for, and what dependencies they have.
Not to mention that the Linux package manager systems make it much easier to install new apps. No searching on the internet, no unstuffing; just check a checkbox and click OK.
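For example, the major package managers can already answer “where did this app put its files?” directly (the exact command depends on the distribution):

  dpkg -L xmms             # Debian/Ubuntu: list every file the package installed
  rpm -ql xmms             # Red Hat/SUSE: the same idea
  dpkg -S /usr/bin/xmms    # reverse lookup: which package owns this file?

That is arguably more information than an application folder ever gives you.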
The idea of storing everything in one directory has serious disadvantages.
First of all, the backup needs for e.g. executables and configuration files may be different; another problem is that many applications share the same libraries. You also have to realize that there are a lot of applications on MacOS that are not stored in one folder but are scattered all over the file system, just as on Linux. The difference is that on MacOS there is, to my knowledge, no package manager to keep track of it all. Just try to remove cyrus-imapd on a MacOS-X server and you’ll see what I mean.
Another thing: what about large mixed-platform environments where you get /usr/bin and /usr/lib from file servers (a different server for each hardware platform) but share the same settings in /etc and the home directories of your users? The Mac way of doing this doesn’t scale as well as the Linux way.
As for the clipboard problem, I don’t know what the author expects from his clipboard, but as far as I can tell, ordinary cut and paste works fine between nedit and gedit in Gnome, in both directions. Just like cut and paste works fine between Gnome and OpenOffice.org, or Gnome and Firefox. I wouldn’t say Linux has more problems in this area than other desktop environments. If it doesn’t work, it is a bug and should be reported as such.
Are you talking about cut/copy/paste of plain text, or the transfer of images, formatted text, and other content between applications?
Using the clipboard for plain text has pretty much always worked fine in Linux (or the X Window System to be more accurate), the same isn’t true when it comes to other types of data.
Until quite recently cut/copy/paste of non-plain text worked less well in Linux than it did in Windows 3.1, or even on a 1983 Apple Lisa. Things have improved, but in my experience it’s still a lot less consistent and reliable than it is in Windows or Mac OS X.
If I install a new DTP app, word processor, graphics app, etc. in Windows or Mac OS X, I can accurately predict what will happen if I copy a selection from a document/image, and paste it into another application. The same still isn’t true if I install a new app in Linux. It’s still a nice surprise if it works perfectly, when it should be a rare and unpleasant surprise if it doesn’t.
Configuration files are great. I have had the same configuration files for xresources (.Xdefaults), gtk and IceWM for years. I used them on Linux and now on FreeBSD. I don’t have to configure anything any more.
No registry, please, let me keep my configuration files forever. I have a copy burned on CD.
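As a small illustration, a few lines of an ~/.Xdefaults that keep working year after year (these are standard xterm X resources; the values are just my taste):

  XTerm*background: black
  XTerm*foreground: grey90
  XTerm*scrollBar:  false

Plain text: trivially diffed, backed up, and carried between systems.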
Traditional UNIX copy&paste works much better for text than anything else I’ve seen.
I like old, traditional, UNIX and X ways. I set wallpaper via xloadimage, I use xcalc, xclock and xlockmore or xscreensaver . They are tried and true, and perfect for me.
DG
As a Windows XP user (for five years and expecting at least that much more), most or all settings I worry over nowadays are in a per-user config directory or otherwise easily accessible in the app’s install or working directory; e.g., one-off ini and xml configs that are easy to use and find. I have had to restore or copy whole browser profiles (mostly from Mozilla nuking the data itself), surprisingly-complicated text editor syntax highlighting color rules, home server config, media player associations/behavior/skin configs, and various logs, private keys, and macros. Some of those apps are designed with the UNIX/Linux model in mind, and some are Windows-native and -only.
They all seem to work the same when they work well.
I wholeheartedly agree! Unix ways are truly efficient and nice for getting things done. Every time I have to use a Windows machine (or heavy “modern” Windows/Mac-emulating desktops) I’m amazed at how inefficient and limiting they are.
I played around with Gnome and KDE for a long time. Then I became tired of all that and switched back to Fvwm (version 2.4). Fvwm was the first window manager/desktop I used ten years ago when I installed Linux (Slackware) for the first time. The moment I installed Fvwm on my current Linux laptop I knew I had returned home! Now my Fvwm config is very nice. It does everything I need (but doesn’t have any extra bloat and clutter), works exactly the way I want it to, is very light on resources, and basically Just Works(tm).
I use the Pager to navigate between my 16 desktops (four workspaces called “Misc”, “Net”, “Code” and “Docs”, each having four desktops (2×2)), one menu for the most frequently used apps (NOT for ALL apps, because that would make the menu structure too deep and cluttered), and of course xclock in one corner. I use xmessage to display a text file containing pending tasks and notes or anything I want (and one menu entry for quickly editing the text file). I also wrote one extra app (with Motif) that offers me a “Run command” dialog.
Consistency is hardly important anymore.
Users have adapted to adapting to new UIs very quickly.
This must be true, otherwise the web (where every page is a new and very different UI) wouldn’t be usable.
Why do people keep complaining about the X11 cut & paste mechanism? It is trivial to use.
copy=select
paste=middle mouse button click.
Faster than ctrl-C, ctrl-V.
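Part of the confusion is that X11 actually maintains two separate selections: PRIMARY, which is filled simply by selecting text and emptied out with the middle button, and CLIPBOARD, which is filled by an explicit Edit->Copy or Ctrl+C. If you have xclip installed, you can inspect both:

  xclip -o -selection primary      # what select + middle-click would paste
  xclip -o -selection clipboard    # what Ctrl+V would paste

Apps that only handle one of the two are a large part of why X copy & paste gets a bad reputation.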
Because the action is unintuitive to the untrained user. This is considered bad in the current UI world.
And CTRL+C and CTRL+V are more intuitive?
Edit->Copy and Edit->Paste are more intuitive. Also, Right-click->Copy is correct from a UI point of view: it actually tells you what the action you are about to perform is. Of course Ctrl+C and Ctrl+V are not intuitive, but they form part of a usability progression. The user is told, via the menu options, that they can use the keyboard shortcuts.
The main complaint isn’t this mechanism for copy and paste. Although it does have disadvantages, such as wiping the clipboard if anything else is selected, or if the document being copied from is closed.
This means that you can’t use this mechanism to paste text over the top of a text selection, or copy a block of text from a browser and then close it before pasting. Using Ctrl+C to copy is much more robust and versatile: as long as you don’t copy anything else, the contents of the clipboard remain available.
However, the main Linux clipboard consistency and usability problems occur when you start trying to copy other kinds of data, such as formatted text and images. If you’re a programmer then this isn’t an issue, but if you want to use Linux for tasks like DTP then it can be a pain.
In my experience copy and paste is one thing that does work very consistently in Windows, Mac OS and most other operating systems. Whenever it fails to work between X Window System apps it doesn’t reflect well on Linux usability.
similar to the article
I think the author of the article really ought to read what Lubos Lunak wrote regarding Klipper and the general clipboard situation.
I also laugh at this “every application is in its own folder” thing people go on about with the Mac. Yes, it is convenient in some ways, but the whole thing becomes very muddy when you have many applications using the same core components – which many companies frequently do. Windows (at a stretch) and Linux/Unix systems are designed with that in mind.
I think he’s confused about what this actually is, because it would be possible to browse the RPM (or whatever) database and see a virtual folder of all the files the application has, even though they might be in different physical locations. That’s why we use computers: so the physical location doesn’t actually matter.
All in all, not a great article. He touches on something that does need to be solved, though – software installation – but he doesn’t really discuss it. He’s right about configuration settings as well.
IS THIS GUY SERIOUS?!?
on windows registry: “Users of Windows and other systems panned this approach for one good, practical reason: It’s unreliable. Although things have improved, registry corruption is still the bane of the Windows user. Install a new app, and it or the OS does something weird, and bang!, the whole registry is hosed, requiring a complete reinstall of the OS. Single points of failure are bad.”
I mean, like, WTF? Whenever Linux people pick on Windows, they usually refer to Win95 or something before that. I haven’t had the registry corrupt beyond repair since Win98 (corrupt FAT tables due to a failing disk; I restored it manually myself). In XP I only remember it happening ONCE, and never due to some stupid new app, but due to a POWER FAILURE! And even then, Windows restored the registry back to working order by itself; I only had to click an OK button.
So when Linux people bash Windows, make sure you aren’t behind the times: give XP a try, collect facts, and bash it then! Nobody even uses Win95 or Win98 anymore; their DOS-based unstable OS architecture is gone.
Based on a lot of the comments here so far, it looks like most people here are missing the point of the article. The author is not merely advocating for more consistent UI design, or for just a few solutions to a few particular problem areas. The author is (correctly) pointing out a thematic problem with the Linux/FOSS development model: a failure to understand that the technical architecture of the system directly impacts the usability in ways that go much deeper than the GUI.
I’m a heavy Windows user who has dabbled in Linux plenty over the years. I really wanted to give it a chance, and not give up on it, and undergo the learning curve, because the FOSS philosophy sounds so ethical and promising. If _anyone_ would have the general hardware and software expertise, intelligence, and patience to suffer through the learning/installation/maintenance curve, it would be me… and yet I eventually gave up on Linux for exactly the kinds of reasons this article’s author describes.
Linux-based systems are a chaotic mess with no clean underlying architecture or technical intuitiveness whatsoever. You can’t solve the user experience problems that creates just by slapping a pretty GUI on top of it all. A sloppy architecture imposes problems and limitations that no GUI can possibly work around.
All the focus on GUIs for Linux-based systems in recent years has been seriously misplaced. Before the system is ready for a pretty GUI, it needs some serious architecture cleanup/redesign and major hardware compatibility improvements. Anyone wanting “Linux for regular people” or “Linux on the desktop” concepts to succeed ought to think about abandoning whatever pet development project they are working on and instead work on improving exactly the kinds of things the article author suggests: forming and implementing more detailed architectural designs for the system, coordinating and playing better with other groups in the FOSS community, and being unafraid to take the development risks necessary to replace old core chunks of the system with newer, better-designed chunks.
You are right.
However, there is a tendency to defend the current design rather than improve it. Common software installation, among other things, has mostly been ignored: “just use deb or rpm”.
There are underlying architectural things to change. Let’s see if the developers have the guts to implement them.
Everyone is a usability expert when you’re talking about painting the bikeshed. Any time there’s a story about anything remotely more technical than that, there is a highly predictable correlation with silence.
I do not want a system where easy things are easier and hard things are impossible. The current design is fine. The users are stupid.
I think the arrogant “the users are stupid” attitude is one of the biggest obstacles to Linux adoption. I can run rings around you in chip design. I’ve installed my fair share of Linux systems, including Gentoo, where you have to set up every little thing manually. I think we can be pretty sure that I’m not stupid.
It seems evident to me that the current design is NOT fine. It makes people memorize a load of useless details that do nothing but clutter their minds and distract them from doing useful, interesting things.
I’m not a Linux user nor an advocate thereof.
How could you possibly know if you’re a better chip designer than me? I’ll concede that you’re probably right since the last time I designed hardware was on bread boards in logic design class. Actually I hope this is the case for the sake of your project.
Hier(7) has nothing to do with Linux specifically. It’s a well-thought-out and methodical way of organizing the filesystem by function that scales well and reduces the need to specify a zillion paths for where everything is located. I can also effortlessly and consistently nest chrooted file systems and have paths line up properly, with the only thing changing being the prefix to the chroot. If you can’t see why that’s a good thing, you haven’t been doing systems management for very long, or development for that matter. Most alternative schemes are short-sighted and merely address someone’s most immediate annoyances. Everything need not be memorized. All one needs to do is RTFM. I’ll even give you a hint: man hier
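To make the chroot point concrete, here is a trivial Python sketch (the jail paths are invented for illustration): because hier(7) fixes locations by function, only the prefix ever changes, never the layout underneath it.

import os.path

# A handful of paths that hier(7) pins down by function (illustrative only).
LAYOUT = ["etc/ssh/sshd_config", "usr/bin/env", "var/log"]

# The same layout resolves identically inside any chroot; only the prefix moves.
for prefix in ("/", "/srv/jail-a", "/srv/jail-b"):
    print([os.path.join(prefix, p) for p in LAYOUT])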
UNIX has a well-established tradition of using plain text configuration files because they are easy for humans to understand and change (of course you’ll argue they’re not). XML is not easy for humans to read or write, nor do configuration files need such a complex markup language (I’m annoyed enough that fontconfig did this). Why on earth anyone would want a complex and overly verbose grammar over a simpler one is beyond me. I think there’s this fanciful notion that if everything uses angle brackets for lexical tokens, it can somehow be magically understood by anyone without knowing anything about the grammar defined by the DTD. I think the truth is that former web designers from the .com bust like it because it looks like HTML. Binary fixed-width random-access files (so-called databases; as if plain text files aren’t databases!) are evil unless they are necessary for performance (and sometimes they are). Even then, they can be compiled from plain text inputs. I also don’t give a flip about GUI frontends.
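To see the verbosity difference side by side, here is a minimal, runnable Python sketch; the app and its two settings are hypothetical, and both snippets use only the standard library:

import configparser                      # stdlib INI-style parser
import xml.etree.ElementTree as ET

PLAIN = """
[render]
font = DejaVu Sans
size = 11
"""

XML = """
<config>
  <render>
    <font>DejaVu Sans</font>
    <size>11</size>
  </render>
</config>
"""

# Plain text: one line per setting, trivially greppable and hand-editable.
ini = configparser.ConfigParser()
ini.read_string(PLAIN)
print(ini["render"]["font"], ini["render"]["size"])

# XML: the same two values cost a parser, a tree walk and angle brackets.
root = ET.fromstring(XML.strip())
print(root.findtext("render/font"), root.findtext("render/size"))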
The locations of these files are not hard to predict. Global settings are under <prefix>/etc and user configs are under your home directory as dot files. I’m not sure how you could be lost here…? The only reason they would be put in “random” locations is if the programmer was an idiot (Dell KVM switch software, anyone?).
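That convention is mechanical enough to express in a few lines of Python (a sketch; “myapp” and its rc-file name are invented for illustration):

import os

def config_path(app="myapp", prefix="/"):
    # The per-user dot file wins; otherwise fall back to the system-wide file.
    user = os.path.expanduser(f"~/.{app}rc")
    if os.path.exists(user):
        return user
    return os.path.join(prefix, "etc", f"{app}rc")

print(config_path())  # e.g. /home/you/.myapprc or /etc/myapprc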
How can you praise the Windows registry for configuration purposes when it fell into that use quite by accident? Its initial purpose was purely COM object registration and, reportedly, it was hijacked by the Office team later. Nevertheless, it’s a major POS responsible for no end of troubles. INI files were far superior.
You’ll never hear me decrying hardware for not being “intuitive” to program (whatever that means). I will, however, complain iff documentation is lacking, wrong and/or requires NDAs. The only time I would complain about the design of the hardware itself is if it was broken or didn’t do what it claimed. I’m hoping that this won’t be the case with the OGP ;-).
People can be smart and still be stupid. By the way, how can you use Gentoo and complain about setting up everything “manually” with a straight face?
Actually, I was lying when I said I wouldn’t complain about the hardware.
I won’t be satisfied until I can memory map arbitrarily large XML documents into the video card and have it automagically know what I want while I randomly bang on registers that may or may not exist, and I better not have to read any documentation!
I just wanted to say Thank You for making this comment. You see what I was driving at, and in fact, you expressed some of the concepts much better than I did, more concisely. Very nice. You cannot slap a nice interface onto a poor architecture and think you’ve done a good job.
Hear, hear.
No wonder Linux lacks usability: there is not enough use whatsoever for the operating system in real life. What is left is a technical operating system with desperate attempts to find something meaningful and applicable to do with Linux. That is why Linux is such a challenging and hard operating system.
Where is the key problem? It is always the same. Linux cannot have a coherent look in its professional applications, since it lacks not the looks but the applications themselves. What does Linux have to offer? Some toy programs and some OS tweakers, maybe a DVD player which looks like a spacecraft cockpit but can be customized to a teenager’s puke. That is not usability. That is self-expression.
And the same goes for the OS. There may even be a pseudo-polished look, but it is really without any usability. Things are not in the right places, or they may vary. No wonder, since there are no efforts to unify it across all tasks. I am being kind here when I assume that it is just a lack of self-discipline instead of a lack of skill in system development.
I need a proper, widely accepted base, which anybody can customize afterwards if they like. But it should offer a proper and modern look and polite interaction as a basis. Something like what Ubuntu has, but much better.
For a stereotypical (and the most common) Linux user – unfortunately – this is enough, so there is no need to polish things. Linux is still not smooth enough. Well, maybe compared to Windows 98 it is.
What OSX does right is to hide hideous tasks and help users with more complicated things. What Windows does wrong is to try to hide the basic structure of the computer and to help with the most obvious things. And what does Linux do? Well, nothing.
Less customization and more empathy for the user is what I miss.
People use Linux with Window Maker to make CG film effects in Maya; looks mean nothing.
I think the main problem Linux has is in the wireless department.
But Linux 2.6.21 is getting a brand new 802.11 stack called Devicescape. It boasts a huge improvement in the wireless arena.
I tried the 2.6.21rc5 kernel and my 4311-based Broadcom worked simply by installing the Linux BCM firmware and connecting via Wifi-Radar.
As for the Apps, I disagree about their usability. While most apps might not have shiny face plates like their Windows counterparts, they serve their purpose.
The advanced features missing in some FOSS apps, such as CMYK separation in GIMP and OpenXML support in OpenOffice, are attainable from 3rd-party plugins.
E.g.
CMYK separation: Separate (http://www.blackfiveservices.co.uk/separate.shtml)
Photoshop feel: Gimpshop (http://gimpshopdotnet.blogspot.com/) and Pixel (http://www.kanzelsberger.com/pixel/?page_id=12)
OpenXML plugin: http://download.novell.com/SummaryFree.jsp?buildid=ESrjfdE4U58~
Hell, how about even a commercial Office solution for Linux: SoftMaker (http://www.softmaker.com/english/ofl_en.htm)
So my point is that solutions exist to many of the problems in Linux software. One need only look for plugins or different alternatives.
Actually, dscape won’t be in 2.6.21, it is in -mm right now but still needs some work. bcm43xx in -rc5 is still using softmac. But that’s not to detract from the fantastic work the bcm43xx guys have done. There have been a couple of major patches pushed through that significantly improved 4311 support since 2.6.20.
Since you’re running -rc5, try the latest cumulative patch from Larry Finger (one of the bcm43xx devs) here: ftp://lwfinger.dynalias.org/patches
It will make it into the final release, but in the meantime, you should notice better overall performance with it.
I’m gushing a little bit because of the sheer frustration I’ve gone through with wifi support for this little bastard of a chipset from Broadcom. With 2.6.21 and the latest patches, I’ve actually got better performance with the native drivers than with the Windows ones under ndiswrapper. Not bad considering that 6 months ago the driver couldn’t even deal with PCIe, let alone get decent performance from the chipset.
Back to our regularly scheduled thread…
The UNIX file structure groups files by functionality. This allows for improved security and performance. For example, you can mount /tmp and /var with a noexec option, and /usr and /etc can be mounted read-only, while /usr/local can be mounted read-write if you want users to be able to install files there. This can improve security if you prevent file systems from being both writable and executable. (In my example, /usr/local would be vulnerable since it is both writable and executable; you would have to trust some other mechanism, like user permissions, to prevent attacks against this file system.) Having separate file systems also prevents much of the disk fragmentation that is still a common occurrence in Windows. Placing the executable files and the temporary files that the executable needs in the same directory means that every patch is likely to fragment the binary files.
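Those mount options are even visible from userland. Here is a minimal Python sketch, assuming a Linux host (the os.ST_* constants are Linux-specific), that reports the flags for a few standard mount points:

import os

def mount_flags(path):
    # f_flag carries the mount options of the filesystem backing `path`.
    f = os.statvfs(path).f_flag
    flags = []
    if f & os.ST_RDONLY:
        flags.append("ro")
    if f & os.ST_NOEXEC:
        flags.append("noexec")
    if f & os.ST_NOSUID:
        flags.append("nosuid")
    return ",".join(flags) or "rw,exec"

for p in ("/", "/tmp", "/usr", "/var"):
    print(f"{p:8} {mount_flags(p)}")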
I will concede that the UNIX system is harder to understand, but the file system structure is not arbitrary and it does offer benefits. IMO, the key is to have a good package manager and a simple rule like ‘install everything without a package in /usr/local’.
The UNIX approach has the significant advantage of your files always being findable under /home/username. In Windows, you have to know which drive letter contains the ‘Documents and Settings’ folder in order to find ‘My files’ via the file system.
I see there are a whole lot of replies that say “Well, Linux has X package manager” or “Linux has Y package manager”, when the point is not that there needs to be a better package manager or that it is hard to manage packages; the point is that one should not even be needed. If I want to install an application in OS X, I drag it to my Applications folder. Don’t want it anymore? I drag it to the trash. EVERYTHING for the app is within the .app. Sure, I can use apt or portage or whatever you want me to use, but in the end, after all those tools do their little thing, I’ve still got the entire application spread across 8 different folders. That is a nightmare as far as system management goes, because package managers do break.
Consistency and organization are great, and if those two things were adopted, it would make life much easier for everyone, esp. poor techs like me who have to manage these systems.
Yes, it is easy to drag apps to the Applications folder, but you seem to forget the steps you need to take before that, i.e. search the Internet for the app, determine whether the site where you found it can be trusted, download it, and unstuff it, until you finally can do the easy, simple step of dragging it to your Applications folder. The package manager will help you with all these steps.
Then you have the problem that not all the MacOS X apps go into the Applications folder. There are plenty of apps in /usr/bin, /bin and /sbin that, just like their Linux siblings, are split up with dependencies elsewhere in the file system. If you want to remove or change one of these, Apple have left you totally on your own. If all these apps had been installed in folders of their own, there would have been hundreds of such folders that would be hard to manage without some kind of database support (i.e. a package manager).
Not to mention that it would have serious performance implications.
Using package managers is a great improvement over just dragging files to the Applications folder. When the application is installed there is no reason for the user to search the file system for it. Where the files go is irrelevant to the user.
On the other hand, Apple do have a point in hiding files the user doesn’t need to know about (such as /bin/*, /usr/bin/*…). I wish Linux did the same. At least in Gnome this is already possible directly out of the box, and it would be a great idea to have it turned on by default. Advanced users who for some reason need to see these files are much more likely to know how to unhide them.
It should not matter what the underlying filesystem hierarchy is. We already have a kind of “every application is in its own folder” approach: the Start menu.
(Or taken to the extreme, the /Programs/TheBestAppEver approach _could_ be achieved by a big pile of symlinks, if really needed, but…)
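For what it’s worth, the symlink-farm version really is almost trivial. A toy Python sketch (the /Programs tree, the app name and the file list are all invented, and it writes under /tmp so it is safe to run):

import os

def expose(app, files, root="/tmp/Programs"):
    # Present the app as one folder of symlinks to its real FHS locations.
    appdir = os.path.join(root, app)
    os.makedirs(appdir, exist_ok=True)
    for real in files:
        link = os.path.join(appdir, os.path.basename(real))
        if not os.path.islink(link):
            os.symlink(real, link)

expose("AbiWord", ["/usr/bin/abiword", "/usr/share/abiword"])
print(os.listdir("/tmp/Programs/AbiWord"))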
Why should the user need to know where the _files_ for a certain application are? For starting the program she/he needs to remember where it is in the Start menu/Kicker/whatnot. And even better, just remember some part of the program’s name and SomeAppLauncher(TM) will find it for her/him. Or maybe just type “text”, “write” or “document” and it will offer a collection of programs for those tasks.
Those things are already invented. “No need to know internals of the file system to start programs”: check. OK, looking good, starting to sound like an operating system I want to use. The only weak spot currently is the ‘config’ spot. We are just one centralized config system away from never needing to access the file system manually. Any volunteers here up to the task? Google SoC 2008 anyone? Elektra maybe (http://elektra.g4ii.com/)?
The suggested XML part sounds good (human readability), and so does the key/value ideology (a unique key for each configurable item). I would add to that some sort of indication of whether the user has modified the value, with a timestamp of when it was done, and storage of all previous values to keep track of what has been done to that config variable from day 1 (this eases troubleshooting of upgrades and user-made mistakes). By default, an upgrade would not override user-modified values.
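That history-keeping store is maybe twenty-five lines of Python. A minimal sketch of the idea (the key names are invented, and a real system would persist this to disk rather than keep it in memory):

import time

class ConfigStore:
    def __init__(self):
        self.entries = {}  # key -> list of {"value", "time", "user_set"}

    def set(self, key, value, user_set=True):
        # Append, never overwrite: the full history stays available.
        history = self.entries.setdefault(key, [])
        history.append({"value": value, "time": time.time(), "user_set": user_set})

    def get(self, key):
        return self.entries[key][-1]["value"]

    def user_modified(self, key):
        return any(e["user_set"] for e in self.entries.get(key, []))

    def upgrade_default(self, key, new_default):
        # An upgrade only wins if the user never touched the value.
        if not self.user_modified(key):
            self.set(key, new_default, user_set=False)

cfg = ConfigStore()
cfg.set("editor/font", "Monospace 10", user_set=False)  # packaged default
cfg.set("editor/font", "DejaVu Sans Mono 11")           # user override
cfg.upgrade_default("editor/font", "Monospace 11")      # ignored: user wins
print(cfg.get("editor/font"))                           # DejaVu Sans Mono 11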
All we need now is a generator that takes OSNews/Slashdot/KdeGnomeWars as input and outputs code, so that the ratio of talk to actual code would lean more toward the code side.
And talking about adopting innovations, are we? Well, how about this: right-clicking any application in the Start menu (or any menu used to start applications) gives options like ‘Configure’, ‘Uninstall’, ‘Upgrade’ (displayed only if an upgrade exists), and ‘Find similar’ (which searches for other similar programs based on the meta information given for each application, e.g. music,player -> XMMS, XMMS2, AlsaPlayer, YouGotThePoint, …).
In an almost perfect world, ‘Upgrade’ would offer the option of installing the upgraded version alongside the current version, to allow e.g. beta testing new features without finding yourself repeating the mantra “If it’s not broken, don’t fix it!” – if you happen to have the syndrome where you just can’t keep your fingers off the new cool stuff (been there, done that with a few KDE major updates, way before official distro releases).
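The ‘Find similar’ part, at least, needs nothing exotic. A toy Python sketch (all app names and tags invented for illustration):

# Match applications by the metadata tags their packages declare.
APPS = {
    "XMMS":       {"music", "player"},
    "AlsaPlayer": {"music", "player"},
    "KWord":      {"office", "word-processor"},
    "AbiWord":    {"office", "word-processor"},
}

def find_similar(app):
    tags = APPS[app]
    return [name for name, t in APPS.items() if name != app and tags & t]

print(find_similar("XMMS"))     # ['AlsaPlayer']
print(find_similar("AbiWord"))  # ['KWord']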
reduz complains about packaging difficulty, and I agree with that. Several times I have wanted to install some .tgz-released software on Debian in the Debian way. And since no official .deb exists, I wanted to create the .deb myself. But I have never gotten to the end of the process; several times I have started with “This time I will finish it”, but at some point it just gets ugly and I give up. I don’t want to fill in any dependencies, I don’t want to fulfill the Debian rules. I just want to mess up my system in a slightly controlled manner, instead of a plain make install. Because most makefiles don’t include an uninstall target. And I might want to uninstall it someday. Maybe I should seriously consider CheckInstall and forget the package managers.
On UI consistency: setting the page layout in OO, AbiWord or KWord, or creating/managing tables, gives me a headache. In OO and KWord, the page layout is under the Format menu; Abi has it in the File menu. Abi and OO insert a table from the Table menu, but in KWord it’s from Insert. All are good places, and intuitive maybe (and the ideology that having several ways for one task is bad is, I guess, the reason KWord does not also have the option under a Table menu). Good examples of the lack of “standards”. Shouldn’t fd.o have the final word on how I should insert a table?
As to the part about Windows keeping everything in Program Files being mostly a good thing:
It’s not.
I don’t keep everything in Program Files.
I store my executables on a separate partition.
Why? Because I add and remove programs repeatedly. In general, the file space used for that is likely to grow over time.
Since Windows system and temporary files tend to grow over time as well, I don’t want the hassle of having to worry about when “root” will fill up and the system will fail to run correctly.
So I separate functions, as they should be. The system is in one place, applications are in another.
Windows wants to slam Program Files on the C: drive – and most people let it. I install all my stuff – except idiot programs that insist on being installed to C: and don’t even give you the option of changing it – to a separate directory on a separate partition.
The UNIX/Linux way is perfectly understandable (well, almost perfectly.) The Linux Filesystem Hierarchy is a STANDARD that most people follow and some people ignore. It specifies what should go where and WHY. There is nothing wrong with spreading binaries in multiple locations around the system – as long as you know WHY a binary is located where it is.
Linux developers are not good at that, but that is not an issue with how things should be done.
If you want to complain about something, complain about how both Windows and Linux tend to have short, cryptic names – and occasionally long cryptic names – for directories under their main system directories. It would be nice if an end user could easily tell what each directory was for by the NAME (or better yet, by metadata that would pop up when the name is clicked on.)
Let’s face it. System programmers – and programmers in general – are NOT usability experts. They shouldn’t be allowed to develop interfaces or name anything. They should be allowed to IMPLEMENT under constraints and that’s it!
How are installing into “Program Files” and installing onto a separate volume mutually exclusive?
What I’d love to see in the Windows world is for none of the software vendors to touch the Windows directory; there is absolutely NO need for them to copy files into it, place libraries in it or anything.
If these companies *feel* the need to bundle *.dll’s with their applications, may I politely suggest that they put all their crap in *one* directory, rather than sprawling it across the hard disk, and worse still, overwriting system files with their crap because they didn’t use some gray matter when writing the application.
The stupidity of Windows programmers is almost the equivalent of someone creating an application for Linux, bundling glib, gtk, libcurl, gettext etc., and then overwriting a whole whack of files in one shot because the idiotic, half-witted programmer didn’t use his brain.
How about Windows fixing problems like *that*? And is MacOS X immune to stupidity? Explain the crap sprawled from one end of the hard disk to the other in the form of configuration files; great, you delete the application, but then you’re plagued with a list of application configuration files chewing up megs for no reason.
As for the Linux/*NIX structure, it varies from distro to distro, but on Linux, /usr and / are for files relating to the system, and /usr/local is for user-installed files.
With that being said, I prefer the Solaris method of putting user-installed files into /opt, separating them completely from the system altogether.
What I don’t get is that even when Linux proponents, Linux developers, and Linux writers point out its flaws, it’s met with 200+ replies about how the guy has no idea what he’s talking about, he’s biased, he’s retarded, etc…
Does Linus himself have to write the critical article for you guys to accept it?
@corrosive23
I modded you down… and I don’t think I should have.
There is lots wrong with Linux: codecs, commercial-quality games, open-source 3D support, DVD authoring software, etc. These are solid things that Linux is definitely not perfect with.
But the issues raised around clear leadership, directory structure, licensing and package management are not clear cut. There is no right or wrong, only large grey areas and alternative solutions. I actually think the article is bad at presenting any point of view, so this thread has become another mudslinging thread.
You, on the other hand, make no points. Linus is just one opinion among many, albeit a respected one. GPL3 is just one example of how many people disagree with Linus, myself included.
Where does VS8 keep its main executable? Obviously it’s under its Program Files folder _somewhere_, but nowhere totally obvious. In the end I abandoned the search and opened the file from inside VS, but there was still this niggling doubt saying “on my Linux box I wouldn’t have had to know, it would have been in the path. Easy.”
Right click the icon, go to properties & it is in the first tab.
One of MANY things that Microsoft got WRONG in Windows is the central registry. Another is DLL hell. Those, as much as anything, are the core weaknesses of their OS and why it degrades into something with bad performance and crashes.
You can say that developers used it wrong. Well they can only use something wrong if you let them. MS did. They should have fixed that.
The Right Way is to have each application have its own folder with everything it needs to run except for core systems (including the GUI standards).
“One of MANY things that Microsoft got WRONG in Windows is the central registry. Another is DLL hell. Those, as much as anything, are the core weaknesses of their OS and why it degrades into something with bad performance and crashes.”
————-
DLL hell? Are you serious? I haven’t encountered that since Windows 98, and even then only very rarely. Very few applications for Windows bother with shared libraries any more (since hard drive space is ludicrously big now, and pretty cheap); almost everything is contained in the folder it installs to. Even the registry is getting less and less used by apps these days – Windows apps are starting to become more like OS X, where you can drag and drop the application folder anywhere and run it from there.
Often, programs that do use the registry automatically regenerate their entries, so you can “reinstall” an app by dragging it into a folder and running it.
Probably half the apps I have installed don’t touch the registry, and almost none of them (except for MS Office) use shared libraries.
I have seen the ‘Apple’ computers and they lack functionality; kindergarten keyboards and mice, give me a break.
I would use Windows Vista before I would use an Apple Mac, or whatever you want to call this botched-up OS.
The Linux distros have something Apple does not: a user base, and constant improvements and new features. Linux distros are the mainstay and the future, not some locked-down, expensive-hardware cartoon desktop called Apple……
All this “Linux (or Unix) needs to be this or that in order to succeed” is irrelevant. Unix is a family of operating systems with a long and rich heritage. Windows and MacOS (which has Unix underneath) are culturally fairly different beasts. We Unix people use it because for us it all (/etc, /usr, package managers, shared libs, etc.) just makes sense. And to us Unix guys, Windows seems extremely illogical, inflexible, slow and dumbed down… If Windows or MacOS is what you want, just buy it and be happy!
One thing that really annoys me is that some people seem to have an obsession with turning Linux into some cheap Windows clone. Why should this be done, I wonder? What is the point of that?
As for software installation: I think package managers work just fine (provided the package manager is good and the packages themselves have been made properly), especially if you use a distro with a comprehensive set of packages (e.g. Debian, Gentoo or FreeBSD (OK, the last one is not Linux, but still)). Some more limited distros (like RHEL) can make life fairly difficult, as the number of packages is not sufficient.
Don’t use OS X then. (I have a PowerBook, so I can poke fun at it.)
Competition for Microsoft and Apple. Competition induces Darwinian selection, and weeds out the sick and lame.
People like the OS, so naturally they want to use it on their laptops, desktops, or whatever.
Package Management isn’t bad, especially with FreeBSD’s ports, but I do understand what the author is saying.
I actually don’t use MacOS (I don’t have a mac…) 😀 Seriously though, I have to admit that Apple has been somewhat successful in bringing “Average Joe” and Unix worlds closer together than anyone else. Nevertheless, I believe I would not want to switch… To me Apple and Windows are quite alien systems.
Here is the kind of configuration I’m used to:
My window manager is Fvwm 2.4. I use Pager to navigate between my 16 desktops (four workspaces called “Misc”, “Net”, “Code” and “Docs”, each having four desktops (2×2)), one menu for the most frequently used apps (NOT for ALL apps, because that would make the menu structure too deep and cluttered; most apps are started from the command line or a “run command” dialog), and of course xclock in one corner. I use xmessage to display a text file containing pending tasks and notes or anything I want… (and one menu entry for quickly editing the text file). I also wrote one extra app (with Motif) that offers me a “Run command” dialog. This and lots of xterm windows, Emacs, Vim, Xpdf (sometimes KPDF), Firefox, Lynx, xine, mplayer and xmms are basically all the desktop apps I need. Somehow I think I would find it rather difficult to adjust my working habits to a Mac or Windows environment, just as a Mac/Windows user could probably not use my config… Ah well, to each his own I guess…
Competition for MS and Apple… That is a good reason to develop user-friendly systems, but cloning Windows and Mac is not the way to do it… Instead of trying to reimplement the MS Start menu, MS Explorer, MS Installer (MS Linux…), we should develop a consistent, good and original system. I think innovative is the appropriate buzzword…
Okay… this part confuses me.
“and also like Microsoft, they’re too afraid to change, sticking to out-dated methods in the name of backward-compatibility or inertia or both.”
I always thought Linux companies/communities were much LESS concerned about backwards compatibility (BTW, does ‘backward-compatibility’, without the ‘s’, sound like a swearing term to anyone else?).
I’m not sure what he means by this.
I don’t even want to read the article. Sorry. If the author had more of a sense of the nature of the Linux phenomenon, he wouldn’t have used such a title.