Linked by David Adams on Wed 24th May 2006 04:08 UTC
Editorial It's conventional wisdom that computers need to be "easier to use." But do they? More reliable, yes. Easier to troubleshoot, yes. But now that so many people use computers so much, I think there's something to be said for making them less easy-to-use and less intuitive.
Deus Ex Machina
by transami on Wed 24th May 2006 15:47 UTC
Member since:

Okay, maybe you do an about-face by the end of your article -- I admit I haven't finished it. But let's be clear: the opening is about the worst piece of "RTFM" illogic I have ever read.

Until computer interaction is as easy as having a conversation with a fellow (and reasonably rational) human being, they are not easy enough. PERIOD.

Any familiarity with the way computers work now is but a crutch on the road to the eventual personification of our automata.

Reply Score: 5

RE: Deus Ex Machina
by Bnonn on Wed 24th May 2006 22:15 UTC in reply to "Deus Ex Machina"
Bnonn Member since:

It sounds like what you're looking for is a person, not a computer. Why should a computer have to act like a person? Do we expect any other piece of equipment to be anything other than what it is? Any complex piece of technology requires some knowledge and training to use. That's the nature of technology. I don't /want/ to talk to my computer. I don't /want/ to utter some semantically correct, even colloquial English sentence to get it to copy files. I want a reliable, consistent method that I know will work without the computer having to interpret my voice, words, and accent.

I think you're being unreasonable and illogical. Sorry. Perhaps you could explain why you think computers are not "easy" enough if you can't talk to them like a normal person. I can think of some improvements I would make to my computer if I could, but a verbal interface, or any system which avoids me needing to be consistent, is not among them.

Reply Score: 3

RE: Deus Ex Machina
by Luke McCarthy on Thu 25th May 2006 08:53 UTC in reply to "Deus Ex Machina"
Luke McCarthy Member since:

Until computer interaction is as easy as having a conversation with a fellow (and reasonably rational) human being, they are not easy enough. PERIOD.

I dunno, I'm sure the 'stereotypical geek' finds it easier to use a computer than to hold a conversation. ;-) It's not like the goals are the same. Computers are more than just communication.

The personal computer is by far the most unreliable piece of equipment in the modern household or workplace.

Tell that to my kettle that blew up! In all seriousness, software reliability is a difficult problem, but it will not be realised by the current IT industry or 'mainstream' open-source. To quote Hoare, for reliability simplicity is an absolute prerequisite. Simplicity is HARD.

Reply Score: 1

by palodequeso on Wed 24th May 2006 15:56 UTC
Member since:

I understand that computers may currently suffer flaws resulting from making things too user-friendly. Think about it: the higher-level your code, the more layers of libraries you tend to use, and the more room for error creeps in. But I disagree that we need computers that are not user-friendly. The more people we have using computers, the more information they can gather and use to get things done more efficiently, and then they can move on to spend time on more important things, like family, or perhaps even furthering technological innovation.

Reply Score: 1

by ple_mono on Wed 24th May 2006 15:56 UTC
Member since:

I agree on some of the points, but not all.

I guess the Ruby language's "Principle of Least Surprise: things work the way you would expect them to" applies here. Well, it would be nice if they did, anyway...

Another example is KDE. It does what you would expect it to, but you CAN do it another way (often more elegant) if you choose to. It still allows you to do it the easy way, or the "alternate" way.

I think OO programming makes complex yet easy application GUIs easier to design, IF you choose to. But I still think you need to make GUIs sort of intuitive without dumbing them down.

Also, I like how apps (especially in KDE, but GNOME as well) have a habit of doing things the same way, if you see what I mean. It sets a standard, and common shortcuts are easy to remember.
Windows apps have a tendency to NOT work the same way, which is too bad.

Edited 2006-05-24 15:59

Reply Score: 2

RE: standards
by devurandom on Thu 25th May 2006 14:17 UTC in reply to "standards"
devurandom Member since:

Ssssssssh! How about the lives of those linux bashing fellows that always scream on forums "TEH LINSUX INTERFACE IS NOT CONSISTENT"? Don't give them a pain! ;)

Reply Score: 1

"Easy to use" isn't a bad thing.
by rcsteiner on Wed 24th May 2006 16:13 UTC
Member since:

However, I think the common desktop computer should be made either (1) more difficult to alter, or (2) far easier to recover to a known stable/secure state.

Put the core OS and various utilities in ROM, and let the user add applications on disk all they want. OS updates are sold on ROMs that you swap in for your existing ones.

Alternatively, build disk imaging software into the BIOS (or some other ROM) in the machine, and allow for one-touch backups and system restores.

Right now, a computer is a meta-machine which is almost completely vulnerable to the whims of the uneducated public, and in an interconnected computer world that is a fairly dangerous thing...

Reply Score: 2

macisaac Member since:

sort of like Commodores in the 80s? actually, I kind of like that idea; imagine what a system of today with its OS burned into ROM would be like from a stability, security, and convenience standpoint. instant on and instant off, and no way for intruders to hack the core system.

Reply Score: 1

hobgoblin Member since:

hmm, i recall making similar posts on usability topics here on osnews before and getting virtual stones thrown at me ;)

to me it appears as if the home computer would be better off if it was based on a collection of parts that could talk together, but could do their individual tasks separately.

the computer itself would only be a kind of connections box that allows the different devices to talk together. plug in a scanner and you can scan images; plug in a printer and you can turn on both and have a copier. plug in a keyboard and you can write books and maybe do spreadsheets. plug in a modem and you can do web and mail. slap a normal numeric keypad on top of that modem and you can even do faxes: just hit scan on the scanner, then hit the number and "call"...

the gui would be generated by the connections box, but the individual devices would handle the "computing". and many basic functions would be available as buttons on the different devices themselves.

ok, so the power user would scream murder. but this device is not for the power user. the already existing PC is for the power user ;)

with xml-based filetypes, these two systems could share files without problems.

want a games console with this setup? a dvd player plus a console computing device, complete with controller ports. so when a new storage format comes out, you can add the player for it into the mix, and the game makers can use it for the console from day one ;)

the trick is a rich communications protocol between the devices, instead of drivers. kinda like how usb today gets more and more "profiles" built in: pictbridge, usb storage, usb hid. it's all there. now throw in an xfree-like gui language (xml based).

say that when a new device comes online, the "computer" asks for a device icon and has it transferred. then this is displayed on screen. select the icon and the computer asks the device for its "gui": basically an xml-based layout of buttons, menus and whatnot. from there you can access all the functions normally associated with said device.

to take my scanner+modem=fax above: the scanner scans, and then stores said data in a predefined format in its own ram. it also broadcasts this to every other device. now it tells the modem to dial a fax number. when the modem reaches the fax, it starts to send the data from the scanner, and the scanner purges the data from its ram when the task is completed.

if you instead want to fax a stored file: dig the file out of whatever storage device it's on, put it up for grabs, then dial the fax. print it? put it up for grabs and get the printer going ;)

in many ways, the problem with the computer today is too much reimplementation. each device or hardware addon has its own drivers rather than relying on common standards. even worse is when companies take a common standard and extend it without releasing said extensions for potential review and implementation into a revised standard.

if so, then one could have each device transmit what version of the different standards it supports, and the others could from that know what functions are not supported.
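One way to picture this "rich communications protocol" idea: the connections box receives an XML GUI description from a device and decides what controls to draw. A minimal Python sketch follows; the `<device-gui>` schema and every name in it are invented for illustration, since no such standard exists.

```python
import xml.etree.ElementTree as ET

# A hypothetical GUI descriptor a scanner might broadcast to the
# "connections box". The schema is made up for this sketch.
DEVICE_GUI = """
<device-gui name="scanner" icon="scanner.png">
  <button id="scan" label="Scan"/>
  <button id="copy" label="Copy to printer"/>
  <menu label="Quality">
    <option value="draft"/>
    <option value="photo"/>
  </menu>
  <supports profile="pictbridge" version="1.0"/>
  <supports profile="usb-storage" version="1.1"/>
</device-gui>
"""

def render_controls(xml_text):
    """Return the controls and capability versions a device advertises."""
    root = ET.fromstring(xml_text)
    buttons = [b.get("label") for b in root.findall("button")]
    menus = [m.get("label") for m in root.findall("menu")]
    # Version info lets other devices know which functions are unsupported.
    profiles = {s.get("profile"): s.get("version") for s in root.findall("supports")}
    return buttons, menus, profiles

buttons, menus, profiles = render_controls(DEVICE_GUI)
print(buttons)    # ['Scan', 'Copy to printer']
print(profiles)   # {'pictbridge': '1.0', 'usb-storage': '1.1'}
```

The point of the sketch is only that the box needs no scanner driver: everything it draws comes from the device's own declarative description.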

Reply Score: 1

Re: Disagreement
by peejay on Wed 24th May 2006 16:18 UTC
Member since:

The more people we have using computers the more information they can gather and use to get things done more efficiently and then they can move on to spend time on more important things like family...

I'm not sure that anyone has ever in the history of computers been able to go home early because they finished their work quicker.

On the contrary, computers make you easily accessible and therefore decrease your free time.

Reply Score: 2

RE: Re: Disagreement
by ThawkTH on Wed 24th May 2006 16:42 UTC in reply to "Re: Disagreement"
ThawkTH Member since:

I think there are some examples. Particularly for students.

Writing a paper that must be typed. Typewriter, or computer? I do think, particularly with huge papers, computers have sped things up.

If one knows what they're doing (I'll concede most don't) the internet has made research and fact checking tremendously easier.

I don't think the flaw is computers themselves, only their implementation. Most people simply don't know enough to get the most out of working with them. SO many people spend so much time figuring out how to change the font or switch to landscape that they could've just handwritten the stupid thing in the first place ;)

Moral of the story is: people should make it a point to know what's going on. They never will. Therefore, we either shut a huge population out (companies will never do this), or try to make things simultaneously easy, featureful, and powerful.

Reply Score: 2

I agree.. to an extent.
by naelurec on Wed 24th May 2006 16:22 UTC
Member since:

The author makes a good point that the the full capability of a computer is not realized and as a result, productivity suffers.

Ultimately, I believe this comes down to personal desire to achieve higher levels of proficiency and productivity. Perhaps this requires a shift in mindset. Many computer user activities are highly repetitive. If users could recognize this behavior, they might start to look for solutions (shortcuts, macros, regular expressions, scripting, etc.), though I don't see this happening. Without a solid knowledge of how an operating system functions, acquiring these more advanced skillsets is difficult or even impossible.
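To make the repetition point concrete, here is a toy sketch of the kind of chore a regular expression absorbs: normalizing a batch of inconsistently named report files. The filenames and naming pattern are invented for illustration.

```python
import re

# Files named by hand, with inconsistent casing and spacing.
files = ["Report JAN 2006.doc", "report Feb 2006.DOC", "REPORT mar 2006.doc"]

def normalize(name):
    """Rewrite 'Report MON YYYY.doc' variants into one canonical form."""
    m = re.match(r"report\s+([a-z]{3})\s+(\d{4})\.doc", name, re.IGNORECASE)
    if not m:
        return name  # leave anything unrecognized alone
    month, year = m.group(1).lower(), m.group(2)
    return f"report-{year}-{month}.doc"

print([normalize(f) for f in files])
# ['report-2006-jan.doc', 'report-2006-feb.doc', 'report-2006-mar.doc']
```

A user who never recognizes the repetition renames each file by hand; one who does writes this once.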

As far as reliability of the PC goes, computer hardware and software can be quite reliable. Server-grade hardware with built-in redundancy running a solid OS can provide many years of reliable service. I know of systems in this category that have been running 24 hours a day for 7+ years and are still operational, with minimal maintenance issues.

The problem is that very few people want to pay for this level of reliability. Low cost and familiarity are higher priorities. I believe priorities are significantly different when buying a computer vs a car: *generally* people will not attempt to fix their cars, nor have the local kid in the neighborhood attempt to fix them. Essentially, people get what they pay for (in terms of both monetary investment and time investment).

Reply Score: 3

Power vs. simplicity is a false dichotomy
by DHofmann on Wed 24th May 2006 16:23 UTC
Member since:

"will the computing world truly be served by catering to their needs at the expense of everyday computer users?"

This implies that power and simplicity are mutually exclusive. Here's a counterexample: TiVo. It's much more powerful than a VCR, and yet is much easier to use to record your favorite programs.

It takes creativity to make something simpler without sacrificing power, but it can be done.

Reply Score: 5

Mystilleef Member since:

Well said. For years I advocated this stance, until I broke down and started writing my own applications. It's unfortunate that most developers feel powerful systems need to be hard to use, complex, and equipped with intimidating interfaces. I often need to point to Google to annihilate this distasteful myth.

Reply Score: 2

AnalystX Member since:

Precisely. The author doesn't take into account the nature of intelligent automation. Simple is good. One button that forces you to do a dozen things when six things are all you want done is bad. We want our lives more automated in order to save us time, but we strive to have that automation be as intelligent as we are.

Your example of VCR versus DVR is perfect. I know my ReplayTV has saved me hundreds of hours over the course of its use (a couple of years) because it's able to make some decisions for me. The key to the future of computing in general is designing computers not only to mirror the decisions humans make, but also to learn how humans make those decisions.

Reply Score: 2

hobgoblin Member since:

bayesian filters and similar?

didn't microsoft work on that? and wasn't one of the results from that clippy?
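For readers unfamiliar with the idea, the Bayesian filtering being alluded to can be sketched in a few lines: score a user's words by how often they appeared in past "needs help" versus "doing fine" sessions. The vocabulary and counts below are invented for illustration.

```python
from math import log

# Made-up word counts from past sessions where the user did / didn't need help.
help_counts = {"error": 8, "undo": 5, "help": 9, "print": 2}
fine_counts = {"error": 1, "undo": 2, "help": 1, "print": 7}

def score(words, counts, total):
    # Sum of log-probabilities, with add-one smoothing so an unseen
    # word doesn't zero out the whole score.
    return sum(log((counts.get(w, 0) + 1) / (total + len(counts))) for w in words)

def needs_help(words):
    h = score(words, help_counts, sum(help_counts.values()))
    f = score(words, fine_counts, sum(fine_counts.values()))
    return h > f

print(needs_help(["error", "help"]))   # True
print(needs_help(["print"]))           # False
```

A classifier like this at least tries to infer *what* the problem is from evidence, which is exactly what Clippy, as discussed below, was criticized for not doing.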

Edited 2006-05-25 13:14

Reply Score: 1

AnalystX Member since:

Clippy is a poor example of intelligent design. Its biggest disadvantage is that Microsoft created it. There are two kinds of examples, at either end of the intuitive/intelligent scale.

On the more intuitive end, extending the desktop/folders/files metaphor to include tabbed file/web browsing. This has saved an untold amount of time for a lot of people. You couldn't pull off anything like the efficiency of tabbed browsing in a CLI.

On the more intelligent end, Stanley, Stanford's entry into (and winner of) the 2005 DARPA Grand Challenge, is a great example of a computer making decisions that a human would normally have to make.

Hopefully, those in the computer industry will see the same vision for augmenting everything from lifestyles to mundane tasks. If done right, general computing will be more like using Star Trek's LCARS, and everything else will be specialized in much the same way lawyers, doctors, carpenters, and accountants are. Personally I'd like to see robot lawyers. Justice might be better served, and if it isn't, we can recycle them into something more useful.

Reply Score: 1

hobgoblin Member since:

i think the real problem with clippy was that it seems they didn't try to detect what the user's problem was and give a simple list of suggestions. they only tried to detect if a person had a problem and then ask if he wanted help. so yes, it's a poor example.

however, i'm not sure that "stanley" is a good example either, as that was a purpose-built device for a very specific kind of problem. an intelligent computer interface must be able to handle an increasing list of jobs, and understand what each user wants to do...

it's the classical "do what i want you to do, not what i tell you to do".

Reply Score: 1

AnalystX Member since:

Of course Stanley was purpose built. That was part of my point. General computing won't need to exist for any purpose other than to provide us with information. That's why I gave LCARS as an example. Everything else will be purpose built, just like us humans are purpose driven. We don't need computers to actually be us, just do the things we don't want to do. Keep in mind that humans that are "jack-of-all-trades" are rarely experts at anything. I think people have pushed general computing too far in that direction. Computing does not need to become any more general.

Reply Score: 1

by tomcat on Wed 24th May 2006 16:29 UTC
Member since:

While I applaud the author for exploring such a contrarian subject, I think that his portrayal/understanding of operating systems is a bit oversimplistic. An OS isn't the shell that the user uses. At its base, an OS is really nothing more than a boot loader, a kernel, and device drivers. It doesn't get more rudimentary or difficult to use than that. And since nearly every OS offers a tiered model (ie. driver <- kernel <- user app), by definition users can make things as complicated (or simple) as they like. They simply choose the tier that they feel comfortable working with.

Nonetheless, he's right about the need for better organizational paradigms for the massive collections of music, images, and other data that users are accumulating. The desktop concept is getting pretty old. Yes, it's still useful. And, yes, it still gets the job done. And, no, I'm not advocating that we drop things in favor of a 3D interface. But we should consider alternate ways of organizing data; for example, timeline-based indexing, network dependency graphs, etc. The tough thing is that people process information in very different ways, so what works for one person may be completely incomprehensible to another. Trying to shoehorn everyone into the same paradigm has reduced us to using the least common denominator and has basically stifled the evolution of alternate indexing schemes. Organizations such as Microsoft, Gnome, and others need to become more brave in developing alternate shells.

Reply Score: 1

RE: Ummmm...
by hobgoblin on Thu 25th May 2006 13:21 UTC in reply to "Ummmm..."
hobgoblin Member since:

so, the best would be a kind of database where each "file" is stored with a host of predefined data, but which also allows the user to add more data?

and then have a search system that allows you to basically filter these files based on all this "metadata"?

sounds somehow familiar...

give me every file created or modified between date x and date y, sorted by increasing time. ignore all file-types except text files.
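That query is already expressible against an ordinary filesystem's built-in metadata (timestamps, extensions); a richer metadata store would just add more filterable fields. A minimal Python sketch, with the directory and date range as placeholders:

```python
from pathlib import Path
from datetime import datetime

def text_files_between(root, start, end):
    """Text files modified between start and end, sorted by increasing time."""
    hits = [
        p for p in Path(root).rglob("*.txt")
        if start <= datetime.fromtimestamp(p.stat().st_mtime) <= end
    ]
    return sorted(hits, key=lambda p: p.stat().st_mtime)

# e.g. text_files_between("/home/me", datetime(2006, 1, 1), datetime(2006, 5, 24))
```

The limitation, and the argument for a metadata database, is that here "text file" can only mean "ends in .txt"; user-added tags would need somewhere to live.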

network dependency graphs?

Reply Score: 1

Less intuitive?
by jrichey_98 on Wed 24th May 2006 16:35 UTC
Member since:

I can mildly see where the author is coming from, but there is never an instance in an OS where being less intuitive would help it.

It's a lot easier to explain and operate something when, on being told how it works, you think 'that makes sense'. Admittedly I couldn't finish the article, but he's going about things the wrong way.

Adding features doesn't have to make things harder to use, for goodness' sake, or by any means 'less intuitive'. A person should, like in my favorite browser, be able to choose what features he wants to use (extensions) and have his own OS behavior, but not a harder, less intuitive one.

Reply Score: 1

You are all just a bit unfair
by alcibiades on Wed 24th May 2006 16:46 UTC
Member since:

He's making some worthwhile points.

1) Graphical wizards when endlessly nested aren't necessarily easier than text based tools.

2) The desktop/folder metaphor is very attractive to new users but it may be, compared to a proper file manager, a hindrance to understanding.

3) What sells new users in the shop display may drive experienced users crazy if they are compelled to use it.

4) Maybe the industry obsession with ease of use for new users is getting it wrong. His example of video remote controls is very interesting. I would include texting on mobiles as a similar example - as far from conventional human interface design and ease of use as you can get, but it took off like a bomb.

5) The most interesting thing is that he is a Mac user. Now, it's not the first time I have heard of Mac users throwing out the dock, clearing the desktop, and going to an empty desktop and a file manager when they actually need to manage files. Is it possible that Mac users, having been the first to embrace the desktop metaphor, are also the first to come to the end of it?

What I found when introducing naive users to Windowmaker is that it was surprisingly well accepted. No desktop icons except program icons, multiple desktops, use a file manager to find your files. All totally contrary to the Human Interface guidelines which were inspired by Xerox and Apple 20 years ago. And yet the universal reaction (including from old ladies of 70 with computer phobia) was 'of course I can use this'.

Makes one think.

Reply Score: 4

RE: You are all just a bit unfair
by jaylaa on Wed 24th May 2006 18:17 UTC in reply to "You are all just a bit unfair"
jaylaa Member since:

Another example I'd like to give is LaTeX. Oh, the pain I went through the first few times I tried to write with it. But it gets easier every time. And faster.

Now I'm at the point where programs that make it easier for newcomers, like LyX or Scientific Workplace, are actually a hindrance to me, and I'm glad I don't have to use them (unfortunately I didn't have those programs when I was new at it).

The point I'm trying to make is (well, the same as the article and everyone else who got the article): what is easy for the inexperienced can be a hindrance to the experienced. In the case of LaTeX, it's the fortunate circumstance that there's something for both. But for operating systems, if it weren't for alternate OSs and window managers and 3rd-party hacks, experienced users would be forced to put up with the 4-button remote, to use the author's analogy.

Though the article seems to imply that you can't have easy for the beginner and efficient and functional for the experienced at the same time. I think you can, just not with the same interface. Which is why having different desktop environments on the same OS is so great.

And think about this: what if we all were forced to use Windows or OSX with absolutely no tweaks? Would so many people on a tech site such as this be clamoring for easy to use systems that their grandmother who's never seen a computer could use, or would they too be asking for more advanced systems that let them work efficiently without wizards, balloons, clippys and other eye-candy?

Anyway, yeah people are just flaming the article as if he just wants computers to be hard. He doesn't want them to be hard, he just doesn't want to lose more functionality.

Reply Score: 4

The Anti Mac UI
by stew on Wed 24th May 2006 16:47 UTC
Member since:

Written in 1996 - I recommend the paragraph "Expert Users".

Reply Score: 3

RE: The Anti Mac UI
by alcibiades on Wed 24th May 2006 18:18 UTC in reply to "The Anti Mac UI"
alcibiades Member since:

Nice link, great article. Very thought provoking, and very early to see it all.

Reply Score: 1

Confusion reigns
by Cloudy on Wed 24th May 2006 17:20 UTC
Member since:

The author is not, despite the title, arguing for making computers "more difficult". He is, rather, confusing, as many people before him have, ease of learning with ease of use.

It's what Apple missed when they reimplemented the Xerox interfaces on the Mac. Originally, they missed it because they didn't have room for it, but once left out, it got lost, and is only slowly being restored.

The features of any system that are easiest to use change for you as you practice with the system. The casual user needs features that are low on the learning curve, so that they don't have to invest much effort in learning them to do the small amount of work they intend to do. The serious user needs features that streamline their workflow, but the law of requisite complexity guarantees that these features will be difficult to learn.

The trick is to apply both. Modern GUIs come a long way towards that by having keyboard shortcuts and intelligent context-based menus, but they fall short by not having reasonable scripting.

Reply Score: 4

by ronaldst on Wed 24th May 2006 17:50 UTC
Member since:

"It appears our database has momentarily gone down.

Please refresh the page or check back soon."

Still can't get to page 4.

Reply Score: 1

A moot point
by Pseudo Cyborg on Wed 24th May 2006 18:46 UTC
Pseudo Cyborg
Member since:

The author really makes a moot point. A computer can be as "difficult" or as "easy" as the user wants to make it. Power Users can use the shortcuts they want and disable things as they see fit for themselves. Novices can continue to have their hands held.

It's the beauty of option. Not everyone is the same.

DHofmann also makes a very good point in the comment above:
"[The article] implies that power and simplicity are mutually exclusive. Here's a counterexample: TiVo. It's much more powerful than a VCR, and yet is much easier to use to record your favorite programs.

It takes creativity to make something simpler without sacrificing power, but it can be done."

That was the most well-said idea on the subject and a perfect example of something done quite well.

Reply Score: 2

by j-s-h on Wed 24th May 2006 18:56 UTC
Member since:

I refuted the article.

Reply Score: 2

RE: Refutation
by starnix on Thu 25th May 2006 16:46 UTC in reply to "Refutation"
starnix Member since:

Just playing devil's advocate here, but read the following paragraph.

"A more reasonable example: Is it stupid to expect the computer to save the work automatically? Imagine Bob, a person new to computers, who types for half an hour or so, assuming that the program automatically preserves his work; then, when closing the program, he misinterprets the "do you want to save?" dialog as "do you want to save the last few minutes of work?" and answers no. Interfaces need automatic save, which would make it easier for both acclimated and unacclimated users."

If Bob misinterprets the dialog as "do you want to save the last few minutes of work?" and answers no, then automatic saving would be doing EXACTLY what he doesn't want by saving his work automatically. How is this helpful?

I sort of agree with the original poster's point of view. If people are unwilling to read important dialogs and end up losing their work, then screw them. It isn't a flaw in the design of the OS or application.

Reply Score: 1

RE[2]: Refutation
by j-s-h on Tue 30th May 2006 11:41 UTC in reply to "RE: Refutation"
j-s-h Member since:

When the user doesn't want the last changes, the save dialog is completely unnecessary, because undo works much better.
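The autosave-plus-undo model being described can be sketched in a few lines: every edit is persisted immediately, and "discarding" work becomes an undo step rather than a save dialog. Names here are illustrative only.

```python
class Document:
    """Toy document where state is always persisted and undo replaces 'don't save'."""

    def __init__(self):
        self.text = ""
        self.history = []          # undo stack of previous states

    def edit(self, new_text):
        self.history.append(self.text)
        self.text = new_text       # autosave: the current state is always kept

    def undo(self):
        if self.history:
            self.text = self.history.pop()

doc = Document()
doc.edit("half an hour of work")
doc.edit("a few stray minutes")
doc.undo()                         # drop only the last change; nothing else is lost
print(doc.text)                    # half an hour of work
```

Under this model Bob's misreading is harmless: answering "no" to nothing, because there is no dialog, and reverting a few minutes of work are both just undo.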

Reply Score: 1

There is no "one size fits it all"
by oggy on Wed 24th May 2006 19:07 UTC
Member since:

I don't think the default GUIs of Windows or Mac OS X or Linux should get less intuitive, but they should give the power user more choices for customizing the GUI. In my view, KDE is a good positive example of that: you can just use it like it is, or customize it until it has nearly nothing in common with the default look and feel. I personally use the ion3 wm with KDE (that is, the KDE apps), which allows you to use the keyboard for everything. All in all, I think the vendors should distinguish more between power users and beginners; for my mom, Windows is still far too unintuitive.


Reply Score: 2

by Dave_K on Wed 24th May 2006 19:50 UTC
Member since:

Personally I've never had a problem with UIs being dumbed down, as long as they're still customisable. Software with a simple default UI for new users can still provide access to more advanced features. They may be a little more hidden, but surely that's not really a problem for more experienced users who are happy to look through menu options and preferences? It seems like a small price to pay for an interface that's much more approachable for novices.

The web browser Opera is a good example. Look at all the criticism of its UI a few years ago; many people switching from simpler browsers hated how "cluttered" the UI was, and reviews regularly complained about the complexity. With more recent versions the UI has been heavily simplified, with most of the features hidden by default. Despite that, no features have actually been removed; they're still easily discoverable for people who care to look, and as the UI is quick and easy to customise, experienced Opera users can bring back the features they use in seconds.

Another example is the remote control for my TV/DVD recorder. On its surface it just has a small number of large buttons to provide quick access to the most common options. You flip it open to access more advanced recording and tuning features that generally aren't in regular use. To me, that kind of design is a perfectly good compromise when creating an interface for a device that has a large number of functions.

Of course in an ideal world products would be intuitive enough not to need that kind of compromise, but when dealing with something that's complex and feature rich, that's a very difficult challenge.

Reply Score: 2

All journeys begin with a first step
by RGCook on Wed 24th May 2006 21:54 UTC
Member since:

The author puts a lot of effort and thought into this article to make the point - I think - that access to functionality is limited by dumbing down the interface rather than having the interface match the level of power the machine inherently provides someone with the intelligence to use it. But therein lies the flaw in this logic. Intelligence and knowledge of computers are different things. We all learn based on abstraction of applied experience. Faced with a new situation, we immediately try to make sense of it based on prior experience. Even the most gifted person, when faced with a computer for the first time, will look at it and say, "What's this?" Then you will need to train them how to use it. In time, their knowledge will grow and they can apply their intelligence to solve problems, and create great works with it. But the access to this power starts at a fundamental level and the machine must afford an interface to its power that is accessible on all levels.

This is a good example of another OSNews expert forgetting that he/she is not the average user. And while the requisite reference to Joe Sixpack is ironically included, Joe wouldn't want to have a beer with the author of this article. Maybe Linus or Bill G would though.

Reply Score: 3

RE: Deus Ex Machina
by ma_d on Wed 24th May 2006 22:20 UTC
Member since:

Human dialogue, especially in English, is incredibly imprecise, slow, and complex to master. It would be a terrible way to interface with a computer under important and serious contexts.

Sure, it'd be nice to dictate your paper (which you can already do with Dragon and the like). And it'd be nice to talk to bring up your e-mail; but I dare you to write an SQL query this way... And if you think that's hard (because SQL has remarkably spoken-language-like semantics), try to write a C program by speaking..

Or better yet, tell your computer where to find a document. "Yea, that's in slash a-q-f-g capital X slash boborama that's b-o-b-o-r-a-m-a no capitals and it's named jill dot text t-x-t." Of course, the computer can algorithmically narrow your choices and allow you to not be terribly precise and this is fine; right up until precision is exactly what you need and no amount of guessing is going to get it.

The interesting part of comparing computer interaction to conversing with your fellow comes into effect when you consider that conversation is horribly inaccurate and usually misunderstood... Take the differences between men and women (typically), for one! How cliché is it for men to complain that women read into what they say, because the woman is reading his speech pattern and body language too, and the man is pretending that's unimportant. Shall computers read body language too? Will they try to calm you down when they see you're angry and keep you from deleting all your files because they "know you wouldn't ever want to do that"?

This comes down to a basic argument over whether computing is apt as a tool or as a fellow worker. Obviously I see it as a tool and you see it as a worker. It's working as a tool right now; the worker part is researched a lot (although maybe not as much by proportion as it was 30 years ago) in AI labs.

The last thing I'd ever want to see is being required to talk to my computer. Now, having it read my mind, that's much more intriguing.

Reply Score: 4

Deep Purple...
by Sphinx on Thu 25th May 2006 00:39 UTC
Member since:

In ages past when spells were cast
in a time of men and steel
we were taught no special things
for it was all done by feel.

Reply Score: 3

It's all in the little things
by cerbie on Thu 25th May 2006 00:45 UTC
Member since:

For basic file management, we're good. Explorer, Finder, Konqueror, and even Thunar are quite good.

I can do just about anything well in KDE, XFCE, E17, and Explorer (as of WinXP). The trouble comes in not telling me useful info (it takes no more screen space to give me MB/s, ETA per file, etc., like Konqueror does when moving/copying files), or adding too many steps (the Safely Remove Hardware wizard: need I say more?), or even using file management that only works within a small subset of applications (network shares in OS X, for instance).

All of those types of things can add minutes to simple tasks. That's without even invoking Fitts' Law!

You need to learn how to operate your computer and its applications. Some people are just stupid (as opposed to merely ignorant; this includes non-techies who buy a crappy Dell because it's half the price of a Mac, when they need a Mac), and there's no help for them. The rest of us can learn.

Streamlining does not need to come at the expense of options. An Advanced tab, or the little disclosure arrows that GNOME and OS X use, are good ways to handle it. We just need people to tell the geeks building these things, "This is stupid; it should work like this." Then get that working well before adding too many more features.

Unfortunately, it's the pretty stuff that sells at the store.

Reply Score: 1

As someone else mentioned...
by phibxr on Thu 25th May 2006 08:26 UTC
Member since:

...Ion3 can do most of the things the author asks for. It's in the Ubuntu repositories; just do 'sudo apt-get install ion3', log out, and choose Ion3 as your environment.

Make sure to read the manual when it starts, though. And if you are like me, you'll want to do 'apt-get install gtk-theme-switch' too, or else GTK programs will use the default GTK theme; at least they do here.

Reply Score: 1

things shouldn't be idiot proof
by maxmg on Thu 25th May 2006 08:35 UTC
Member since:

I'm glad to see that some people have come to the defence of the author. Not least because most of the detractors seem to be accusing him of wanting harder-to-use computers. I'd say it reads more like he'd want easier-to-use computers _once you know what you're doing_, and the LaTeX analogy above embodies this perfectly. Indeed, one of the author's points was that the user should remember that Meta-S invokes the save command, and what is simpler and more intuitive than 's for save'? (OK, I'm a Pine user, so I might not be the best judge of that anymore.)

To focus on a slightly different aspect, rather than thinking about power users: why is it that the notional computer buyer is not expected to read any manuals for their spanking-new, top-of-the-range technology (technology that is incredibly powerful and could see the naive user phished, accessing dubious material, deleting all their important data, keeping their credit card numbers in a non-secure fashion, and who knows what else), yet the same person, on buying a toaster, will get more detailed instructional material? Is burnt bread that much more threatening?

So make the interface as '(non-)intuitive' as you want, make it mouse-driven if you must, but I'd like to see people move away from the mindset that you should be able to take it home, plug it in, and be up and running within an hour even if you've never used a computer before. We don't apply that logic to any other product, so why to something as expensive, powerful, and potentially damaging as a computer?

Reply Score: 3

Member since:

Has anyone here tried to teach someone how to use a computer? I tried to teach my aunt, and that's when I realized how difficult computers are to use.

What would a nice computer experience look like? Something like this:

The user presses an easily accessible button to switch on the computer; the monitor etc. are switched on too.

A nice graphic greets you while the OS loads.

The opening screen presents a list of functions in a vertical menu that occupies the screen: write a document, find a document, play a game, etc. (each program installs its own category here).

The user selects 'find'. The computer responds with another screen: what to find? The user types "find all the documents between yesterday and today". The computer responds with a list of documents.

While the above has happened, the previous screen has been minimized with an animation to a little transparent icon at one of the screen sides.

Then the user wants to check his email. He clicks the initial screen's icon, then selects to view emails. The find screen is again minimized somewhere. The screen now shows email information. The user clicks an email. The email list is minimized while the screen is occupied with the email and some nice translucent animated options using the 3D card (à la Spore). The user clicks the 'forward' button, and a list of contacts comes up. The user selects the contacts and presses OK.

Then the user wants to see his job tasks. He goes to the first screen, selects 'tasks', and the new tasks come up. Then he proceeds to do the tasks, etc.

Now let's see reality:

The user tries to find the power button. Where is it? He pushes the monitor button, but the only thing he sees is the monitor's green power LED. He presses the button again. Then he realizes the computer is hidden under the desk. He presses its button.

A black & white screen with some strange messages comes up. He then waits while more strange messages come up. Then he sees a message about 'windows'.

Then WinXP finally boots.

The user wants to find the documents written since yesterday. He clicks 'start', but nothing happens. WinXP has not actually finished booting!

Then he clicks 'start' again after a few seconds. The start menu opens. The user sits there gazing at the marvellous invention called the 'WinXP' menu. He then clicks 'search'.

The search menu talks about finding files, folders, printers, and Outlook. It does not say you can find documents, so the user clicks away to try another way.

The user searches the start menu for 'find documents' but finds nothing. Then he realizes that a document is a file, so he goes back to 'search'. Then a little dog comes up and asks him if he wants to find 'documents'.

The user is happy to have found how to search for documents. The options say 'within last week', so he clicks that. He then enters "my vacations" in the box that says 'document name'. He then clicks 'search'.

A window comes up empty.

But the user is sure he gave the document the name 'my vacations'. What happened? 'My vacations' was the Word document's title, not the file name. The filename was 'this summer', because the user's text starts with 'this summer' and MS Word proposed that as the filename when the file was saved.
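The search the user actually wanted can be sketched in a few lines of Python: match on what the document *says*, not on what the file happens to be called. The directory and filenames below are hypothetical stand-ins for the story's 'this summer' document.

```python
import pathlib
import tempfile

# Hypothetical documents folder: the file is *named* 'this summer.txt'
# but its contents begin with the title the user remembers.
docs = tempfile.mkdtemp()
pathlib.Path(docs, "this summer.txt").write_text(
    "my vacations\nThis summer we went to the coast...")
pathlib.Path(docs, "budget.txt").write_text("rent, groceries, taxes")

# Content-based search: a filename-only search would return nothing here.
hits = [p.name for p in pathlib.Path(docs).glob("*.txt")
        if "my vacations" in p.read_text()]
print(hits)  # ['this summer.txt']
```

This is essentially what the desktop search tools that appeared around this time (Spotlight and friends) do: index contents so the filename stops mattering.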

Then the user wants to check his email. He goes to the start menu and selects 'email'. Then he wants to find emails. There isn't a search option anywhere, so he goes back to the start menu. There he sees that he cannot search emails, because the start menu says he can only search for 'files, folders, people, printers and Outlook'.

Then the user wants to check his job tasks. There is no such thing as tasks, but he remembers he has to open 'Internet Explorer'. He opens IE, and then he has to type some weird things.

To cut a long story short: computers suck because operating systems suck. Computer usage is not human-centric but machine-centric.

Reply Score: 1

hobgoblin Member since:

hmm, symphony anyone?

still, i fully agree with the troublesome interaction above.

hmm, i recall a program called Haystack that some MIT people were working on.

the main problem is, like i think the article pointed out, that the computer isn't built by one company or person, but by many.

still: with the new desktop search stuff that apple and others have introduced lately, your example about searching may be a bit flawed...

Reply Score: 1

Sphinx Member since:

I got exactly the ideal experience you described with my SPARCstation 5 and Solaris 2.6, and came very close again with an early Mac II, yet neither is the world market-share leader. The problem is not the computer.

Reply Score: 1

Ain't for everybody...
by Sphinx on Thu 25th May 2006 18:21 UTC
Member since:

I've always felt UNIX and C should only be taught on the street corner, like sex.

Reply Score: 1