Linked by Thom Holwerda on Thu 25th Oct 2007 16:52 UTC
Graphics, User Interfaces: This is the first article in a series on common usability and graphical user interface related terms. On the internet, and especially in forum discussions like we all have here on OSNews, it is almost certain that in any given discussion someone will bring up usability and GUI-related terms - things like spatial memory, widgets, consistency, Fitts' Law, and more. The aim of this series is to explain these terms, learn something about their origins, and finally rate their importance in the field of usability and (graphical) user interface design. We start off with spatial memory - my personal favourite.
Thread beginning with comment 280836
RIchard James13
Member since:
2007-10-26

If you are not using a mouse but a touch screen instead, does Fitts' Law still apply? I wouldn't think so, because the mouse pointer stops moving when it reaches the edge of the screen, whereas your hand does not. I would think that with both mouse and touch-screen interfaces the size of the widgets is more important than Fitts' Law anyway. Consistent spatial layout would come before Fitts' Law as well: who cares if a button is hard to reach if the button is not in the same place every time anyway?
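
For reference, Fitts' Law models the time to hit a target as T = a + b * log2(D/W + 1), where D is the distance to the target, W is its width along the direction of motion, and a and b are constants fitted per device and user. A minimal Python sketch (the constants below are made up purely for illustration) shows why the screen edge matters for a mouse: the pointer stops there, so an edge target behaves as if it were far wider than it is drawn, while on a touch screen the finger does not stop and only the real widget size counts.

# Fitts' Law sketch: T = a + b * log2(D / W + 1).
# The constants a and b are invented for illustration only; real values
# come from fitting measured movement times for a given device and user.
import math

def fitts_time(distance, width, a=0.1, b=0.15):
    # Predicted movement time (seconds) for a target of the given width
    # (measured along the direction of motion) at the given distance.
    return a + b * math.log2(distance / width + 1)

# A 20 px wide button in the middle of the screen, 800 px away:
print(round(fitts_time(800, 20), 2))    # ~0.9

# The same button pushed against the screen edge, reached with a mouse:
# the pointer cannot overshoot the edge, so the effective width is huge.
print(round(fitts_time(800, 2000), 2))  # ~0.17

# On a touch screen the finger does not stop at the edge, so the
# effective width stays at the real 20 px - the point made above.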


hobgoblin Member since:
2005-07-06

I don't know if any real-world data was collected on it, but I recall that in the '94 Winter Olympics the computer systems were supposed to be touch-screen controlled.

And I kind of agree with you: the problem with the mouse is that we do not have awareness of its location at all times. Hell, I know I have the habit of spinning it in circles when I need to find it before I do anything.

With our hands we have built-in awareness; that's why we can, in theory, reach out of view and still have a good idea of where our hands are.

I also wonder what a general feedback system on a mouse would be like - say, having the left button jump each time one moves over the edge of a UI element.
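
A rough sketch of how that could be wired up, assuming a hypothetical pulse() callback that stands in for whatever feedback the hardware could give (no real mouse driver exposes such a call): keep the previous pointer position and fire the pulse whenever the pointer crosses a widget boundary.

def inside(rect, x, y):
    # rect is (left, top, width, height) in screen coordinates.
    left, top, width, height = rect
    return left <= x < left + width and top <= y < top + height

def on_pointer_move(widgets, prev_pos, new_pos, pulse):
    # Fire the (hypothetical) feedback pulse once for every widget whose
    # bounding box the pointer has just entered or left.
    px, py = prev_pos
    nx, ny = new_pos
    for rect in widgets:
        if inside(rect, px, py) != inside(rect, nx, ny):
            pulse()

# Example: one 100x30 button at (50, 50); moving onto it triggers feedback.
on_pointer_move([(50, 50, 100, 30)], (10, 10), (60, 60),
                pulse=lambda: print("bzzt"))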

Right now, our sense of where the mouse is is based too much on sight. We can't use hearing, smell, touch, or anything like that to tell where the mouse is in relation to other things.
