Read Write Web has an interesting article on the concept of the contextual user interface. A contextual user interface - as the name implies - is an interface which adapts to its current context: it changes according to the actions the user takes, presenting a minimal set of options at first and revealing others as the user goes along. While the article makes some good points, it also contains some generalisations that I find rather debatable.

The article explains that until not too long ago, we were living in a world of Windows-dominated user interfaces, which presented a standard set of user interface elements (widgets). According to the author, every application was full of these widgets, "and nothing else". He claims that the user interface world was not one to be innovative in.
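The idea of revealing options only as the user goes along is essentially progressive disclosure, and it can be sketched in a few lines of code. The sketch below is purely illustrative - the class and names are my own, not anything from the article - and it assumes a simple rule: an advanced command becomes visible only after the user has performed its prerequisite command.

```python
# Toy sketch of a contextual interface via progressive disclosure.
# All names (ContextualMenu, the commands) are illustrative assumptions,
# not taken from the Read Write Web article.

class ContextualMenu:
    def __init__(self, basic, advanced):
        self.basic = list(basic)        # options that are always visible
        self.advanced = dict(advanced)  # command -> prerequisite command
        self.history = []               # actions the user has taken so far

    def visible_options(self):
        # Reveal an advanced command only once its prerequisite was used.
        shown = list(self.basic)
        for command, prereq in self.advanced.items():
            if prereq in self.history:
                shown.append(command)
        return shown

    def perform(self, command):
        if command not in self.visible_options():
            raise ValueError(f"'{command}' is not available yet")
        self.history.append(command)


menu = ContextualMenu(
    basic=["open", "save"],
    advanced={"export-pdf": "save"},  # appears only after the user saves
)
print(menu.visible_options())  # ['open', 'save']
menu.perform("save")
print(menu.visible_options())  # ['open', 'save', 'export-pdf']
```

The interesting design question - and the one the rest of this piece circles around - is who decides which commands count as "basic" and which as "advanced".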
User interface was not the place to be innovative. It was considered unorthodox and even dangerous to present the interface in non-standard ways because everyone believed that users were, to be frank, stupid, and wouldn't want to deal with anything other than what they were used to.

He continues by saying that the recent wave of user interface innovation is proving that the users-are-stupid train of thought is losing speed. "Thanks to Apple, we have seen a liberating movement towards simplistic, contextual interfaces." The problem with this train of thought is that it blindly assumes that contextual user interfaces are, by definition, superior. And I'm not so sure of that, because contextual user interfaces pose their own problems.

Whether you present all your options up front or hide them away, neither is an ideal solution. Present your users with every option under the sun, and they're overwhelmed. Present your users with only the basics, and hide the rest away in menus and dialogs - or leave it out entirely - and you're bound to annoy the users who see their pet features tucked away in some far-away place, or worse yet, axed altogether, all in the name of simplicity. GNOME, for instance, has had some serious problems with its users over the removal of features, but even Mac OS X, which the author of the article treats as some sort of UI guidebook from heaven, suffers from this problem. Discoverability is just as much a problem in Mac OS X as it is in Windows - being overwhelmed or underwhelmed can both lead to "where the heck is my feature?" For a small application such as an audio player or a notepad it's easy to show only the most-used features - but move to anything more complicated, and you are sure to hit problems at some point.
Something like Word or PowerPoint has ten billion million different features, and millions and millions of users - how on earth are you going to determine which of those ten billion million features are the ones you want to show by default, and which are the ones you wish to hide? A common saying thrown around on the internet is that 90% of the people use only 10% of Word's features - but the problem is that those 10% are different for each individual user. So, what do you present as the few default options, and what do you hide, only to be revealed upon user request?

The other extreme isn't much better, of course, as the article explains. Presenting every possible feature directly to the user will only confuse him or her. Microsoft tried to remedy this with the ribbon interface in Office 2007, but despite the fact that I like it, the resistance to it has been fairly vocal.

The issue here is that despite its obvious advantages, the contextual user interface is, sadly, not the holy grail of user interface design. For simple applications, yes, it will probably work - but write a contextual user interface for an application that is a bit more feature-laden, and the contextual user interface itself can become highly frustrating. To me, the contextual user interface is a very welcome addition - but it is not the silver bullet for UI confusion. As always, moderation and balance are key.