In June 2009 we had some very good news about the integration of multitouch event support in the Linux kernel. Since then, many multitouch device drivers have been developed, mostly in collaboration with LII-ENAC, to take advantage of it. All of that work was kernel-based, however, and multitouch support needs more components in the stack before it works out of the box. Canonical became interested in providing the user experience needed for multitouch by developing a new gesture engine that recognizes a grammar of natural hand gestures and provides them upstream in the stack as new events.
Several other components of the uTouch framework standardize the events gathered from devices, since some devices do finger tracking in hardware while others do not. With the help of many people from the community, a protocol is being discussed on the xorg-devel mailing list and should be ready for Maverick.
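To see why standardization matters, consider a device that reports only anonymous contact coordinates each frame. A minimal sketch of what finger tracking involves is matching each new contact to the nearest touch from the previous frame so that downstream consumers always see consistently numbered fingers. The function below is an invented illustration of that idea, not actual uTouch code:

```python
# Hypothetical sketch: assign stable touch IDs to raw (x, y) contacts
# from a device whose hardware does not do finger tracking itself.

def track(prev, contacts, next_id):
    """prev: dict of id -> (x, y) from the last frame.
    contacts: list of anonymous (x, y) points in the current frame.
    Returns (dict of id -> (x, y), next free id)."""
    assigned = {}
    remaining = dict(prev)
    for (x, y) in contacts:
        if remaining:
            # Match this contact to the closest touch from the last frame.
            best = min(remaining,
                       key=lambda i: (remaining[i][0] - x) ** 2 +
                                     (remaining[i][1] - y) ** 2)
            assigned[best] = (x, y)
            del remaining[best]
        else:
            # More contacts than before: a new finger went down.
            assigned[next_id] = (x, y)
            next_id += 1
    return assigned, next_id
```

Touches present in `prev` but unmatched in the current frame are simply dropped, which corresponds to a finger lifting off. Real trackers are more careful (velocity prediction, hysteresis), but this captures the core of the problem the framework solves once so that every device behaves the same upstream.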
Mark Shuttleworth described the gesture grammar on his blog: “Rather than single, magic gestures, we’re making it possible for basic gestures to be chained, or composed, into more sophisticated ‘sentences’. The basic gestures, or primitives, are like individual verbs, and stringing them together allows for richer interactions.” More information about this can be found in this Google document.
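The verb-and-sentence idea can be sketched in a few lines: primitives arrive one at a time, and a composer recognizes registered sequences of them as richer compound gestures. The class and gesture names below are invented for illustration; they are not the actual uTouch interfaces:

```python
# Hypothetical illustration of composing gesture primitives ("verbs")
# into compound gestures ("sentences"), in the spirit of the grammar
# described above. Not the real uTouch API.

class GestureComposer:
    def __init__(self):
        self._sentences = {}   # tuple of primitives -> compound name
        self._buffer = []

    def define(self, primitives, name):
        """Register a sequence of primitives as a compound gesture."""
        self._sentences[tuple(primitives)] = name

    def feed(self, primitive):
        """Feed one primitive; return a compound gesture name if a
        registered sentence has just been completed, else None."""
        self._buffer.append(primitive)
        # Check whether any suffix of the buffer completes a sentence.
        for length in range(len(self._buffer), 0, -1):
            tail = tuple(self._buffer[-length:])
            if tail in self._sentences:
                self._buffer.clear()
                return self._sentences[tail]
        return None

composer = GestureComposer()
composer.define(["tap", "drag"], "tap-and-drag")
composer.define(["pinch", "rotate"], "pinch-rotate")
```

Feeding `"tap"` then `"drag"` would yield the compound `"tap-and-drag"`, while each primitive on its own returns nothing. A real engine would also attach timing and spatial constraints to each step, but the chaining structure is the point here.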
The stack will make hardware vendors’ jobs easier: it does finger tracking and delivers a generalized gesture recognition system. As most components of the stack are already available, licensed under the GPLv3 and the LGPLv3, Ubuntu can become the standard platform for instant multitouch application prototyping and use.
Multitouch support has been tested on hardware such as N-Trig digitizers, but it works on some multitouch touchpads as well. Multitouch device manufacturers can contact the uTouch developers to find out whether their hardware is already supported or still needs some minor tweaking.