Linux Gets Native Multitouch Support
Submitted by boulabiar 2009-06-15 X11 4 Comments

The Interactive Computing Lab team at ENAC in Toulouse, collaborating with Linux developers, has succeeded in bringing native multi-touch support to Linux. While Multi-Pointer X is already in the mainline X.Org server (to be released with X.Org 7.5 / X Server 1.7), there is now multi-touch support capable of handling gestures and other actions. This multi-touch support requires the Linux 2.6.30 kernel. The way it works right now is by reading the kernel's input events, translating them into multi-touch events using simple gesture recognition, and then sending D-Bus messages over to Compiz to produce multi-touch effects. The code is currently deemed just a proof of concept, but the team is working on a better implementation.
Definitely a first step, but maybe it won't take too long for this to start appearing in distros and to be considered when designing user interfaces… by then, it may also be worth it performance-wise.
I also hope this project gives us a nice, clean, and hardware-agnostic solution (not tied to some driver/hardware vendor) that could actually be used by the community, without forks and parallel projects trying to accomplish the same thing…
I remember reading something about the crapload of patents Apple had for multi-touch when they first released the iPhone. Yet it seems like everyone is implementing it. Microsoft has their "table" or whatever they call it. Google seems to have stayed away from it, however.
Is this a problem? Is Apple just waiting for the right time to sue?
Not too sure if it's the same thing, but I'm using Fedora 11 on my netbook, which has a multi-touch pad – it seems to be working in its detection of two-finger scrolling and so forth. I assume this report is of further development that is a lot more advanced than what I'm able to do now.