Many current interfaces for tablets with pen + touch input force users to reach for device-centric UI widgets at fixed locations, rather than sensing and adapting to the user's posture.
To address this problem, we propose sensing techniques that transition between various nuances of mobile and stationary use via postural awareness. These postural nuances include shifting hand grips, varying screen angle and orientation, planting the palm while writing or sketching, and the direction from which the hands approach the screen.
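To make the idea concrete, here is a minimal sketch of what a posture-aware placement policy might look like. Everything here is hypothetical: the `Posture` fields and the `place_toolbar` rules are illustrative inventions, not the sensing pipeline or logic from the actual work.

```python
from dataclasses import dataclass
from enum import Enum

class Side(Enum):
    LEFT = "left"
    RIGHT = "right"
    NONE = "none"

@dataclass
class Posture:
    """Hypothetical fused sensor state mirroring the postural nuances above."""
    grip_side: Side       # which edge the holding hand grips, if any
    tilt_degrees: float   # screen angle (unused in this toy policy)
    palm_planted: bool    # palm resting on the screen while writing
    approach_side: Side   # direction the free hand approaches from, if sensed

def place_toolbar(p: Posture) -> str:
    """Pick a screen edge for a tool palette based on posture.

    A toy policy: keep widgets clear of the planted palm, put them
    under the thumb of a gripping hand, and otherwise meet the
    approaching hand.
    """
    if p.palm_planted:
        return "top"                   # writing: move tools clear of the palm
    if p.grip_side is not Side.NONE:
        return p.grip_side.value       # mobile grip: tools under the thumb
    if p.approach_side is not Side.NONE:
        return p.approach_side.value   # stationary: meet the approaching hand
    return "bottom"                    # flat on a table, no grip: default
```

A real system would also need hysteresis so the palette does not jump around as sensor readings flicker, which is exactly the intuitive-versus-infuriating tension discussed below.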
The video demonstrates some genuinely useful techniques, but as always, the devil is not just in the details but also in the implementation. Nothing shown in the video seems particularly hard to build with current technology, yet UI elements that move around based on how you hold or interact with the device can be either wonderfully intuitive or downright infuriating.