The general idea is that Ubuntu could become more aware of its physical surroundings, with the user interface responding to how the user behaves in front of his display. This would rely on cameras, proximity sensors, or both, and the guys at Canonical have already come up with some interesting use cases.
For instance, a video playing in a window could switch to fullscreen when the user leans back from his display. A simple hand gesture or nod could reveal the launcher, or, looking a bit further into the future, windows could tilt when the user isn't properly facing the screen (on 3D displays, obviously).
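Purely as an illustration of how such a prototype might start out (this is not Canonical's code), here's a minimal sketch assuming OpenCV and a webcam: the apparent width of a detected face is used as a crude stand-in for distance, and the threshold and the toggle_fullscreen() hook are made-up placeholders.

```python
# Sketch of the "lean back to go fullscreen" idea: a smaller face in the
# webcam image is taken to mean the user is further from the screen.
import cv2

LEAN_BACK_THRESHOLD = 120   # face width in pixels; purely illustrative

face_cascade = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml")
capture = cv2.VideoCapture(0)

def toggle_fullscreen(enabled):
    # Placeholder: a real prototype would talk to the video player here.
    print("fullscreen" if enabled else "windowed")

fullscreen = False
while True:
    ok, frame = capture.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    faces = face_cascade.detectMultiScale(gray, 1.3, 5)
    if len(faces) > 0:
        # Take the widest detected face as the user closest to the camera.
        widest = max(w for (x, y, w, h) in faces)
        leaned_back = widest < LEAN_BACK_THRESHOLD
        if leaned_back != fullscreen:
            fullscreen = leaned_back
            toggle_fullscreen(fullscreen)
```

A Haar cascade is obviously a blunt instrument compared to proper depth or proximity sensing, but it's enough to show how little plumbing the basic idea needs.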
The most interesting use case, as far as I'm concerned, is to automatically display notifications in a large font (fullscreen, preferably) when the user is not sitting in front of his computer, perhaps because he's on the couch or walking around. This way, notifications can be read from afar. Giordano already experimented with such a feature back in 2006.
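In the same hypothetical vein, routing a notification to a large fullscreen rendering could hinge on nothing more than how recently a face was last seen. The sketch below assumes a presence timestamp fed by a face-detection loop like the one above, uses notify-send (standard on Ubuntu) for the ordinary case, and the fullscreen banner is an invented placeholder.

```python
# Sketch of the "readable from the couch" notification idea: fall back to
# a large fullscreen rendering when nobody has been seen for a while.
import subprocess
import time

ABSENCE_TIMEOUT = 10.0          # seconds without a detected face; arbitrary
last_seen = time.monotonic()    # would be refreshed by the detection loop

def user_is_present():
    return time.monotonic() - last_seen < ABSENCE_TIMEOUT

def show_fullscreen_banner(title, body):
    # Placeholder: render the text fullscreen in a very large font.
    print(f"[FULLSCREEN] {title}: {body}")

def notify(title, body):
    if user_is_present():
        # Ordinary desktop notification when the user is at the screen.
        subprocess.run(["notify-send", title, body])
    else:
        show_fullscreen_banner(title, body)

notify("New mail", "You have one unread message")
```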
A quick prototype has already been put together demonstrating all of this.
Interesting stuff, this.