Gestural interaction in the pervasive computing landscape
e & i Elektrotechnik und Informationstechnik, Springer-Verlag Wien, Vol. 124, No. 1-2, pp. 17-25, February 2007.
Pervasive computing postulates the invisible integration of technology into everyday objects in such a way that these objects turn into smart things. Not only is a single object of this kind supposed to represent the interface between the "physical world" of atoms and the "digital world" of bits, but whole landscapes of them are. The interaction with such technology-rich artefacts is supposed to be guided by their affordance, i.e. the ability of an artefact to express the modality of its appropriate use. We study human gesticulation and the manipulation of graspable and movable everyday artefacts as a potentially effective means of interaction with smart things. In this work we consider gestures in the general sense of a movement or a state (posture) of the human body, as well as a movement or state of any physical object resulting from human manipulation. Intuitive "everyday" gestures have been collected in a series of user tests, yielding a catalogue of generic body and artefact gesture dynamics. Atomic gestures are described by trajectories of orientation data, while composite gestures are defined by a compositional gesture grammar. The respective mechanisms for the recognition of such gestures have been implemented in a general software framework supported by an accelerometer-based sensing system. Showcases involving multiple gesture sensors demonstrate the viability of implicit embedded interaction for real-life scenarios.
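The distinction between atomic gestures (trajectories of orientation data) and composite gestures (sequences defined by a compositional grammar) can be illustrated with a minimal sketch. All names, gesture labels, and rules below are hypothetical illustrations of the general idea, not the paper's actual framework or grammar:

```python
# Illustrative sketch of a compositional gesture grammar: composite
# gestures are recognized as ordered sequences of atomic gesture labels.
# Labels, rules, and data layout are assumptions for illustration only.

from dataclasses import dataclass
from typing import List, Tuple

@dataclass
class AtomicGesture:
    """An atomic gesture: a label plus its trajectory of orientation
    samples (e.g. pitch/roll pairs derived from accelerometer data)."""
    label: str                          # e.g. "tilt_left", "shake"
    trajectory: List[Tuple[float, float]]

# A composite gesture is defined as an ordered sequence of atomic labels,
# mirroring the idea of a compositional gesture grammar.
COMPOSITE_RULES = {
    "rotate_then_shake": ["tilt_left", "tilt_right", "shake"],
    "double_shake": ["shake", "shake"],
}

def recognize_composites(stream: List[AtomicGesture]):
    """Scan a stream of already-recognized atomic gestures and report
    every composite rule that matches a contiguous run of labels."""
    labels = [g.label for g in stream]
    hits = []
    for name, rule in COMPOSITE_RULES.items():
        n = len(rule)
        for i in range(len(labels) - n + 1):
            if labels[i:i + n] == rule:
                hits.append((name, i))   # (composite name, start index)
    return hits
```

For example, an atomic-gesture stream labelled tilt_left, tilt_right, shake, shake would match "rotate_then_shake" at position 0 and "double_shake" at position 2. A real recognizer would additionally classify the raw orientation trajectories into atomic labels before this grammar stage.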