Scale Detection for a priori Gesture Recognition

Gesture-based interfaces provide expert users with an efficient form of interaction, but they require a learning effort from novice users. To address this problem, some online guiding techniques display all available gestures in response to partial input. However, partial-input recognition algorithms are scale dependent, while most gesture recognizers support scale independence (i.e., the same shape drawn at different scales invokes the same command). In this paper, we propose an algorithm that estimates the scale of any partial input in the context of a gesture recognition system, and we show how this estimate can be used to improve users' experience with gesture-based systems.
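To illustrate the core idea, here is a minimal sketch of one way to estimate the scale of a partial input. This is a hypothetical illustration, not the paper's algorithm: it assumes the system knows (or hypothesizes) what fraction of a gesture template has been drawn, and estimates scale as the ratio between the drawn arc length and the arc length of the corresponding template prefix.

```python
# Hypothetical sketch, NOT the paper's algorithm: estimate the scale of a
# partial gesture by comparing its arc length against the matching prefix
# of a known gesture template.
import math

def arc_length(points):
    """Total polyline length of a stroke given as (x, y) points."""
    return sum(math.dist(a, b) for a, b in zip(points, points[1:]))

def estimate_scale(partial, template, progress):
    """Estimate the input scale, assuming `progress` (in [0, 1]) of the
    template's path has been drawn so far. Returns the ratio of the
    drawn length to the length of the corresponding template prefix."""
    # Number of template points covering `progress` of its segments.
    n = max(2, round(progress * (len(template) - 1)) + 1)
    prefix_len = arc_length(template[:n])
    if prefix_len == 0:
        return 1.0
    return arc_length(partial) / prefix_len
```

For example, if a user has drawn half of a template at twice its stored size, the estimate is 2; a guide like OctoPocus could then render its candidate-gesture previews at that scale rather than at the templates' stored size.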

This video illustrates how we used this algorithm to implement an augmented version of the OctoPocus gesture guide.