Hand gestures are essential for efficient communication with increasingly complex interactive service robots. A novel approach is presented in this thesis for understanding hand gestures by associating concurrent gesture components with domain knowledge. Ambiguous or noisy hand gestures are corrected using the overall meaning of the gesture phrase, shifting the traditional reliance on potentially complex syntactic analysis toward semantic analysis and arriving at interpretations that make sense in context.

Gestures are represented as sets of concurrent gesture primitives, which improves scalability and reduces the complexity of training and recognition compared with conventional whole-hand approaches. These low-complexity gesture primitives are synthesized into concepts and associated with a knowledge base using an approximate graph matching technique to determine their overall meaning. Experimental results demonstrate that this approach to hand gesture understanding successfully disambiguates noisy hand gestures in sequences of three to five gestures, yielding an average accuracy improvement of 27% over gesture recognition without understanding.
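The knowledge-based disambiguation step can be illustrated with a minimal sketch. All names, the graph encoding, and the overlap-based scoring rule below are illustrative assumptions, not the thesis's actual matching algorithm: concept graphs over primitive labels are scored against a noisy observation, and the best approximate match supplies the interpretation.

```python
# Hypothetical sketch of approximate graph matching for gesture
# understanding. A graph is a (nodes, edges) pair: nodes are
# gesture-primitive labels, edges are (a, b) tuples encoding
# relations between primitives. The scoring rule is illustrative.

def match_score(observed, concept):
    """Fraction of the concept's nodes and edges present in the observation."""
    obs_nodes, obs_edges = observed
    con_nodes, con_edges = concept
    node_hits = len(set(obs_nodes) & set(con_nodes))
    edge_hits = len(set(obs_edges) & set(con_edges))
    total = len(con_nodes) + len(con_edges)
    return (node_hits + edge_hits) / total if total else 0.0

def interpret(observed, knowledge_base):
    """Return the knowledge-base concept whose graph best matches the observation."""
    return max(knowledge_base, key=lambda name: match_score(observed, knowledge_base[name]))

# Toy knowledge base with two concepts (made-up primitive labels).
kb = {
    "point_at_object": ({"extend_index", "arm_raise"},
                        {("arm_raise", "extend_index")}),
    "wave_hello":      ({"open_palm", "arm_raise", "oscillate"},
                        {("arm_raise", "oscillate")}),
}

# A noisy observation: the "oscillate" primitive was missed by the
# recognizer, but the relational edge still points to "wave_hello".
noisy = ({"open_palm", "arm_raise"}, {("arm_raise", "oscillate")})
print(interpret(noisy, kb))  # → wave_hello
```

Even with a dropped primitive, the partial node and edge overlap lets the semantic match recover the intended meaning, which is the intuition behind correcting noisy gestures from the overall meaning of the phrase.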