This paper introduces Gesture Engine, an animation system that synthesizes human gesturing behaviors from augmented conversation transcripts using a database of high-level gesture definitions. It presents an abstract scripting language for specifying hand-arm gestures that incorporates knowledge from sign language research, psycholinguistics, and traditional keyframe animation. A new planning algorithm instantiates and adjusts gestures according to the communicative context and to temporal constraints obtained from a speech synthesizer. The system animates an MPEG-4 compliant skeleton using Body Animation Parameters.
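To make the pipeline concrete, the following is a minimal sketch of how a planner of this kind might instantiate gestures against word timings from a speech synthesizer. All names here (`GestureDef`, `plan_gestures`, the `EMPH` tag, and the word-timing format) are hypothetical illustrations, not the paper's actual API; the paper's planner additionally handles communicative context and drives an MPEG-4 skeleton, which this sketch omits.

```python
from dataclasses import dataclass

# Hypothetical stand-in for one entry in the high-level gesture database.
@dataclass
class GestureDef:
    name: str            # gesture identifier, e.g. a beat gesture
    duration: float      # preferred stroke duration in seconds
    min_duration: float  # shortest duration the gesture may be compressed to

def plan_gestures(tagged_words, gesture_db):
    """Align each gesture stroke with its word's onset (timing assumed to
    come from a speech synthesizer) and compress any gesture that would
    otherwise overrun the onset of the next word."""
    plan = []
    for i, (word, onset, tag) in enumerate(tagged_words):
        if tag not in gesture_db:
            continue  # word carries no gesture annotation
        g = gesture_db[tag]
        # Time available before the next word onset bounds the stroke.
        if i + 1 < len(tagged_words):
            available = tagged_words[i + 1][1] - onset
        else:
            available = g.duration
        dur = max(g.min_duration, min(g.duration, available))
        plan.append((g.name, onset, onset + dur))
    return plan

# Toy transcript: (word, onset in seconds, gesture tag or None).
db = {"EMPH": GestureDef("beat", 0.6, 0.2)}
words = [("this", 0.0, "EMPH"), ("matters", 0.3, "EMPH"), ("now", 1.2, None)]
print(plan_gestures(words, db))
```

The second gesture keeps its full 0.6 s stroke because 0.9 s remain before the next word, while the first is compressed to 0.3 s to avoid overlapping it, illustrating the kind of temporal adjustment the abstract describes.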