People naturally express themselves through facial gestures and expressions. Our goal is to build a facial-gesture human-computer interface for use in robot applications. We have implemented an interface that tracks a person's facial features in real time (30 Hz), requiring neither special illumination nor facial makeup. The work focuses on real-time face tracking using dedicated hardware based on template matching. Tracking by template matching suffers from changing shading and from the deformation or even disappearance of facial features. By using multiple Kalman filters we accurately predict and robustly track facial features, despite disturbances and rapid head movements, both translational and rotational. Because we track the face reliably in real time, we are also able to recognise motion gestures of the face. Our system can recognise a large set of gestures, ranging from "yes", "no" and "maybe" to winks, blinks and sleeping.
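The prediction-update cycle described above can be sketched as a constant-velocity Kalman filter per tracked feature point: the filter's prediction tells the template matcher where to search next, and the matcher's measured position corrects the filter. The state layout, time step, and noise values below are illustrative assumptions for a minimal sketch, not the paper's actual parameters.

```python
import numpy as np

class FeatureKalman:
    """One constant-velocity Kalman filter for a single facial-feature point.

    State is [x, y, vx, vy]; only position (x, y) is observed, via template
    matching. All matrix values here are assumed for illustration.
    """

    def __init__(self, x0, y0, dt=1.0):
        self.x = np.array([x0, y0, 0.0, 0.0])            # state estimate
        self.P = np.eye(4) * 10.0                        # state covariance
        self.F = np.eye(4)                               # constant-velocity model
        self.F[0, 2] = self.F[1, 3] = dt                 # position += velocity*dt
        self.H = np.eye(2, 4)                            # observe position only
        self.Q = np.eye(4) * 0.01                        # process noise (assumed)
        self.R = np.eye(2) * 1.0                         # measurement noise (assumed)

    def predict(self):
        """Advance the state one frame; returns the predicted (x, y),
        which would centre the next template search window."""
        self.x = self.F @ self.x
        self.P = self.F @ self.P @ self.F.T + self.Q
        return self.x[:2]

    def update(self, z):
        """Fuse a measured position z = (x, y) from template matching."""
        innovation = np.asarray(z, dtype=float) - self.H @ self.x
        S = self.H @ self.P @ self.H.T + self.R          # innovation covariance
        K = self.P @ self.H.T @ np.linalg.inv(S)         # Kalman gain
        self.x = self.x + K @ innovation
        self.P = (np.eye(4) - K @ self.H) @ self.P
```

Running one such filter per feature lets the tracker coast through frames where a feature is deformed or briefly occluded: the prediction alone keeps the search window near the feature until template matching recovers it.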