As people respond strongly to faces and facial features, both consciously and subconsciously, faces are an essential aspect of social robots. Until recently, robotic faces and heads belonged to one of the following categories: virtual, mechatronic or animatronic. As an original contribution to the field of human-robot interaction, I present the R-PAF technology (Retro-Projected Animated Faces): a novel robotic head displaying a real-time, computer-rendered face, retro-projected from within the head volume onto a mask, together with its driving software, designed with openness and portability to other hybrid robotic platforms in mind.

The work constitutes the first implementation of a non-planar mask suitable for social human-robot interaction, comprising key elements of social interaction such as precise gaze direction control, facial expressions and blushing, and the first demonstration of an interactive video-animated facial mask mounted on a 5-axis robotic arm. The LightHead robot, an R-PAF demonstrator and experimental platform, has demonstrated robustness in both extended controlled and uncontrolled settings. The iterative hardware and facial design, details of the three-layered software architecture and tools, the implementation of life-like facial behaviours, and improvements in social-emotional robotic communication are reported. Furthermore, a series of evaluations presents the first study on human performance in reading robotic gaze, and another first on users' ethnic preferences towards a robot face.