Animatronic figures provide key show effects in the entertainment and theme park industry by simulating lifelike motion and sound. Interactive, autonomous animatronic systems are needed to create engaging and compelling experiences for guests. Within the general framework of human-robot interaction, animatronic figures must identify guests and recognize their state during dynamic interactions in order to be accepted and effective as socially interactive agents. This work presents the design and implementation of an interactive, autonomous animatronic system in the form of a tabletop dragon, and compares guest responses in its passive and interactive modes. The purpose of this research is to create a platform for validating autonomous, interactive behaviors in animatronics, using both quantitative and qualitative analyses of guest response. The dragon's capabilities include a four-degree-of-freedom head, moving wings, tail, and jaw, blinking eyes, and sound effects. Human identification has been implemented using a depth camera (Carmine from PrimeSense), open-source middleware (NITE from OpenNI), the Java-based Processing environment, and an Arduino microcontroller, allowing the system to track one or more guests within the camera's field of view. Details of the design and fabrication of the dragon model, the development of the algorithm for interactive autonomous behavior using the vision system, the experimental setup, and implementation results under different conditions are presented.
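As a rough illustration of the sensing pipeline described above, the sketch below shows one plausible way to connect these components in Processing. It assumes the SimpleOpenNI wrapper (a common Processing binding for OpenNI/NITE, not confirmed by the paper); the serial protocol, port index, and mapping ranges are illustrative placeholders, not the authors' implementation.

```java
import SimpleOpenNI.*;
import processing.serial.*;

SimpleOpenNI context;  // OpenNI/NITE wrapper driving the PrimeSense Carmine
Serial arduino;        // serial link to the Arduino that drives the servos

void setup() {
  size(640, 480);
  context = new SimpleOpenNI(this);
  context.enableDepth();   // start the depth stream
  context.enableUser();    // start NITE user tracking (older SimpleOpenNI
                           // versions take a skeleton-profile argument here)
  arduino = new Serial(this, Serial.list()[0], 9600);  // port index is setup-specific
}

void draw() {
  context.update();
  image(context.depthImage(), 0, 0);  // show the depth map for monitoring

  int[] users = context.getUsers();   // IDs of guests NITE currently sees
  if (users.length > 0) {
    PVector com = new PVector();
    if (context.getCoM(users[0], com)) {  // guest's center of mass, in mm
      // Map the guest's horizontal position to a 0-180 degree head-pan command.
      // The +/-1000 mm range is a placeholder for calibrated values.
      int pan = (int) constrain(map(com.x, -1000, 1000, 0, 180), 0, 180);
      arduino.write(pan);  // one byte per frame; the Arduino maps it to a servo
    }
  }
}
```

Tracking the guest's center of mass rather than full skeleton joints would keep the head-tracking behavior responsive even before NITE's pose calibration completes; the abstract does not specify which NITE tracking output the authors actually use.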