Facial expressions and emotions are two major processes underlying human behavior. They allow humans to focus their internal resources on relevant elements in the environment and to evaluate the emotional significance of those elements. Moreover, multidisciplinary evidence shows that these two processes interact extensively in the human brain. In fields such as human-computer interaction and artificial intelligence, computational models of either expressions or emotions have been developed for inclusion in cognitive agent architectures; however, the interaction between expressions and emotions has barely been modeled. In this paper, we propose a computational model of the interaction between facial expressions and emotional behavior. The model is designed to provide intelligent agents with adequate mechanisms to attend and react to emotions in the environment. Our simulations show that the proposed model helps endow virtual agents with more realistic behavior.