International workshop on empathic computing

Neural Prediction of the User's Mood from Visual Input

Abstract

Affect-adaptive systems adapt their behavior according to the user's affective state. In many cases, this affective state must be detected in a non-obtrusive way, i.e., through sensing that does not require the user to provide explicit input to the system, e.g., via video sensors. However, user affect recognition from video is typically tuned to detect instantaneous emotional states rather than longer-term, more stable affective states such as mood. In this paper, we propose a non-linear computational model that bridges the gap between the emotions of a person recognized from video and the person's overall mood. For the experimental validation, emotions and mood are human annotations on an affective visual database that we created for this purpose. Based on features describing the peculiarities of and changes in the user's emotional state, our system predicts the corresponding mood well above chance and more accurately than existing models.
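To make the described pipeline concrete, the sketch below shows one plausible way per-frame emotion estimates could be summarized into features describing the emotional state and its changes, and then mapped to a mood label by a small non-linear model (here a one-hidden-layer MLP). The emotion and mood label sets, the specific feature choices, and the network size are assumptions for illustration only; the abstract does not specify the authors' exact architecture.

```python
# Illustrative sketch only: the feature set and the tiny MLP below are assumptions,
# not the authors' published model.
import numpy as np

EMOTIONS = ["anger", "disgust", "fear", "happiness", "sadness", "surprise", "neutral"]
MOODS = ["negative", "neutral", "positive"]  # hypothetical mood classes

def sequence_features(frame_probs: np.ndarray) -> np.ndarray:
    """Summarize a (T, 7) sequence of per-frame emotion probabilities into a
    fixed-length vector describing the emotional state and its changes."""
    mean = frame_probs.mean(axis=0)                              # average emotional state
    std = frame_probs.std(axis=0)                                # variability per emotion
    deltas = np.abs(np.diff(frame_probs, axis=0)).mean(axis=0)   # frame-to-frame change
    return np.concatenate([mean, std, deltas])                   # 21-dimensional feature vector

class TinyMLP:
    """A minimal one-hidden-layer non-linear predictor (forward pass only)."""
    def __init__(self, n_in: int, n_hidden: int, n_out: int, seed: int = 0):
        rng = np.random.default_rng(seed)
        self.W1 = rng.normal(scale=0.1, size=(n_in, n_hidden))
        self.b1 = np.zeros(n_hidden)
        self.W2 = rng.normal(scale=0.1, size=(n_hidden, n_out))
        self.b2 = np.zeros(n_out)

    def predict(self, x: np.ndarray) -> int:
        h = np.tanh(x @ self.W1 + self.b1)    # non-linear hidden layer
        logits = h @ self.W2 + self.b2
        return int(np.argmax(logits))         # index of the predicted mood class

if __name__ == "__main__":
    # Fake per-frame emotion probabilities for a 100-frame clip. The weights are
    # untrained, so this only demonstrates the data flow, not a real prediction.
    rng = np.random.default_rng(1)
    probs = rng.dirichlet(np.ones(len(EMOTIONS)), size=100)
    model = TinyMLP(n_in=3 * len(EMOTIONS), n_hidden=16, n_out=len(MOODS))
    print("predicted mood:", MOODS[model.predict(sequence_features(probs))])
```

In practice, the feature extractor would consume the output of an off-the-shelf facial emotion recognizer applied to each video frame, and the non-linear predictor would be trained on the human mood annotations mentioned in the abstract.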
