PAD-based multimodal affective fusion

Abstract

The study of multimodality is less developed for affective interfaces than for their traditional counterparts. However, one condition for the successful development of affective interface technologies is the availability of frameworks for real-time multimodal fusion. In this paper, we describe an approach to multimodal affective fusion that relies on a dimensional model, Pleasure-Arousal-Dominance (PAD), to support the fusion of affective modalities, with each input modality represented as a PAD vector. We describe how this model supports both affective content fusion and temporal fusion within a unified approach. We report results from early user studies that confirm a correlation between measured affective input and user temperament scores.
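The abstract represents each input modality as a vector in the three-dimensional PAD space. As an illustration only, the sketch below fuses per-modality PAD vectors with a confidence-weighted average and applies an exponential decay toward the neutral state as a stand-in for temporal fusion; the weighting rule, the decay form, and the `half_life` parameter are assumptions for illustration, not the paper's actual fusion method.

```python
import numpy as np

def fuse_pad(vectors, weights=None):
    """Fuse per-modality PAD vectors into one PAD state.

    vectors: sequence of (pleasure, arousal, dominance) triples,
             one per input modality.
    weights: optional per-modality confidence weights.

    Uses a simple confidence-weighted average -- an illustrative
    assumption, not the fusion rule from the paper.
    """
    v = np.asarray(vectors, dtype=float)          # shape (n_modalities, 3)
    w = np.ones(len(v)) if weights is None else np.asarray(weights, dtype=float)
    return (w[:, None] * v).sum(axis=0) / w.sum()

def temporal_decay(pad, dt, half_life=2.0):
    """Decay a fused PAD state toward neutral (0, 0, 0) over dt seconds.

    Exponential decay with a hypothetical half-life -- a placeholder
    for the temporal-fusion step described in the abstract.
    """
    return np.asarray(pad, dtype=float) * 0.5 ** (dt / half_life)
```

For example, fusing a facial-expression reading `(0.6, 0.4, 0.1)` weighted 0.7 with a vocal reading `(0.2, 0.8, -0.3)` weighted 0.3 yields the PAD state `(0.48, 0.52, -0.02)`.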

