Humaine Association Conference on Affective Computing and Intelligent Interaction

User-centric Affective Video Tagging from MEG and Peripheral Physiological Responses



Abstract

This paper presents a new multimodal database and the associated results for characterizing affect (valence, arousal and dominance) using magnetoencephalogram (MEG) brain signals and peripheral physiological signals (horizontal EOG, ECG, trapezius EMG). We attempt single-trial classification of affect in movie and music video clips using emotional responses recorded from eighteen participants. The main findings of this study are that: (i) the MEG signal effectively encodes affective viewer responses, (ii) clip arousal is better predicted by MEG, while peripheral physiological signals are more effective for predicting valence, and (iii) prediction performance is better for movie clips than for music video clips.
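As a rough illustration of what single-trial affect classification from such recordings involves, the sketch below runs leave-one-trial-out cross-validation with a simple nearest-class-mean classifier over per-trial feature vectors. The data, labels, feature dimensions, and classifier are all hypothetical stand-ins, not the paper's actual pipeline or results:

```python
# Hypothetical sketch of single-trial affect classification; the features,
# labels, and classifier are illustrative stand-ins, not the paper's method.
import numpy as np

rng = np.random.default_rng(0)

# Stand-in features: one row per trial (e.g., MEG band power per sensor group
# plus peripheral statistics such as heart rate or trapezius EMG power).
n_trials, n_features = 40, 12
X = rng.normal(size=(n_trials, n_features))
y = rng.integers(0, 2, size=n_trials)   # binary high/low arousal labels
X[y == 1] += 0.8                        # inject a separable class effect

def nearest_mean_predict(X_train, y_train, x_test):
    """Classify one trial by Euclidean distance to per-class feature means."""
    classes = np.unique(y_train)
    means = np.stack([X_train[y_train == c].mean(axis=0) for c in classes])
    return classes[np.argmin(np.linalg.norm(means - x_test, axis=1))]

# Leave-one-trial-out cross-validation, common for small affect datasets.
correct = 0
for i in range(n_trials):
    mask = np.ones(n_trials, dtype=bool)
    mask[i] = False
    correct += int(nearest_mean_predict(X[mask], y[mask], X[i]) == y[i])

accuracy = correct / n_trials
print(f"leave-one-out accuracy: {accuracy:.2f}")
```

In practice, per-trial features would be extracted from the raw MEG and peripheral signals (e.g., spectral power per frequency band), and a stronger classifier would typically replace the nearest-mean rule.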
