...
Frontiers in Neuroscience

Emotional Intensity Modulates the Integration of Bimodal Angry Expressions: ERP Evidence


Abstract

Integration of information from the face and voice plays a central role in social interactions. The present study investigated how emotional intensity modulates the integration of facial-vocal emotional cues by recording EEG while participants performed an emotion identification task on facial, vocal, and bimodal angry expressions varying in emotional intensity. Behavioral results showed that anger identification rates and reaction speed increased with emotional intensity across modalities. Critically, P2 amplitudes were larger for bimodal expressions than for the sum of facial and vocal expressions for low-intensity stimuli, but not for middle- or high-intensity stimuli. These findings suggest that emotional intensity modulates the integration of facial-vocal angry expressions, following the principle of Inverse Effectiveness (IE) in multimodal sensory integration.
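
The critical analysis described in the abstract is the additive-model test common in multisensory ERP research: the P2 evoked by the bimodal (face + voice) stimulus is compared against the algebraic sum of the two unimodal P2 responses, separately at each intensity level. Below is a minimal, hypothetical sketch of that comparison; the array names and simulated amplitudes are assumptions for illustration, not the study's data or analysis pipeline.

```python
# Hypothetical sketch of the additive-model (AV vs. A+V) comparison of P2
# amplitudes, run separately for each emotional-intensity level.
# All values are simulated; they are not data from the study.
import numpy as np
from scipy.stats import ttest_rel

rng = np.random.default_rng(0)
n_participants = 20

for level in ("low", "middle", "high"):
    # Simulated mean P2 amplitudes (in microvolts) per participant and condition.
    face = rng.normal(2.0, 0.5, n_participants)      # facial expression alone (V)
    voice = rng.normal(1.5, 0.5, n_participants)     # vocal expression alone (A)
    bimodal = rng.normal(3.8, 0.5, n_participants)   # face and voice together (AV)

    # Supra-additivity test: is the bimodal P2 larger than the unimodal sum?
    # Per the abstract, this held only for low-intensity stimuli (inverse effectiveness).
    summed = face + voice
    t, p = ttest_rel(bimodal, summed)
    print(f"{level}: AV = {bimodal.mean():.2f} µV, A+V = {summed.mean():.2f} µV, "
          f"t({n_participants - 1}) = {t:.2f}, p = {p:.3f}")
```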
