
How Bodies and Voices Interact in Early Emotion Perception

Abstract

Successful social communication draws strongly on the correct interpretation of others' body and vocal expressions. Both can provide emotional information and often occur simultaneously. Yet their interplay has hardly been studied. Using electroencephalography, we investigated the temporal dynamics underlying their neural interaction in auditory and visual perception. In particular, we tested whether this interaction qualifies as true integration following multisensory integration principles such as inverse effectiveness. Emotional vocalizations were embedded in either low or high levels of noise and presented with or without video clips of matching emotional body expressions. In both high and low noise conditions, a reduction in auditory N100 amplitude was observed for audiovisual stimuli. However, only under high noise did the N100 peak earlier in the audiovisual than in the auditory condition, suggesting facilitatory effects as predicted by the inverse effectiveness principle. Similarly, we observed earlier N100 peaks in response to emotional compared to neutral audiovisual stimuli; this was not the case in the unimodal auditory condition. Furthermore, suppression of beta-band oscillations (15–25 Hz), primarily reflecting biological motion perception, was modulated 200–400 ms after the vocalization. For emotional stimuli, the difference in suppression between audiovisual and auditory stimuli was larger under high than under low noise levels; no such difference was observed for neutral stimuli. This observation is in accordance with the inverse effectiveness principle and suggests a modulation of integration by emotional content. Overall, the results show that ecologically valid, complex stimuli such as combined body and vocal expressions are effectively integrated very early in processing.
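To make the two measures described above concrete, the sketch below shows one plausible way to extract an N100 peak from an averaged ERP and to quantify beta-band (15–25 Hz) suppression in the 200–400 ms window using plain NumPy/SciPy. This is not the authors' analysis pipeline: the sampling rate, epoch layout, search windows, and the simulated single-channel data are assumptions made purely for illustration.

# Minimal illustrative sketch, not the published analysis. Assumes one EEG
# channel, epochs from -200 to 800 ms, and a 500 Hz sampling rate.
import numpy as np
from scipy.signal import butter, filtfilt, hilbert

fs = 500.0                              # assumed sampling rate (Hz)
times = np.arange(-0.2, 0.8, 1 / fs)    # epoch time axis in seconds

def n100_peak(erp, times):
    """Latency (s) and amplitude of the most negative deflection 80-150 ms post-onset."""
    win = (times >= 0.08) & (times <= 0.15)
    idx = np.argmin(erp[win])
    return times[win][idx], erp[win][idx]

def beta_suppression(epochs, times, fs, band=(15.0, 25.0), win=(0.2, 0.4)):
    """Mean beta-band power in `win` relative to the pre-stimulus baseline (dB).

    epochs: array of shape (n_trials, n_samples) for one channel and condition.
    Negative values indicate suppression relative to baseline.
    """
    b, a = butter(4, [band[0] / (fs / 2), band[1] / (fs / 2)], btype="band")
    beta = filtfilt(b, a, epochs, axis=-1)            # band-pass 15-25 Hz
    power = np.abs(hilbert(beta, axis=-1)) ** 2       # instantaneous power
    baseline = power[:, times < 0].mean()
    post = power[:, (times >= win[0]) & (times <= win[1])].mean()
    return 10 * np.log10(post / baseline)

# Usage with simulated data standing in for real single-channel epochs:
rng = np.random.default_rng(0)
audio_only = rng.standard_normal((60, times.size))
audiovisual = rng.standard_normal((60, times.size))

lat, amp = n100_peak(audiovisual.mean(axis=0), times)
print(f"N100 latency {lat * 1000:.0f} ms, amplitude {amp:.2f} (a.u.)")
print("beta change, AV minus A (dB):",
      beta_suppression(audiovisual, times, fs) - beta_suppression(audio_only, times, fs))

In an analysis along these lines, the inverse-effectiveness comparison reported in the abstract would correspond to contrasting the audiovisual-minus-auditory difference across the high- and low-noise conditions, separately for emotional and neutral stimuli.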
