Human Brain Mapping

Giving speech a hand: gesture modulates activity in auditory cortex during speech perception.



Abstract

Viewing hand gestures during face-to-face communication affects speech perception and comprehension. Despite the visible role played by gesture in social interactions, relatively little is known about how the brain integrates hand gestures with co-occurring speech. Here we used functional magnetic resonance imaging (fMRI) and an ecologically valid paradigm to investigate how beat gesture, a fundamental type of hand gesture that marks speech prosody, might impact speech perception at the neural level. Subjects underwent fMRI while listening to spontaneously produced speech accompanied by beat gesture, nonsense hand movement, or a still body; as additional control conditions, subjects also viewed beat gesture, nonsense hand movement, or a still body, all presented without speech. Validating behavioral evidence that gesture affects speech perception, bilateral nonprimary auditory cortex showed greater activity when speech was accompanied by beat gesture than when speech was presented alone. Further, the left superior temporal gyrus/sulcus showed stronger activity when speech was accompanied by beat gesture than when speech was accompanied by nonsense hand movement. Finally, the right planum temporale was identified as a putative multisensory integration site for beat gesture and speech (i.e., activity in response to speech accompanied by beat gesture was greater than the summed responses to speech alone and beat gesture alone), indicating that this area may be pivotally involved in synthesizing the rhythmic aspects of both speech and gesture. Taken together, these findings suggest a common neural substrate for processing speech and gesture, likely reflecting their joint communicative role in social interactions.
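For clarity, the planum temporale finding corresponds to the superadditivity criterion commonly used to identify multisensory integration sites in fMRI. A schematic formulation, assuming the usual general-linear-model condition estimates (the abstract itself does not specify the statistical model), is:

\[
\beta_{\text{speech + beat gesture}} \;>\; \beta_{\text{speech alone}} \;+\; \beta_{\text{beat gesture alone}},
\]

where each \(\beta\) denotes the estimated response amplitude in the right planum temporale for the corresponding condition.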
