Source: eLife

Shared and modality-specific brain regions that mediate auditory and visual word comprehension



Abstract

Visual speech carried by lip movements is an integral part of communication. Yet, it remains unclear to what extent visual and acoustic speech comprehension are mediated by the same brain regions. Using multivariate classification of full-brain MEG data, we first probed where the brain represents acoustically and visually conveyed word identities. We then tested where these sensory-driven representations are predictive of participants’ trial-wise comprehension. The comprehension-relevant representations of auditory and visual speech converged only in anterior angular and inferior frontal regions and were spatially dissociated from those representations that best reflected the sensory-driven word identity. These results provide a neural explanation for the behavioural dissociation of acoustic and visual speech comprehension and suggest that cerebral representations encoding word identities may be more modality-specific than often upheld.
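The "multivariate classification" named in the abstract refers to decoding a stimulus category (here, word identity) from the distributed pattern of neural activity across many sensors or sources. A minimal sketch with synthetic stand-in data (not the authors' actual pipeline; real analyses would run on source-localised MEG patterns, typically per region or searchlight, with careful cross-validation across trials):

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)

# Synthetic stand-in for MEG data: trials x features (sensors * time points),
# with a label per trial giving the word identity that was presented.
n_trials, n_features, n_words = 200, 50, 4
y = rng.integers(0, n_words, size=n_trials)
X = rng.normal(size=(n_trials, n_features))
X[:, 0] += y  # inject a weak word-identity signal into one feature

# Cross-validated multivariate classification: decode word identity from
# the activity pattern; above-chance accuracy indicates the pattern
# carries information about the stimulus.
clf = make_pipeline(StandardScaler(), LogisticRegression(max_iter=1000))
scores = cross_val_score(clf, X, y, cv=5)
print(f"mean decoding accuracy: {scores.mean():.3f} (chance = {1 / n_words})")
```

Comparing such decoding accuracy against trial-wise behaviour (did the participant comprehend the word?) is what distinguishes merely stimulus-driven representations from comprehension-relevant ones.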
