Neuropsychologia

Haptic and visual information speed up the neural processing of auditory speech in live dyadic interactions


Abstract

Speech can be perceived not only by the ear and by the eye but also by the hand, with speech gestures felt from manual tactile contact with the speaker's face. In the present electro-encephalographic study, early cross-modal interactions were investigated by comparing auditory evoked potentials during auditory, audio-visual and audio-haptic speech perception in dyadic interactions between a listener and a speaker. In line with previous studies, early auditory evoked responses were attenuated and speeded up during audio-visual compared to auditory speech perception. Crucially, shortened latencies of early auditory evoked potentials were also observed during audio-haptic speech perception. Altogether, these results suggest early bimodal interactions during live face-to-face and hand-to-face speech perception in dyadic interactions.
