Frontiers in Psychology

Brain responses and looking behavior during audiovisual speech integration in infants predict auditory speech comprehension in the second year of life


Abstract

The use of visual cues during the processing of audiovisual (AV) speech is known to be less efficient in children and adults with language difficulties, and such difficulties are more prevalent among children from low-income populations. In the present study, we followed an economically diverse group of thirty-seven infants longitudinally from 6–9 months to 14–16 months of age. We used eye-tracking to examine whether individual differences in visual attention during AV speech processing in 6- to 9-month-old infants, particularly when processing congruent and incongruent auditory and visual speech cues, might be indicative of their later language development. Twenty-two of these infants also completed an event-related potential (ERP) AV task within the same experimental session. Language development was then followed up at 14–16 months using two measures: the Preschool Language Scale and the Oxford Communicative Development Inventory. The results show that infants who were less efficient in auditory speech processing at 6–9 months had lower receptive language scores at 14–16 months. A correlational analysis revealed that both the pattern of face scanning and the ERP responses to audiovisually incongruent stimuli at 6–9 months were significantly associated with language development at 14–16 months. These findings add to our understanding of individual differences in the neural signatures of AV processing and the associated looking behavior in infants.
