The Journal of the Acoustical Society of America

Measures of auditory-visual integration in nonsense syllables and sentences



Abstract

For all but the most profoundly hearing-impaired (HI) individuals, auditory-visual (AV) speech has been shown consistently to afford more accurate recognition than auditory (A) or visual (V) speech. However, the amount of AV benefit achieved (i.e., the superiority of AV performance in relation to unimodal performance) can differ widely across HI individuals. To begin to explain these individual differences, several factors need to be considered. The most obvious of these are deficient A and V speech recognition skills. However, large differences in individuals' AV recognition scores persist even when unimodal skill levels are taken into account. These remaining differences might be attributable to differing efficiency in the operation of a perceptual process that integrates A and V speech information. There is at present no accepted measure of the putative integration process. In this study, several possible integration measures are compared using both congruent and discrepant AV nonsense syllable and sentence recognition tasks. Correlations were tested among the integration measures, and between each integration measure and independent measures of AV benefit for nonsense syllables and sentences in noise. Integration measures derived from tests using nonsense syllables were significantly correlated with each other; on these measures, HI subjects show generally high levels of integration ability. Integration measures derived from sentence recognition tests were also significantly correlated with each other, but were not significantly correlated with the measures derived from nonsense syllable tests. Similarly, the measures of AV benefit based on nonsense syllable recognition tests were found not to be significantly correlated with the benefit measures based on tests involving sentence materials. 
Finally, there were significant correlations between AV integration and benefit measures derived from the same class of speech materials, but nonsignificant correlations between integration and benefit measures derived from different classes of materials. These results suggest that the perceptual processes underlying AV benefit and the integration of A and V speech information might not operate in the same way on nonsense syllable and sentence input.
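The core of the analysis described above is a set of pairwise correlations between integration and benefit scores, computed within and across the two classes of speech materials. A minimal sketch of that comparison, using entirely hypothetical per-subject scores (the variable names and simulated data are illustrative, not the study's actual measures):

```python
import numpy as np

# Hypothetical scores for a group of HI subjects: an integration measure
# and an AV-benefit measure, each derived from nonsense-syllable and
# from sentence materials. Simulated so that within-class pairs covary.
rng = np.random.default_rng(0)
n_subjects = 20
integ_syll = rng.uniform(0.6, 1.0, n_subjects)              # syllable-based integration
benefit_syll = 0.8 * integ_syll + rng.normal(0, 0.05, n_subjects)
integ_sent = rng.uniform(0.3, 0.9, n_subjects)              # sentence-based integration
benefit_sent = 0.7 * integ_sent + rng.normal(0, 0.05, n_subjects)

def pearson_r(x, y):
    """Pearson correlation coefficient between two score vectors."""
    return np.corrcoef(x, y)[0, 1]

# Within-class correlations (reported significant in the study)
r_syll = pearson_r(integ_syll, benefit_syll)
r_sent = pearson_r(integ_sent, benefit_sent)
# Cross-class correlation (reported nonsignificant in the study)
r_cross = pearson_r(integ_syll, benefit_sent)
```

In the study's pattern of results, the within-class coefficients would be reliably positive while the cross-class coefficient would not differ significantly from zero; a significance test (e.g. on the t-distribution of r) would be applied to each coefficient in practice.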
