Frontiers in Neuroscience

Probing the time course of head-motion cues integration during auditory scene analysis



Abstract

The perceptual organization of auditory scenes is a hard but important problem to solve for human listeners. It is thus likely that cues from several modalities are pooled for auditory scene analysis, including sensory-motor cues related to the active exploration of the scene. We previously reported a strong effect of head motion on auditory streaming. Streaming refers to an experimental paradigm where listeners hear sequences of pure tones, and rate their perception of one or more subjective sources called streams. To disentangle the effects of head motion (changes in acoustic cues at the ear, subjective location cues, and motor cues), we used a robotic telepresence system, Telehead. We found that head motion induced perceptual reorganization even when the acoustic scene had not changed. Here we reanalyzed the same data to probe the time course of sensory-motor integration. We show that motor cues had a different time course compared to acoustic or subjective location cues: motor cues impacted perceptual organization earlier and for a shorter time than other cues, with successive positive and negative contributions to streaming. An additional experiment controlled for the effects of volitional anticipatory components, and found that arm or leg movements did not have any impact on scene analysis. These data provide a first investigation of the time course of the complex integration of sensory-motor cues in an auditory scene analysis task, and they suggest a loose temporal coupling between the different mechanisms involved.
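The abstract refers to the classic auditory streaming paradigm, in which listeners hear sequences of pure tones and report whether they perceive one or two streams. As an illustration only, the sketch below generates an "ABA_" triplet sequence of the kind commonly used in such experiments; all parameters (the frequencies F_A and F_B, tone duration, number of triplets, sampling rate) are assumptions chosen for demonstration and are not taken from this study.

```python
# Minimal sketch of an "ABA_" pure-tone triplet sequence, the stimulus type
# commonly used in auditory streaming experiments. All parameter values are
# illustrative assumptions, not the paper's actual stimulus settings.
import numpy as np
from scipy.io import wavfile

FS = 44100                 # sampling rate (Hz)
F_A, F_B = 500.0, 750.0    # assumed tone frequencies; larger A-B gaps favor hearing two streams
TONE_DUR = 0.125           # tone duration (s), assumed
N_TRIPLETS = 20            # number of ABA_ triplets, assumed

def tone(freq, dur, fs=FS, ramp=0.01):
    """Pure tone with raised-cosine onset/offset ramps to avoid clicks."""
    t = np.arange(int(dur * fs)) / fs
    x = np.sin(2 * np.pi * freq * t)
    n_ramp = int(ramp * fs)
    env = np.ones_like(x)
    env[:n_ramp] = 0.5 * (1 - np.cos(np.pi * np.arange(n_ramp) / n_ramp))
    env[-n_ramp:] = env[:n_ramp][::-1]
    return x * env

silence = np.zeros(int(TONE_DUR * FS))          # the "_" gap after each triplet
triplet = np.concatenate([tone(F_A, TONE_DUR),  # A
                          tone(F_B, TONE_DUR),  # B
                          tone(F_A, TONE_DUR),  # A
                          silence])             # _
sequence = np.tile(triplet, N_TRIPLETS)

# Write a 16-bit WAV at half full scale for playback.
wavfile.write("aba_sequence.wav", FS, (0.5 * sequence * 32767).astype(np.int16))
```

Depending on the A-B frequency separation and presentation rate, such a sequence tends to be heard either as a single galloping rhythm (one stream) or as two separate streams, which is the perceptual report the paradigm collects.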
