10th Western Pacific Acoustics Conference

Multimodal interaction of auditory spatial cues and passive whole-body movement through virtual space


Abstract

When multisensory stimulation is coordinated within a comprehensive simulation, the resulting multimodal display can become so convincing that it creates an experience of the observer traveling through a virtual environment. A great deal of attention has been paid to coordinated display within the auditory and visual modalities, but even the best such bimodal simulations may fail to produce satisfying results when the user is meant to move through a virtual world. Although visual information strongly affects self-motion perception, there are situations in which only auditory cues are available to induce perceived self-motion in observers. This paper summarizes our recent research on the interaction between auditory spatial information and passive whole-body movement in self-motion perception. First, we describe the effect of postural information on the perceived velocity of moving sound sources; our results suggest that the strongest multimodal interaction occurs when auditory information and postural variation are well matched. Second, the temporal synchrony between passive whole-body motion and auditory spatial information was investigated via a multimodal time-order judgment task. Our results suggest that sensory integration of auditory motion cues with whole-body movement cues can occur over an increasing range of intermodal delays as the velocity of virtual sound-source motion is decreased. The results of these two studies may be of general interest to researchers of multimodal interaction, and may also be useful to those developing next-generation multimodal display systems.
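The time-order judgment task described above is typically analyzed by fitting a psychometric function to the proportion of "audio first" responses across intermodal delays; the function's width then indexes how large a delay still permits sensory integration. The following is a minimal, self-contained sketch of such an analysis. The function names (`cum_gauss`, `fit_psychometric`) and the response proportions are hypothetical illustrations, not the authors' actual method or data.

```python
import math

def cum_gauss(soa, mu, sigma):
    """Cumulative-Gaussian psychometric function: probability of an
    'auditory motion first' response at a given stimulus onset
    asynchrony (SOA, ms; positive = auditory motion leads body motion)."""
    return 0.5 * (1.0 + math.erf((soa - mu) / (sigma * math.sqrt(2.0))))

def fit_psychometric(soas, p_audio_first):
    """Least-squares grid-search fit of the point of subjective
    simultaneity (mu, ms) and the temporal width (sigma, ms).
    A wider sigma means integration tolerates a larger range of
    intermodal delays."""
    best = (float("inf"), 0.0, 1.0)
    for mu in range(-200, 201, 5):          # candidate PSS values, ms
        for s in range(1, 301):             # candidate widths, 1..300 ms
            err = sum((cum_gauss(x, mu, float(s)) - p) ** 2
                      for x, p in zip(soas, p_audio_first))
            if err < best[0]:
                best = (err, float(mu), float(s))
    return best[1], best[2]

if __name__ == "__main__":
    # Hypothetical response proportions for a slowly moving virtual
    # source; the actual experimental values are not in the abstract.
    soas = [-300, -200, -100, 0, 100, 200, 300]
    p_slow = [0.02, 0.10, 0.30, 0.55, 0.75, 0.92, 0.98]
    mu, sigma = fit_psychometric(soas, p_slow)
    print("PSS ~ %.0f ms, width ~ %.0f ms" % (mu, sigma))
```

Under the abstract's finding, repeating this fit for fast and slow source conditions would show a larger fitted `sigma` for the slow condition, i.e. integration over a wider range of intermodal delays.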

Record details

  • Source
  • Venue: Beijing (CN)
  • Author affiliations

    Research Institute of Electrical Communication and Graduate School of Information Sciences, Tohoku University, 2-1-1 Katahira, Aoba-ku, Sendai, 980-8577, Japan;

    Faculty of Architecture, Design and Planning, University of Sydney, NSW 2006, Australia;

    Research Institute of Electrical Communication and Graduate School of Information Sciences, Tohoku University, 2-1-1 Katahira, Aoba-ku, Sendai, 980-8577, Japan;

  • Conference organizer
  • Original format: PDF
  • Language: English
  • Classification: Acoustics
  • Keywords

