Journal: Human Factors

The Effect of Walking on Auditory Localization, Visual Discrimination, and Aurally Aided Visual Search



Abstract

Objective: The present study was designed to examine the impact of walking on performance in auditory localization, visual discrimination, and aurally aided visual search tasks. Background: Auditory localization and visual search are critical skills that are frequently performed by moving observers, but most laboratory studies of these tasks have been conducted on stationary listeners who were either seated or standing during stimulus presentation. Method: Thirty participants completed three different tasks while either standing still or walking at a comfortable, self-selected pace on a treadmill: (1) an auditory localization task, in which they identified the perceived location of a target sound; (2) a visual discrimination task, in which they identified a visual target presented at a known location directly in front of the listener; and (3) an aurally aided visual search task, in which they identified a visual target presented among multiple visual distracters, either in isolation or in conjunction with a spatially colocated auditory cue. Results: Participants who were walking performed the auditory localization and aurally aided visual search tasks significantly faster than those who were standing, with no loss in accuracy. Conclusion: The improved aurally aided visual search performance found in this experiment may be related to enhanced overall activation caused by walking. It is also possible that the slight head movements required by walking provided auditory cues that enhanced localization accuracy. Application: The results have potential applications in virtual and augmented reality displays in which audio cues might be presented to listeners while they walk.
