Journal on Multimodal User Interfaces

The effects of spatial auditory and visual cues on mixed reality remote collaboration



Abstract

Collaborative Mixed Reality (MR) technologies enable remote people to work together by sharing communication cues intrinsic to face-to-face conversations, such as eye gaze and hand gestures. While the role of visual cues has been investigated in many collaborative MR systems, the use of spatial auditory cues remains underexplored. In this paper, we present an MR remote collaboration system that shares both spatial auditory and visual cues between collaborators to help them complete a search task. Through two user studies in a large office, we found that, compared to non-spatialized audio, the spatialized remote expert's voice and auditory beacons enabled local workers to find small occluded objects and gave them significantly stronger spatial perception. We also found that while the spatial auditory cues could indicate the spatial layout and a general direction in which to search for the target object, the visual head frustum and hand gestures intuitively demonstrated the remote expert's movements and the position of the target. Integrating visual cues (especially the head frustum) with the spatial auditory cues significantly improved the local worker's task performance, social presence, and spatial perception of the environment.


