ACM Conference on Human Factors in Computing Systems

VirtualGrasp: Leveraging Experience of Interacting with Physical Objects to Facilitate Digital Object Retrieval

Abstract

We propose VirtualGrasp, a novel gestural approach to retrieving virtual objects in virtual reality. Using VirtualGrasp, a user retrieves an object by performing a barehanded gesture as if grasping its physical counterpart. The object-gesture mapping under this metaphor is highly intuitive, which enables users to easily discover and remember the gestures needed to retrieve the objects. We conducted three user studies to demonstrate the feasibility and effectiveness of the approach. Progressively, we investigated the consensus of the object-gesture mapping across users, the expressivity of grasping gestures, and the learnability and performance of the approach. Results showed that users achieved high agreement on the mapping, with an average agreement score [35] of 0.68 (SD=0.27). Without prior exposure to the gestures, users successfully retrieved 76% of the objects with VirtualGrasp. A week after learning the mapping, they could recall the gestures for 93% of the objects.
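The average agreement score of 0.68 cited above refers to an agreement metric from the gesture-elicitation literature [35]. As a minimal sketch, assuming the common formulation in which agreement for a referent is the sum of squared proportions of identical gesture proposals, a per-object score can be computed as below; the gesture labels and participant counts in the example are hypothetical, not data from the paper.

    from collections import Counter

    def agreement_score(proposals):
        """Agreement for one referent: sum over groups of identical
        proposals of (group size / total proposals) squared."""
        n = len(proposals)
        if n == 0:
            return 0.0
        counts = Counter(proposals)
        return sum((c / n) ** 2 for c in counts.values())

    # Hypothetical example: 8 participants propose grasping gestures
    # for one object ("cup"); labels are illustrative gesture codes.
    cup_proposals = ["power-grip", "power-grip", "power-grip", "power-grip",
                     "power-grip", "pinch", "pinch", "hook"]
    print(agreement_score(cup_proposals))  # (5/8)^2 + (2/8)^2 + (1/8)^2 ≈ 0.469

Under this assumed formulation, the overall score reported in the abstract would be the mean of such per-object values across all objects in the gesture set.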
