International Conference on Spatial Cognition

Spatial References with Gaze and Pointing in Shared Space of Humans and Robots



Abstract

For solving tasks cooperatively in close interaction with humans, robots need timely updated spatial representations. However, perceptual information about the current position of interaction partners often arrives late. If robots could anticipate the targets of upcoming manual actions, such as pointing gestures, they would have more time to physically react to human movements and could consider prospective space allocations in their planning. Many findings support close eye-hand coordination in humans, which could be used to predict gestures by observing eye gaze. However, these effects vary strongly with the context of the interaction. We collect evidence of eye-hand coordination in a natural route-planning scenario in which two agents interact over a map on a table. In particular, we are interested in whether fixations can predict pointing targets and how target distances affect the interlocutor's pointing behavior. We present an automatic method combining marker tracking and 3D modeling that provides eye and gesture measurements in real time.
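The abstract does not detail the real-time measurement pipeline, but the core geometric step it implies — projecting a tracked gaze ray onto the table plane to recover a 2D fixation point on the map — can be sketched as follows. The function name, coordinate frame, and plane height are illustrative assumptions, not part of the paper's method:

```python
import numpy as np

def gaze_fixation_on_table(eye_pos, gaze_dir, table_z=0.0):
    """Intersect a gaze ray with the horizontal table plane z = table_z.

    eye_pos  -- 3D eye position from the tracker (assumed, metres)
    gaze_dir -- gaze direction vector (need not be normalized)
    Returns the 2D fixation point on the map plane, or None if the
    ray is parallel to the table or points away from it.
    """
    eye_pos = np.asarray(eye_pos, dtype=float)
    gaze_dir = np.asarray(gaze_dir, dtype=float)
    if abs(gaze_dir[2]) < 1e-9:
        return None  # gaze parallel to the table plane
    t = (table_z - eye_pos[2]) / gaze_dir[2]
    if t <= 0:
        return None  # table plane lies behind the viewer
    hit = eye_pos + t * gaze_dir  # 3D intersection point
    return hit[:2]                # (x, y) on the map

# Example: an eye 0.5 m above the table, looking down and forward
print(gaze_fixation_on_table([0.0, 0.0, 0.5], [0.0, 0.6, -0.8]))
```

Run per frame against tracked eye poses, such a projection yields the stream of map-plane fixations that could then be compared against subsequent pointing targets.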
