ACM Transactions on Graphics

iMapper: Interaction-guided Scene Mapping from Monocular Videos



Abstract

Next-generation smart and augmented reality systems demand a computational understanding of monocular footage that captures humans in physical spaces, to reveal plausible object arrangements and human-object interactions. Despite recent advances in both scene layout and human motion analysis, the above setting remains challenging to analyze due to the frequent occlusions that occur between objects and human motions. We observe that object arrangements and human actions are often strongly correlated, and hence this correlation can be used to help recover from these occlusions. We present iMapper, a data-driven method to identify such human-object interactions and use them to infer the layouts of occluded objects. Starting from a monocular video with detected 2D human joint positions that are potentially noisy and occluded, we first introduce the notion of interaction-saliency: space-time snapshots where informative human-object interactions happen. Then, we propose a global optimization that retrieves interactions from a database and fits them to the detected salient interactions so as to best explain the input video. We extensively evaluate the approach, both quantitatively against manually annotated ground truth and through a user study, and demonstrate that iMapper produces plausible scene layouts for scenes with medium to heavy occlusion. Code and data are available on the project page.
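To make the notion of interaction-saliency concrete, the sketch below scores each frame of a 2D joint track by how static the body is, on the assumption that near-static moments (sitting down, reaching for an object) tend to coincide with informative human-object interactions. This is a toy heuristic for illustration only, not the paper's actual saliency measure; the function name and the low-motion criterion are our own assumptions.

```python
import numpy as np

def interaction_saliency(joints, motion_window=5):
    """Toy per-frame saliency for a monocular joint track (assumed heuristic).

    joints: (T, J, 2) array of 2D joint positions over T frames.
    Returns a (T,) score in [0, 1] that is high when mean joint motion
    is low, a rough proxy for 'informative interaction snapshots'.
    """
    # Mean joint speed per frame, padded to length T.
    vel = np.linalg.norm(np.diff(joints, axis=0), axis=-1).mean(axis=-1)
    vel = np.concatenate([[vel[0]], vel])
    # Temporal smoothing with a moving average.
    kernel = np.ones(motion_window) / motion_window
    smooth = np.convolve(vel, kernel, mode="same")
    # Low motion -> high saliency; normalize to [0, 1].
    score = smooth.max() - smooth
    return score / (score.max() + 1e-9)
```

In a pipeline like the one the abstract describes, frames with high scores would be the candidate snapshots at which database interactions are retrieved and fit by the global optimization.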


