Computer Vision Workshops (ICCV Workshops), 2011 IEEE International Conference on

UMPM benchmark: A multi-person dataset with synchronized video and motion capture data for evaluation of articulated human motion and interaction


Abstract

Analyzing human motion, including tracking and pose estimation, is a major topic in computer vision. Many methods have been developed in the past and more will be developed in the future. For a systematic and quantitative evaluation of such methods, ground truth data of the 3D human motion is required. Some publicly available data sets exist, like HumanEva, that provide synchronized video sequences with detailed ground truth 3D data, but only for scenes limited to a single person. For multiple persons, such a data set currently does not exist. In this paper, we present the Utrecht Multi-Person Motion (UMPM) benchmark, which includes synchronized motion capture data and video sequences from multiple viewpoints for multi-person motion, including multi-person interaction. The data set is available to the research community to promote research in multi-person articulated human motion analysis. This paper describes the design of the benchmark, the technical problem solutions, and the resulting data sets.
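A central technical point of such a benchmark is that the motion capture and video streams run at different sampling rates and must be aligned on a common clock. As a minimal sketch of that idea (the frame rates and offset below are illustrative placeholders, not the dataset's actual specification), a video frame index can be mapped to the nearest motion-capture sample:

```python
def mocap_index(video_frame: int, video_fps: float = 25.0,
                mocap_fps: float = 100.0, offset_s: float = 0.0) -> int:
    """Map a video frame index to the nearest motion-capture sample.

    Assumes both streams share a common clock after an initial
    synchronization offset (offset_s, in seconds). The default frame
    rates are hypothetical, chosen only for illustration.
    """
    t = video_frame / video_fps + offset_s   # timestamp of the video frame
    return round(t * mocap_fps)              # nearest mocap sample index
```

With these example rates, consecutive video frames map to every fourth mocap sample, so ground-truth 3D poses can be looked up per video frame for evaluation.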

