IEEE Conference on Applications of Computer Vision

Non-rigid Articulated Point Set Registration for Human Pose Estimation

Abstract

We propose a new non-rigid articulated point set registration framework for human pose estimation that aims at improving two recent registration techniques and filling the gap between them. One is Coherent Point Drift (CPD), a powerful Gaussian Mixture Model (GMM)-based non-rigid registration method that may nevertheless be unsuitable for articulated deformations, since such deformations violate its motion coherence assumption. The other is articulated ICP (AICP), which is effective for human pose estimation but prone to being trapped in local minima without good correspondence initialization. To bridge the gap between the two, a new non-rigid registration method, called Global-Local Topology Preservation (GLTP), is proposed by integrating a Local Linear Embedding (LLE)-based topology constraint with CPD in a GMM-based formulation; this accommodates articulated non-rigid deformations and provides reliable correspondence estimation for AICP initialization. Experiments on both 3D scan data and depth images demonstrate the effectiveness of the proposed framework.
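To make the construction concrete, the following is a minimal sketch of a GLTP-style energy, written in an assumed, standard form rather than the paper's exact notation: the usual CPD Gaussian-mixture data term and global motion-coherence regularizer, plus an LLE-style local-topology term that asks each deformed template point to remain the weighted combination of its deformed neighbors. Here X = {x_n} is the target point set, Y = {y_m} the template, T(y_m) = y_m + G(m,·)W the CPD non-rigid transform with Gaussian kernel matrix G, P(m|x_n) the GMM posteriors from the E-step, M_ml the LLE reconstruction weights precomputed on the undeformed template, and α, β assumed trade-off weights.

\begin{align}
E(\mathbf{W},\sigma^2)
  &= \frac{1}{2\sigma^2}\sum_{n=1}^{N}\sum_{m=1}^{M}
     P(m\mid\mathbf{x}_n)\,\bigl\|\mathbf{x}_n - T(\mathbf{y}_m)\bigr\|^2
     + \frac{N_{\mathbf{P}}D}{2}\log\sigma^2 \\
  &\quad + \frac{\alpha}{2}\,\operatorname{tr}\!\bigl(\mathbf{W}^{\top}\mathbf{G}\mathbf{W}\bigr)
     + \frac{\beta}{2}\sum_{m=1}^{M}
       \Bigl\|T(\mathbf{y}_m) - \sum_{l\in\mathcal{N}(m)} M_{ml}\,T(\mathbf{y}_l)\Bigr\|^2
\end{align}

where N_P = Σ_{n,m} P(m|x_n) and D is the point dimension. Under this reading, the posteriors P(m|x_n) available once GLTP converges are what supply the correspondence initialization for the subsequent articulated ICP stage.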
