IEEE/SICE International Symposium on System Integration

KMOP-vSLAM: Dynamic Visual SLAM for RGB-D Cameras using K-means and OpenPose



Abstract

Although tremendous progress has been made in Simultaneous Localization and Mapping (SLAM), the scene-rigidity assumption limits the wide use of visual SLAM in real-world computer vision, smart robotics, and augmented reality applications. To make SLAM more robust in dynamic environments, outlier features on dynamic objects, including unknown objects, need to be removed from the tracking process. To address this challenge, we present a novel real-time visual SLAM system, KMOP-vSLAM, which adds unsupervised learning segmentation and human detection to reduce tracking drift in indoor dynamic environments. An efficient geometric outlier detection method is proposed that uses dynamic information from previous frames together with a novel probability model to judge moving objects with the help of geometric constraints and human detection. Outlier features belonging to moving objects are largely detected and removed from tracking. The well-known TUM dataset is used to evaluate tracking errors in dynamic scenes where people are walking around. Our approach yields a significantly lower trajectory error than state-of-the-art visual SLAM systems using an RGB-D camera.
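The abstract describes rejecting dynamic features by combining human detection, a geometric consistency check against previous frames, and K-means grouping. The sketch below is not the authors' implementation; it is a minimal illustration of that idea under simplifying assumptions: the human mask (e.g. derived from OpenPose keypoints), the reprojection error threshold, and the per-cluster dynamic ratio are all placeholders.

```python
# Minimal sketch of the dynamic-feature rejection idea (assumptions, not the paper's code):
# 1) per-feature motion evidence = large reprojection error OR lying on a detected human,
# 2) K-means clustering of 3D feature positions so that whole objects are rejected together,
# 3) keep only features from clusters that look mostly static.
import numpy as np
from sklearn.cluster import KMeans


def filter_dynamic_features(pts_2d, pts_3d, reproj_2d, human_mask,
                            n_clusters=5, reproj_thresh=3.0, dyn_ratio=0.5):
    """Return indices of features treated as static for pose tracking.

    pts_2d     : (N, 2) pixel coordinates of features in the current frame
    pts_3d     : (N, 3) back-projected 3D points from the RGB-D depth map
    reproj_2d  : (N, 2) feature locations predicted from the previous pose estimate
    human_mask : (H, W) boolean mask from a human detector (assumed given here)
    """
    # Evidence that an individual feature is moving.
    reproj_err = np.linalg.norm(pts_2d - reproj_2d, axis=1)
    on_human = human_mask[pts_2d[:, 1].astype(int), pts_2d[:, 0].astype(int)]
    feature_dynamic = (reproj_err > reproj_thresh) | on_human

    # Group features by K-means on 3D position and reject clusters that are
    # dominated by dynamic features.
    labels = KMeans(n_clusters=n_clusters, n_init=10).fit_predict(pts_3d)
    static_idx = []
    for c in range(n_clusters):
        members = np.where(labels == c)[0]
        if feature_dynamic[members].mean() < dyn_ratio:
            static_idx.extend(members.tolist())
    return np.array(static_idx, dtype=int)
```

In such a scheme, only the returned feature subset would be passed to the camera-pose estimator, so walking people and other moving objects contribute little to the trajectory estimate.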
