Conference: International Conference on Intelligent Robots and Systems

Efficient Compositional Approaches for Real-Time Robust Direct Visual Odometry from RGB-D Data



Abstract

In this paper we evaluate different methods for computing frame-to-frame motion estimates for a moving RGB-D sensor by aligning two images through photometric error minimization. Algorithms of this kind have recently been shown to be very accurate and robust, and therefore provide an attractive solution for robot ego-motion estimation and navigation. We demonstrate three different alignment strategies, namely the Forward-Compositional, the Inverse-Compositional and the Efficient Second-Order Minimization approach, within a general robust estimation framework. We further show that estimating global affine illumination changes generally improves the performance of the algorithms. We compare our results with recently published work considered state-of-the-art in this field, and show that our solutions are in general more precise and can run in real-time on standard hardware.
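As a rough illustration of the photometric error minimization the abstract refers to (a sketch only; the notation below, including the warp $w$, the motion parameters $\xi$, and the affine illumination gain and bias $a, b$, is assumed here rather than taken from the paper), the frame-to-frame alignment with global affine illumination can be posed as a robust least-squares problem:

$$
\hat{\xi},\,\hat{a},\,\hat{b} \;=\; \arg\min_{\xi,\,a,\,b} \sum_{\mathbf{x}_i \in \Omega} \rho\!\Big( a\, I_2\big(w(\mathbf{x}_i,\xi)\big) + b \;-\; I_1(\mathbf{x}_i) \Big)
$$

where $I_1$ and $I_2$ are the reference and current intensity images, $w(\mathbf{x}_i,\xi)$ warps pixel $\mathbf{x}_i$ using its measured depth and the rigid-body motion $\xi \in \mathfrak{se}(3)$, $\rho$ is a robust cost function, and $(a,b)$ model a global affine illumination change. The Forward-Compositional, Inverse-Compositional and Efficient Second-Order Minimization strategies differ in how the incremental update to $\xi$ is composed with the current estimate and where the image gradients are evaluated when linearizing this cost.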
