Error Aware Monocular Visual Odometry Using Vertical Line Pairs for Small Robots in Urban Areas

Abstract

We report a new error-aware monocular visual odometry method that uses only vertical lines, such as vertical edges of buildings and poles in urban areas, as landmarks. Since vertical lines are easy to extract, insensitive to lighting conditions and shadows, and sensitive to robot movements on the ground plane, they are robust features compared with regular point features or line features. We derive a recursive visual odometry method based on vertical line pairs. We analyze how errors are propagated and introduced in the continuous odometry process by deriving a closed-form representation of the covariance matrix. We formulate the minimum-variance ego-motion estimation problem and present a method that outputs weights for different vertical line pairs. The resulting visual odometry method is tested in physical experiments and compared with two existing methods based on point features and line features, respectively. The experimental results show that our method outperforms both counterparts in robustness, accuracy, and speed, with relative errors of less than 2%.
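The minimum-variance weighting of vertical line pairs mentioned in the abstract can be illustrated with a standard inverse-variance fusion scheme. This is a generic sketch, not the paper's actual derivation: the function name and the simplification to a scalar motion parameter per line pair are assumptions made for illustration.

```python
def fuse_min_variance(estimates, variances):
    """Fuse per-pair motion estimates by inverse-variance weighting.

    estimates: one scalar ego-motion estimate per vertical-line pair
               (e.g. a heading change), a hypothetical simplification
               of the paper's full state.
    variances: the error variance of each estimate, as would come from
               a propagated covariance.
    Returns the minimum-variance fused estimate and its variance.
    """
    # Each pair is weighted by the inverse of its variance, so
    # noisier pairs contribute less to the fused estimate.
    weights = [1.0 / v for v in variances]
    total = sum(weights)
    fused = sum(w * z for w, z in zip(weights, estimates)) / total
    fused_variance = 1.0 / total  # never larger than the smallest input variance
    return fused, fused_variance


# Example: two line pairs with equal confidence average evenly;
# a noisier second pair pulls the estimate toward the first.
print(fuse_min_variance([1.0, 3.0], [1.0, 1.0]))
print(fuse_min_variance([1.0, 3.0], [1.0, 3.0]))
```

The design choice this illustrates is that the fused variance shrinks as more (independent) line pairs are added, which is why a pair-weighting scheme can outperform treating all features equally.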
