
Bearing-only SLAM : a vision-based navigation system for autonomous robots



Abstract

To navigate successfully in a previously unexplored environment, a mobile robot must be able to estimate the spatial relationships of the objects of interest accurately. A Simultaneous Localization and Mapping (SLAM) system employs its sensors to incrementally build a map of its surroundings and simultaneously localize itself within that map. The aim of this research project is to develop a SLAM system suitable for self-propelled household lawnmowers. The proposed bearing-only SLAM system requires only an omnidirectional camera and some inexpensive landmarks. The main advantage of an omnidirectional camera is its panoramic view of all the landmarks in the scene. Placing landmarks in a lawn to define the working domain is much easier and more flexible than installing the perimeter wire required by existing autonomous lawnmowers.

Existing bearing-only SLAM methods commonly rely on a motion model for predicting the robot's pose and a sensor model for updating it. In the motion model, error in the estimated object positions accumulates, mainly due to wheel slippage, so quantifying the uncertainty of object positions accurately is a fundamental requirement. In bearing-only SLAM, the Probability Density Function (PDF) of a landmark's position should be uniform along the observed bearing; existing methods that approximate the PDF with a Gaussian do not satisfy this uniformity requirement. This thesis introduces both geometric and probabilistic methods to address these problems. Its main novel contributions are:

1. A bearing-only SLAM method that does not require odometry. The proposed method relies solely on the sensor model (landmark bearings only), without a motion model (odometry). The uncertainty of the estimated landmark positions therefore depends on the vision error alone, rather than on a combination of odometry and vision errors.

2. The transformation of the spatial uncertainty of objects. This thesis introduces a novel method for translating the spatial uncertainty of objects, estimated in a moving frame attached to the robot, into the global frame attached to the static landmarks in the environment.

3. The characterization of an improved PDF for representing landmark position in bearing-only SLAM. The proposed PDF is expressed in polar coordinates, and its marginal probability over range is constrained to be uniform. Compared with a PDF estimated from a mixture of Gaussians, the PDF developed here has far fewer parameters and can easily be adopted in a probabilistic framework such as a particle filtering system.

The main advantages of the proposed bearing-only SLAM system are its lower production cost and flexibility of use. The system can also be adopted in other domestic robots, such as vacuum cleaners or robotic toys, whenever the terrain is essentially two-dimensional.
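The landmark PDF described in contribution 3 — Gaussian in bearing, uniform in range — can be illustrated with a small sampling sketch. This is not the thesis's implementation; the function name, noise parameter, and range bounds are all illustrative assumptions. A single bearing observation carries no depth information, so the range marginal is kept uniform over an assumed interval, and the polar samples are converted to Cartesian particles as a particle filter would consume them.

```python
import numpy as np

def sample_landmark_particles(bearing, sigma_bearing, r_min, r_max,
                              n=1000, rng=None):
    """Sample candidate landmark positions in the robot frame.

    The range marginal is uniform on [r_min, r_max], reflecting that one
    bearing observation gives no depth information; the bearing itself is
    Gaussian around the observed value. All parameters are illustrative,
    not values from the thesis.
    """
    rng = np.random.default_rng() if rng is None else rng
    r = rng.uniform(r_min, r_max, n)               # uniform range marginal
    theta = rng.normal(bearing, sigma_bearing, n)  # Gaussian bearing noise
    # Convert polar samples to Cartesian particles for a particle filter.
    return np.column_stack((r * np.cos(theta), r * np.sin(theta)))

# A landmark observed at 45 degrees, with assumed bounds of 0.5-10 m.
particles = sample_landmark_particles(np.pi / 4, 0.02, 0.5, 10.0)
```

Because the distribution needs only a bearing noise parameter and two range bounds, it is far cheaper to evaluate and resample than a mixture of Gaussians fitted along the bearing ray, which is the comparison the abstract draws.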

Bibliographic details

  • Author

    Huang Henry;

  • Year: 2008
  • Format: PDF
  • Language: English
