
Robust Data Fusion of Multimodal Sensory Information for Mobile Robots



Abstract

Urban search and rescue (USAR) missions for mobile robots require reliable state estimation systems resilient to the conditions imposed by a dynamically changing environment. We design and evaluate a data fusion system for localization of a mobile skid-steer robot intended for USAR missions. We exploit a rich sensor suite including both proprioceptive (inertial measurement unit and track odometry) and exteroceptive sensors (omnidirectional camera and rotating laser rangefinder). To cope with the specificities of each sensing modality (such as significantly differing sampling frequencies), we introduce a novel fusion scheme based on an extended Kalman filter for six-degree-of-freedom orientation and position estimation. We demonstrate the performance on field tests of more than 4.4 km driven under standard USAR conditions. Part of our datasets includes ground-truth positioning: indoors with a Vicon motion capture system and outdoors with a Leica theodolite tracker. The overall median accuracy of localization, achieved by combining all four modalities, was 1.2% and 1.4% of the total distance traveled for indoor and outdoor environments, respectively. To identify the true limits of the proposed data fusion, we propose and employ a novel experimental evaluation procedure based on failure case scenarios. In this way, we address common issues such as slippage, reduced camera field of view, and limited laser rangefinder range, together with moving obstacles spoiling the metric map. We believe such a characterization of the failure cases is a first step toward identifying the behavior of state estimation under such conditions. We release all our datasets to the robotics community for possible benchmarking.
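
The abstract describes fusing proprioceptive and exteroceptive measurements that arrive at very different rates through an extended Kalman filter. The Python snippet below is a minimal, hypothetical sketch of that multi-rate pattern only, not the paper's filter: the state is simplified to planar position and velocity with linear measurement models, and the sensor names, rates, and noise values are illustrative assumptions. What it shows is processing measurements strictly in timestamp order, propagating the state to each measurement's time, and then applying that sensor's own update model.

import numpy as np

class MultiRateKF:
    """Kalman-filter skeleton fusing asynchronous measurements (illustrative only)."""

    def __init__(self):
        self.x = np.zeros(4)           # state: [px, py, vx, vy]
        self.P = np.eye(4)             # state covariance
        self.Q = 0.01 * np.eye(4)      # process noise density (assumed value)
        self.t = 0.0                   # time of the current estimate

    def predict(self, t_new):
        """Propagate the state to the timestamp of the next measurement."""
        dt = t_new - self.t
        F = np.eye(4)
        F[0, 2] = F[1, 3] = dt         # constant-velocity motion model
        self.x = F @ self.x
        self.P = F @ self.P @ F.T + self.Q * dt
        self.t = t_new

    def update(self, z, H, R):
        """Standard correction step with a sensor-specific model (H, R)."""
        y = z - H @ self.x                         # innovation
        S = H @ self.P @ H.T + R                   # innovation covariance
        K = self.P @ H.T @ np.linalg.inv(S)        # Kalman gain
        self.x = self.x + K @ y
        self.P = (np.eye(4) - K @ H) @ self.P

# Two hypothetical sensors with different rates and measurement models:
# a 100 Hz velocity source (odometry-like) and a 2 Hz position source
# (e.g. output of a laser or visual pipeline). Values are illustrative only.
H_vel = np.array([[0., 0., 1., 0.], [0., 0., 0., 1.]])
H_pos = np.array([[1., 0., 0., 0.], [0., 1., 0., 0.]])
R_vel = 0.05 * np.eye(2)
R_pos = 0.50 * np.eye(2)

measurements = [(k * 0.01, "vel", np.array([1.0, 0.0])) for k in range(1, 101)]
measurements += [(k * 0.5, "pos", np.array([0.5 * k, 0.0])) for k in range(1, 3)]
measurements.sort(key=lambda m: m[0])              # fuse in timestamp order

kf = MultiRateKF()
for t, sensor, z in measurements:
    kf.predict(t)                                  # bring the state to time t
    if sensor == "vel":
        kf.update(z, H_vel, R_vel)
    else:
        kf.update(z, H_pos, R_pos)

print("estimated position after 1 s:", kf.x[:2])

In the paper's setting the state is the full six-degree-of-freedom pose, the measurement models are nonlinear (hence the extended filter), and the inputs come from the IMU, track odometry, omnidirectional camera, and laser rangefinder; the skeleton above only illustrates the asynchronous, per-sensor bookkeeping.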

Record details

  • Source
    Journal of Field Robotics, 2015, No. 4, pp. 447-473 (27 pages)
  • Author affiliations

    Center for Machine Perception, Dept. of Cybernetics, Faculty of Electrical Engineering, Czech Technical University in Prague, Technicka 2, 166 27 Prague 6, Czech Republic;

    ETH Zurich, Tannenstrasse 3, 8092 Zurich, Switzerland;

    ETH Zurich, Tannenstrasse 3, 8092 Zurich, Switzerland;

    ETH Zurich, Tannenstrasse 3, 8092 Zurich, Switzerland;

    Center for Machine Perception, Dept. of Cybernetics, Faculty of Electrical Engineering, Czech Technical University in Prague, Technicka 2, 166 27 Prague 6, Czech Republic;

    Center for Machine Perception, Dept. of Cybernetics, Faculty of Electrical Engineering, Czech Technical University in Prague, Technicka 2, 166 27 Prague 6, Czech Republic;

  • Indexing information
  • Original format: PDF
  • Language: English
  • CLC classification
  • Keywords
