IASTED International Conference on Signal Processing, Pattern Recognition, and Applications

A COMPARISON OF ERROR METRICS FOR EXTRINSIC CALIBRATION AND FUSION OF CAMERA AND MULTI-LAYER LIDAR



Abstract

This work develops and compares different error metrics for calibration of a 3D geometric mapping between multi-layer LIDAR point clouds and video images. Most approaches to calibration are based on minimizing the distance between pairs of corresponding points found in both LIDAR and vision data by adjusting the extrinsic calibration parameters. Such a metric does not respect the complementary information provided by the two sensors: a point's depth is more precisely perceived by the LIDAR, whereas the direction to the point is better resolved by the camera. Taking this into account, this work investigates the properties of different error metrics, each considering the diverse sensor characteristics to a different degree. Calibration results are demonstrated in simulation experiments and on real-world data taken on the autonomous ground vehicle MuCAR-3 while moving in traffic.
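
For illustration only, and not taken from the paper: the sketch below shows the kind of optimization the abstract describes, assuming hypothetical corresponding points already expressed as 3D coordinates in the LIDAR frame and in the camera frame. It contrasts a plain point-to-point Euclidean metric with a decoupled metric that weights range error (where the LIDAR is precise) and bearing error (where the camera is precise) separately; the weights and synthetic data are invented for the example, and SciPy's least_squares is used as a generic solver.

# Illustrative sketch (not the paper's implementation): extrinsic calibration of a
# camera with respect to a LIDAR by minimizing residuals over corresponding points.
# Hypothetical inputs: P_lidar (N x 3, metres, LIDAR frame) and P_cam (N x 3,
# camera frame), assumed to be matched point pairs.
import numpy as np
from scipy.optimize import least_squares
from scipy.spatial.transform import Rotation


def transform(params, pts):
    # Apply the extrinsic transform (axis-angle rvec + translation t) to LIDAR points.
    rvec, t = params[:3], params[3:]
    return Rotation.from_rotvec(rvec).apply(pts) + t


def residual_point(params, P_lidar, P_cam):
    # Plain point-to-point metric: Euclidean distance between corresponding points.
    return (transform(params, P_lidar) - P_cam).ravel()


def residual_decoupled(params, P_lidar, P_cam, w_range=1.0, w_bearing=10.0):
    # Decoupled metric (illustrative weights): penalize range disagreement and
    # bearing (unit-direction) disagreement separately, so depth is effectively
    # trusted to the LIDAR and direction to the camera.
    P = transform(params, P_lidar)
    r_lidar, r_cam = np.linalg.norm(P, axis=1), np.linalg.norm(P_cam, axis=1)
    d_lidar, d_cam = P / r_lidar[:, None], P_cam / r_cam[:, None]
    range_err = w_range * (r_lidar - r_cam)
    bearing_err = w_bearing * (d_lidar - d_cam).ravel()
    return np.concatenate([range_err, bearing_err])


# Hypothetical usage with synthetic correspondences and a known ground-truth transform:
rng = np.random.default_rng(0)
P_cam = rng.uniform([-5.0, -2.0, 3.0], [5.0, 2.0, 30.0], size=(50, 3))
true = np.array([0.02, -0.01, 0.03, 0.10, -0.20, 0.05])  # small rotation + offset
P_lidar = Rotation.from_rotvec(true[:3]).inv().apply(P_cam - true[3:])

x0 = np.zeros(6)
sol = least_squares(residual_decoupled, x0, args=(P_lidar, P_cam))
print("estimated extrinsics (rvec, t):", sol.x)

Swapping residual_decoupled for residual_point in the call to least_squares reproduces the conventional point-to-point baseline that the paper argues ignores the sensors' complementary precision.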
