European Conference on Computer Vision

Relative Pose Estimation and Fusion of Omnidirectional and Lidar Cameras

Abstract

This paper presents a novel approach for the extrinsic parameter estimation of omnidirectional cameras with respect to a 3D Lidar coordinate frame. The method works without a specific setup or calibration targets, using only a pair of 2D-3D data. Pose estimation is formulated as a 2D-3D nonlinear shape registration task which is solved without point correspondences or complex similarity metrics. It relies on a set of corresponding regions, and the pose parameters are obtained by solving a small system of nonlinear equations. The efficiency and robustness of the proposed method were confirmed on both synthetic and real data in an urban environment.
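For illustration only, the sketch below shows one way a correspondence-free, region-based 2D-3D pose estimation of this kind could be set up in Python: corresponding regions are compared through low-order geometric moments instead of point matches, and the six pose parameters are recovered with a nonlinear least-squares solver. The spherical projection model, the moment descriptor, and all function names (project_to_sphere, region_moments, estimate_pose) are assumptions made for this sketch, not the authors' actual formulation.

```python
# Illustrative sketch of correspondence-free, region-based 2D-3D pose
# estimation. The projection model, the moment-based cost, and all names
# below are assumptions for illustration, not the paper's implementation.
import numpy as np
from scipy.optimize import least_squares
from scipy.spatial.transform import Rotation


def project_to_sphere(points_3d, rotvec, t):
    """Transform Lidar points into the camera frame and project them
    onto the unit sphere (a generic omnidirectional camera model)."""
    pts = Rotation.from_rotvec(rotvec).apply(points_3d) + t
    return pts / np.linalg.norm(pts, axis=1, keepdims=True)


def region_moments(points):
    """Low-order geometric moments (centroid and covariance) of a point
    set, used here as a correspondence-free region descriptor."""
    mean = points.mean(axis=0)
    centered = points - mean
    cov = centered.T @ centered / len(points)
    return np.concatenate([mean, cov.ravel()])


def residuals(pose, lidar_region, image_region_sphere):
    """Moment differences between the projected Lidar region and the
    corresponding image region lifted onto the unit sphere."""
    rotvec, t = pose[:3], pose[3:]
    projected = project_to_sphere(lidar_region, rotvec, t)
    return region_moments(projected) - region_moments(image_region_sphere)


def estimate_pose(lidar_regions, image_regions_sphere, pose0=None):
    """Solve the small nonlinear system stacked over all region pairs."""
    pose0 = np.zeros(6) if pose0 is None else pose0

    def stacked(pose):
        return np.concatenate([
            residuals(pose, lr, ir)
            for lr, ir in zip(lidar_regions, image_regions_sphere)
        ])

    result = least_squares(stacked, pose0, method="lm")
    return result.x[:3], result.x[3:]  # rotation vector, translation
```

Lifting the 2D image regions onto the unit sphere keeps both regions in one common domain, which is a common way to handle omnidirectional projections; the system of nonlinear equations actually solved in the paper may differ from this moment-matching sketch.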
