
Space Target Pose Estimation Based on a Rotational Projection Binary Descriptor

         

Abstract

To quickly estimate the relative pose of a space target from point cloud data, a Binary Rotational Projection Histogram (BRoPH) feature descriptor was proposed. First, a Local Reference Frame (LRF) is established at each feature point; then, density and depth image patches are generated under different views by rotationally projecting the local surface around the feature point; finally, a multi-scale binary string is produced for the feature point from these image patches. To meet the real-time requirement of pose estimation, a feature matching strategy based on a Hamming distance threshold, derived from an analysis of the distribution of BRoPH Hamming distances, was further proposed to exclude false matching pairs and accelerate the coarse pose estimation procedure. Comparison experiments were performed against the SHOT and FPFH descriptors. The results demonstrate that BRoPH achieves accurate pose estimation while requiring only about 1/80 of the average memory of the SHOT and FPFH descriptors: its average attitude error is below 0.1° and its average translation error is less than 1/180 R. In addition, the Hamming-distance-threshold matching strategy speeds up the subsequent RANSAC coarse estimation by a factor of 7, and the overall pose estimation rate exceeds 7 Hz, which is 3 to 6.8 times faster than SHOT and FPFH, respectively. The proposed descriptor is compact and efficient, and the pose estimation method is accurate and robust, satisfying the real-time requirements of point-cloud-based space target pose estimation.
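
The matching step described above, keeping only correspondences whose Hamming distance falls below a threshold before passing them to RANSAC, can be sketched as follows. This is a minimal illustration rather than the paper's implementation; the packed uint8 descriptor layout, the 256-bit length, and the threshold value are assumptions made for the example.

```python
# Minimal sketch (not the authors' code): Hamming-distance-threshold matching
# for binary descriptors such as BRoPH, assuming descriptors are stored as
# packed uint8 bit strings, one row per feature point.
import numpy as np

def hamming_matrix(desc_a: np.ndarray, desc_b: np.ndarray) -> np.ndarray:
    """Pairwise Hamming distances between two sets of packed binary descriptors."""
    # XOR every pair of descriptors, then count the set bits per byte.
    xor = desc_a[:, None, :] ^ desc_b[None, :, :]       # shape (Na, Nb, n_bytes)
    return np.unpackbits(xor, axis=-1).sum(axis=-1)     # shape (Na, Nb)

def threshold_matches(desc_a, desc_b, max_dist):
    """Keep only nearest-neighbour pairs whose Hamming distance is below the
    threshold; the surviving pairs seed the coarse (RANSAC) pose estimation."""
    dist = hamming_matrix(desc_a, desc_b)
    nn = dist.argmin(axis=1)
    keep = dist[np.arange(len(desc_a)), nn] <= max_dist
    return np.flatnonzero(keep), nn[keep]

# Hypothetical usage: 256-bit (32-byte) descriptors for model and scene clouds;
# the threshold 40 is an illustrative value, not one taken from the paper.
model = np.random.randint(0, 256, (100, 32), dtype=np.uint8)
scene = np.random.randint(0, 256, (120, 32), dtype=np.uint8)
idx_model, idx_scene = threshold_matches(model, scene, max_dist=40)
```

Because binary descriptors are compared with XOR and bit counting, this filter is cheap compared with nearest-neighbour search over real-valued descriptors such as SHOT or FPFH, which is consistent with the reported speed-up of the coarse estimation stage.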
