IEEE Transactions on Geoscience and Remote Sensing

Improving Sensor Fusion: A Parametric Method for the Geometric Coalignment of Airborne Hyperspectral and Lidar Data



Abstract

Synergistic applications based on integrated hyperspectral and lidar data are receiving a growing interest from the remote-sensing community. A prerequisite for the optimum sensor fusion of hyperspectral and lidar data is an accurate geometric coalignment. The simple unadjusted integration of lidar elevation and hyperspectral reflectance causes a substantial loss of information and does not exploit the full potential of both sensors. This paper presents a novel approach for the geometric coalignment of hyperspectral and lidar airborne data, based on their respective adopted return intensity information. The complete approach incorporates ray tracing and subpixel procedures in order to overcome grid-inherent discretization. It aims at the correction of extrinsic and intrinsic (camera resectioning) parameters of the hyperspectral sensor. In addition to a tie-point-based coregistration, we introduce a ray-tracing-based back projection of the lidar intensities for area-based cost aggregation. The approach consists of three processing steps. First is a coarse automatic tie-point-based boresight alignment. The second step coregisters the hyperspectral data to the lidar intensities. Third is a parametric coalignment refinement with an area-based cost aggregation. This hybrid approach of combining tie-point features and area-based cost aggregation methods for the parametric coregistration of hyperspectral intensity values to their corresponding lidar intensities results in a root-mean-square error of 1/3 pixel. It indicates that a highly integrated and stringent combination of different coalignment methods leads to an improvement of the multisensor coregistration.
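To make the "area-based cost aggregation" idea concrete, the sketch below scores candidate alignments of a hyperspectral intensity band against a lidar intensity grid with normalized cross-correlation and keeps the best-scoring candidate. It is a minimal illustration under strong assumptions, not the authors' implementation: the synthetic data, the pure pixel-translation search, and all names (`ncc`, `best_shift`) are hypothetical, whereas the paper optimizes extrinsic and intrinsic sensor parameters via ray-traced back projection with subpixel handling.

```python
# Minimal sketch of an area-based similarity cost (illustrative only).
import numpy as np

def ncc(a: np.ndarray, b: np.ndarray) -> float:
    """Normalized cross-correlation of two same-sized intensity patches."""
    a = a - a.mean()
    b = b - b.mean()
    denom = np.linalg.norm(a) * np.linalg.norm(b)
    return float(a.ravel() @ b.ravel() / denom) if denom else 0.0

rng = np.random.default_rng(0)
lidar = rng.random((200, 200))            # stand-in lidar intensity grid
true_dy, true_dx = 3, -2                  # synthetic misalignment to recover
hsi = np.roll(lidar, (true_dy, true_dx), axis=(0, 1)) + 0.05 * rng.random((200, 200))

best_shift, best_score = (0, 0), -np.inf
for dy in range(-5, 6):                   # exhaustive search over candidate offsets
    for dx in range(-5, 6):
        cand = np.roll(hsi, (-dy, -dx), axis=(0, 1))
        # Crop the borders so wrap-around pixels do not bias the score.
        score = ncc(cand[10:-10, 10:-10], lidar[10:-10, 10:-10])
        if score > best_score:
            best_shift, best_score = (dy, dx), score

print(f"recovered shift {best_shift} (true {(true_dy, true_dx)}), NCC={best_score:.3f}")
```

In the paper's setting the quantity being searched would be the boresight and camera-model parameters rather than an integer shift, but the cost term driving the refinement is of this area-based kind.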
