ISPRS Journal of Photogrammetry and Remote Sensing

Context-enhanced motion coherence modeling for global outlier rejection



Abstract

© 2023 International Society for Photogrammetry and Remote Sensing, Inc. (ISPRS). Feature matching is an essential step in a wide range of photogrammetry and computer vision tasks, but it is limited by the ambiguities of local descriptors. Hence, numerous false feature matches (outliers) are inevitably generated, especially in complex scenarios. Motion coherence can establish statistical relationships among sparse motions to remove outliers, under the assumption that true matches are coherent while false matches are randomly scattered. However, existing methods model motion coherence either within a local spatial context or without considering rich matching priors, leading to numerous matching failures when outlier rates are high. In this study, we propose a context-enhanced motion coherence modeling (CoMo) method to distinguish consistent correct motions from erroneous matches for robust outlier rejection. The CoMo method deploys a consistency-aware motion descriptor that encodes the consistency-related matching priors of feature matches as a high-dimensional representation, providing rich context for differentiating heterogeneous motions. Building on this discriminative descriptor, we further introduce the deformable affine transformation (DAT) as a proxy for motion and fit the coherent motions from the candidate matches with a globally smooth function under a truncated least squares estimation framework. Extensive experiments on multiple large datasets (including the Image Matching Challenge at CVPR 2020) demonstrate that CoMo effectively models motion coherence from noisy candidate matches and outperforms other state-of-the-art methods in outlier rejection and relative camera pose estimation. The code is available at https://github.com/geovsion/CoMo.
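The general idea behind the abstract — fit a globally smooth motion model to candidate matches and reject matches whose residuals exceed a truncation threshold — can be sketched in a minimal form. The sketch below is not the paper's CoMo/DAT method: it substitutes a plain affine model for the deformable affine transformation and anneals the truncation threshold in a simple truncated-least-squares loop. The function name `truncated_ls_affine`, the parameters `tau` and `iters`, and the synthetic data are illustrative assumptions.

```python
import numpy as np

def truncated_ls_affine(src, dst, tau=1.0, iters=6):
    """Fit an affine motion model to candidate matches, truncating
    matches whose residual exceeds an (annealed) threshold.
    src, dst: (N, 2) arrays of matched keypoint coordinates.
    Returns a boolean inlier mask of length N."""
    n = src.shape[0]
    A = np.hstack([src, np.ones((n, 1))])   # homogeneous source coords
    mask = np.ones(n, dtype=bool)           # start by trusting all matches
    # anneal the truncation threshold from loose to tight
    for t in np.geomspace(tau * 32, tau, iters):
        # least-squares affine fit on the currently trusted matches
        M, *_ = np.linalg.lstsq(A[mask], dst[mask], rcond=None)
        # residuals are evaluated on ALL matches, so points can re-enter
        residual = np.linalg.norm(A @ M - dst, axis=1)
        mask = residual < t                 # truncate large residuals
    return mask

# Synthetic example: a known affine motion plus random false matches.
rng = np.random.default_rng(0)
src = rng.uniform(0, 100, (50, 2))
M_true = np.array([[0.9, 0.1], [-0.1, 0.9]])
dst = src @ M_true + np.array([5.0, -3.0])
dst[:10] = rng.uniform(0, 100, (10, 2))    # first 10 matches are outliers
inliers = truncated_ls_affine(src, dst, tau=1.0)
```

Annealing the threshold (rather than applying the final `tau` immediately) keeps the first fit from discarding true matches when gross outliers bias the initial least-squares solution — a common trick in truncated/robust estimation.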
