IEEE International Conference on Computer Vision (ICCV) 2009

Robust matching of building facades under large viewpoint changes



Abstract

This paper presents a novel approach to finding point correspondences between images of building facades under wide viewpoint variations, while at the same time returning a large set of true matches between the images. Such images contain repetitive and symmetric patterns, which render popular algorithms such as SIFT ineffective. Feature descriptors such as SIFT that are based on region patches are also unstable under large viewing-angle variations. In this paper, we integrate both the appearance and the geometric properties of an image to find unique matches. First, we extract hypotheses of building facades using a robust line-fitting algorithm. Each hypothesis is defined by a planar convex quadrilateral in the image, which we call a "q-region", and the four corners of each q-region provide the inputs from which a projective transformation model is derived. Next, a set of interest points is extracted from the images and used to evaluate the correctness of each transformation model. The transformation model with the largest set of matched interest points is selected as the correct one; this model also returns the best pair of corresponding q-regions and the largest number of point correspondences between the two images. Extensive experimental results demonstrate the robustness of our approach, which achieves a tenfold increase in true matches compared to state-of-the-art techniques such as SIFT and MSER.
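The verification step described in the abstract — deriving a projective transformation from the four corners of a pair of q-regions and scoring it by how many interest-point matches it explains — can be sketched as follows. This is an illustrative reconstruction, not the authors' implementation; the function names and the 3-pixel inlier tolerance are assumptions.

```python
# Sketch: homography from 4 corner correspondences (DLT) + inlier counting.
import numpy as np

def homography_from_quad(src, dst):
    """Solve for the 3x3 homography H mapping the 4 src corners of a
    q-region to the 4 dst corners via the Direct Linear Transform."""
    A = []
    for (x, y), (u, v) in zip(src, dst):
        A.append([x, y, 1, 0, 0, 0, -u * x, -u * y, -u])
        A.append([0, 0, 0, x, y, 1, -v * x, -v * y, -v])
    # The null space of A (last right singular vector) gives H up to scale.
    _, _, Vt = np.linalg.svd(np.asarray(A, dtype=float))
    H = Vt[-1].reshape(3, 3)
    return H / H[2, 2]

def project(H, pts):
    """Apply H to 2-D points using homogeneous coordinates."""
    pts = np.asarray(pts, dtype=float)
    homog = np.column_stack([pts, np.ones(len(pts))])
    mapped = homog @ H.T
    return mapped[:, :2] / mapped[:, 2:3]

def count_inliers(H, pts1, pts2, tol=3.0):
    """Score a candidate model: interest points pts1 in image 1 whose
    projection lands within tol pixels of their match pts2 in image 2."""
    err = np.linalg.norm(project(H, pts1) - np.asarray(pts2, float), axis=1)
    return int(np.sum(err < tol))
```

Each q-region hypothesis pair yields one candidate `H`; the pair whose model maximizes `count_inliers` over the interest-point matches is kept, which simultaneously selects the corresponding facades and the final set of point correspondences.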
