Binary Constrained Deep Hashing Network for Image Retrieval Without Manual Annotation

Abstract

Learning compact binary codes for the image retrieval task using deep neural networks has attracted increasing attention recently. However, training deep hashing networks for the task is challenging due to the binary constraints on the hash codes, the similarity preserving property, and the requirement for a vast amount of labelled images. To the best of our knowledge, none of the existing methods has tackled all of these challenges completely in a unified framework. In this work, we propose a novel end-to-end deep learning approach for the task, in which the network is trained to produce binary codes directly from image pixels without the need of manual annotation. In particular, to deal with the non-smoothness of the binary constraints, we propose a novel pairwise constrained loss function, which simultaneously encodes the distances between pairs of hash codes and the binary quantization error. To train the network with the proposed loss function, we propose an efficient parameter learning algorithm. In addition, to provide similar/dissimilar training images for the network, we exploit 3D models reconstructed from unlabelled images to automatically generate an enormous number of training image pairs. Extensive experiments on image retrieval benchmark datasets demonstrate that the proposed method improves over state-of-the-art compact representation methods on the image retrieval problem.
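The abstract only sketches the loss at a high level. As a rough illustration of the idea it describes (distances between pairs of hash codes combined with a binary quantization error), the PyTorch snippet below shows a generic contrastive-style formulation over real-valued outputs relaxed from {-1, +1}. This is a minimal sketch, not the paper's exact loss or its parameter learning algorithm; the class name and the `margin` and `quant_weight` hyper-parameters are hypothetical choices introduced here for illustration.

```python
import torch
import torch.nn as nn


class PairwiseBinaryConstrainedLoss(nn.Module):
    """Generic sketch of a pairwise loss with a binary quantization penalty.

    Not the paper's exact formulation: `margin` and `quant_weight` are
    hypothetical hyper-parameters introduced here for illustration only.
    """

    def __init__(self, margin: float = 2.0, quant_weight: float = 0.1):
        super().__init__()
        self.margin = margin
        self.quant_weight = quant_weight

    def forward(self, h1: torch.Tensor, h2: torch.Tensor, sim: torch.Tensor) -> torch.Tensor:
        # h1, h2: (batch, code_length) real-valued relaxed hash codes from the network.
        # sim:    (batch,) with 1.0 for similar pairs and 0.0 for dissimilar pairs.
        dist = torch.sum((h1 - h2) ** 2, dim=1)
        # Pull similar codes together, push dissimilar codes at least `margin` apart.
        pair_term = sim * dist + (1.0 - sim) * torch.clamp(self.margin - dist, min=0.0)
        # Quantization error: penalise codes that are far from the binary values {-1, +1}.
        quant_term = torch.sum((h1.abs() - 1.0) ** 2, dim=1) + torch.sum((h2.abs() - 1.0) ** 2, dim=1)
        return torch.mean(pair_term + self.quant_weight * quant_term)
```

At retrieval time the binary codes would typically be obtained by taking the sign of the relaxed outputs; the optimisation of the actual binary-constrained objective is handled by the paper's dedicated parameter learning algorithm, which this sketch does not reproduce.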
