Pacific-Rim Conference on Multimedia

Hypergraph-Based Discrete Hashing Learning for Cross-Modal Retrieval



Abstract

Hashing has drawn increasing attention in cross-modal retrieval due to its high computational efficiency and low storage cost. However, previous cross-modal hashing methods cannot effectively represent the correlations between paired multi-modal instances. In this paper, we propose a novel Hypergraph-based Discrete Hashing (BGDH) method to address this limitation. We formulate a unified unsupervised hashing framework that simultaneously performs hypergraph learning and hash-code learning; the hypergraph learning component effectively preserves intra-media similarity consistency. Furthermore, we propose an efficient discrete optimization method that learns the hash codes directly, avoiding quantization information loss. Extensive experiments on three benchmark datasets demonstrate that the proposed approach outperforms state-of-the-art cross-modal hashing techniques.
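The efficiency claim in the abstract rests on a standard property of hash-based retrieval: once instances from either modality are mapped to compact binary codes, similarity search reduces to Hamming-distance ranking via bitwise XOR. The following minimal sketch illustrates that retrieval step only; the toy codes are hard-coded for illustration and are not produced by the paper's BGDH method, which would learn them from data.

```python
# Sketch of cross-modal retrieval over binary hash codes.
# Codes are stored as integers; Hamming distance = popcount of XOR.

def hamming(a: int, b: int) -> int:
    """Hamming distance between two codes stored as integer bit-strings."""
    return bin(a ^ b).count("1")

def retrieve(query_code: int, database_codes: list, k: int = 2) -> list:
    """Indices of the k database codes closest to the query in Hamming space."""
    ranked = sorted(range(len(database_codes)),
                    key=lambda i: hamming(query_code, database_codes[i]))
    return ranked[:k]

# Hypothetical 8-bit codes: an image-modality query against text-modality items.
query = 0b10110010
database = [0b10110011, 0b01001101, 0b10100010, 0b00011100]
print(retrieve(query, database))  # -> [0, 2]: items 0 and 2 differ in one bit
```

Because XOR and popcount are single machine instructions on modern hardware, this ranking scales to millions of items, which is the practical motivation for hashing-based cross-modal retrieval.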


