IEEE International Conference on Multimedia and Expo

Efficient Online Label Consistent Hashing for Large-Scale Cross-Modal Retrieval



Abstract

Existing cross-modal hashing still faces three challenges: (1) most batch-based methods are unsuitable for processing large-scale and streaming data; (2) current online methods often suffer from insufficient semantic association and lack the flexibility to learn hash functions for varying streaming data; (3) existing supervised methods either require substantial computation time or accumulate large quantization loss when learning hash codes. To address these challenges, we present an efficient Online Label Consistent Hashing (OLCH) method for cross-modal retrieval, which incrementally learns hash codes for newly arriving data while updating the hash functions in a streaming manner. Specifically, an online semantic representation learning framework is designed to adaptively preserve semantic similarity across different modalities, and a mini-batch online gradient descent approach combined with forward-backward splitting is developed to optimize the hash functions. Accordingly, the hash codes are adaptively learned online with high discriminative capability, while avoiding high computational complexity when processing the streaming data. Experimental results show its outstanding performance in comparison with the state of the art.
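
The abstract does not give the update equations, but the optimization it describes can be illustrated with a minimal sketch of a mini-batch online gradient descent step with forward-backward splitting for a linear hash projection. The quadratic fitting loss toward target codes, the L1 regularizer (whose proximal step is soft-thresholding), and all variable names below are illustrative assumptions, not the authors' implementation.

import numpy as np

def soft_threshold(W, tau):
    # Backward (proximal) step for an assumed L1 regularizer: soft-thresholding.
    return np.sign(W) * np.maximum(np.abs(W) - tau, 0.0)

def online_fbs_step(W, X_batch, B_batch, lr=0.01, lam=1e-3):
    # One forward-backward splitting update of a linear hash projection W.
    #   W       : (d, r) projection mapping d-dim features to r hash bits
    #   X_batch : (n, d) newly arriving mini-batch of features
    #   B_batch : (n, r) target codes (e.g. derived from label semantics)
    n = X_batch.shape[0]
    # Forward step: gradient of the assumed fitting loss ||X W - B||_F^2 / n.
    grad = (2.0 / n) * X_batch.T @ (X_batch @ W - B_batch)
    W = W - lr * grad
    # Backward step: proximal operator of the regularizer.
    return soft_threshold(W, lr * lam)

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    d, r = 128, 32                            # feature dimension, code length
    W = rng.normal(scale=0.01, size=(d, r))
    for _ in range(100):                      # stream of mini-batches
        X = rng.normal(size=(64, d))
        B = np.sign(rng.normal(size=(64, r))) # placeholder target codes
        W = online_fbs_step(W, X, B)
    codes = np.sign(X @ W)                    # binary codes for the latest batch

Because each step touches only the current mini-batch, the cost per update is independent of how much data has already streamed past, which is the property the abstract emphasizes for online learning.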
