Journal: Pattern Recognition: The Journal of the Pattern Recognition Society

Unsupervised hashing based on the recovery of subspace structures


Abstract

Unsupervised semantic hashing should, in principle, keep the semantics among samples consistent with the intrinsic geometric structures of the dataset. In this paper, we propose a novel multi-stage unsupervised hashing method for image retrieval, named "Unsupervised Hashing based on the Recovery of Subspace Structures" (RSSH). Specifically, we first adapt the Low-rank Representation (LRR) model into a new variant that treats real-world data as samples drawn from a union of several low-rank subspaces. Then, the pairwise similarities are represented in a space- and time-saving manner based on the low-rank correlation matrix learned by the modified LRR. Next, the challenging discrete graph hashing problem is solved to obtain binary hash codes. Notably, we convert the original graph hashing model into an optimization-friendly formulation, which is addressed with efficient closed-form solutions for its sub-problems. Finally, the devised linear hash functions are obtained efficiently for out-of-sample extension. Retrieval experiments on four image datasets demonstrate the superiority of RSSH over several state-of-the-art hashing models. Moreover, RSSH, a shallow model, significantly outperforms two recently proposed unsupervised deep hashing methods, which further confirms its effectiveness. (C) 2020 Elsevier Ltd. All rights reserved.
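The pipeline sketched in the abstract (low-rank coefficients, similarity construction, graph hashing, linear out-of-sample functions) can be illustrated with a minimal sketch. This is not the authors' algorithm: it substitutes a truncated-SVD closed form for the modified LRR (valid for noise-free union-of-subspaces data) and a spectral relaxation with sign-thresholding for the paper's discrete graph-hashing solver; all function and variable names are hypothetical.

```python
import numpy as np

def rssh_sketch(X, n_bits=16, rank=8):
    """Illustrative stand-in for the RSSH pipeline (hypothetical, simplified)."""
    # (1) Low-rank coefficient matrix. For clean data from a union of
    #     subspaces, the LRR minimizer has the closed form Z = U_r U_r^T,
    #     with U_r the top-`rank` left singular vectors of the sample matrix.
    U, _, _ = np.linalg.svd(X, full_matrices=False)
    Ur = U[:, :rank]                     # (n_samples, rank)
    Z = Ur @ Ur.T                        # (n_samples, n_samples)
    # (2) Symmetric pairwise similarity from the coefficients.
    S = (np.abs(Z) + np.abs(Z).T) / 2.0
    # (3) Spectral relaxation of graph hashing: smallest nontrivial
    #     eigenvectors of the graph Laplacian, sign-thresholded to bits.
    #     (The paper solves the *discrete* problem instead.)
    D = np.diag(S.sum(axis=1))
    L = D - S                            # graph Laplacian
    _, eigvecs = np.linalg.eigh(L)
    Y = eigvecs[:, 1:n_bits + 1]         # skip the trivial constant eigenvector
    B = np.where(Y >= 0, 1, -1)          # binary codes in {-1, +1}
    # (4) Linear hash functions for out-of-sample queries: least-squares
    #     fit of W so that sign(x @ W) approximates the learned codes.
    W, *_ = np.linalg.lstsq(X, B.astype(float), rcond=None)
    return B, W

rng = np.random.default_rng(0)
# Toy data: two 2-D subspaces embedded in a 10-D ambient space.
basis1 = rng.standard_normal((2, 10))
basis2 = rng.standard_normal((2, 10))
X = np.vstack([rng.standard_normal((30, 2)) @ basis1,
               rng.standard_normal((30, 2)) @ basis2])
B, W = rssh_sketch(X, n_bits=8, rank=4)
print(B.shape, W.shape)                  # (60, 8) (10, 8)
```

A query vector `q` would be hashed as `np.sign(q @ W)`, which is what makes the out-of-sample step fast: encoding costs only one matrix-vector product.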
