IEEE Transactions on Multimedia

Online Fast Adaptive Low-Rank Similarity Learning for Cross-Modal Retrieval


Abstract

The semantic similarity between cross-modal data objects, e.g., the similarity between images and texts, is recognized as the bottleneck of cross-modal retrieval. However, existing batch-style correlation learning methods suffer from prohibitive time complexity and extra memory consumption when handling large-scale, high-dimensional cross-modal data. In this paper, we propose a Cross-Modal Online Low-Rank Similarity function learning (CMOLRS) method, which learns a low-rank bilinear similarity measure for cross-modal retrieval. We model the cross-modal relations by relative similarities on training data triplets and formulate the relative relations as a convex hinge loss. By adapting the hinge-loss margin to pair-wise distances in feature space and label space, CMOLRS effectively captures multi-level semantic correlation and adapts to the content divergence among cross-modal data. Under a low-rank constraint, the similarity function is trained by online learning on the manifold of low-rank matrices. The low-rank constraint not only makes model learning faster and more scalable, but also improves the model's generalization. We further propose fast-CMOLRS, which combines multiple triplets per query instead of the standard process of using a single triplet at each model update step, further reducing the number of gradient updates and retractions. Extensive experiments are conducted on four public datasets, and comparisons with state-of-the-art methods show the effectiveness and efficiency of our approach.
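The core idea of the abstract — a bilinear similarity s(x, y) = xᵀWy trained online with a triplet hinge loss while W stays low-rank — can be sketched as follows. This is a minimal illustration, not the paper's algorithm: all dimensions, learning rates, and the factored parameterization W = UVᵀ (used here as a simple stand-in for the paper's retraction onto the low-rank manifold) are assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical dimensions: image features (dx), text features (dy),
# and a rank r << min(dx, dy) for the low-rank factorization.
dx, dy, r = 64, 48, 8
U = rng.normal(scale=0.1, size=(dx, r))  # W = U @ V.T has rank <= r
V = rng.normal(scale=0.1, size=(dy, r))

def similarity(x, y):
    """Bilinear similarity s(x, y) = x^T W y with W = U V^T."""
    return x @ U @ (V.T @ y)

def triplet_step(x, y_pos, y_neg, margin=1.0, lr=0.01):
    """One online update on a triplet (x, y_pos, y_neg) with the hinge
    loss max(0, margin - s(x, y_pos) + s(x, y_neg)).  Updating the
    factors U, V keeps W low-rank by construction -- a simplified
    stand-in for a retraction step on the low-rank matrix manifold."""
    global U, V
    loss = margin - similarity(x, y_pos) + similarity(x, y_neg)
    if loss <= 0:
        return 0.0                     # margin already satisfied
    d = y_pos - y_neg                  # dLoss/dW = -outer(x, d)
    grad_U = -np.outer(x, d) @ V       # chain rule through W = U V^T
    grad_V = -np.outer(d, x) @ U
    U -= lr * grad_U
    V -= lr * grad_V
    return loss
```

A few update steps on a fixed triplet push s(x, y_pos) above s(x, y_neg); the updates vanish once the margin is met, which is what makes the online scheme cheap per step.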
