Pattern Recognition Letters

CMIR-NET : A deep learning based model for cross-modal retrieval in remote sensing



Abstract

We address the problem of cross-modal information retrieval in the domain of remote sensing. In particular, we are interested in two application scenarios: i) cross-modal retrieval between panchromatic (PAN) and multispectral imagery, and ii) multi-label image retrieval between very high resolution (VHR) images and speech-based label annotations. These multi-modal retrieval scenarios are more challenging than traditional uni-modal retrieval given the inherent differences in distribution between the modalities. However, with the increasing availability of multi-source remote sensing data and the scarcity of semantic annotations, the task of multi-modal retrieval has recently become extremely important. In this regard, we propose a novel deep neural network-based architecture designed to learn a discriminative shared feature space for all the input modalities, suitable for semantically coherent information retrieval. Extensive experiments are carried out on the benchmark large-scale PAN-multispectral DSRSID dataset and the multi-label UC-Merced dataset. In addition, we generate a corpus of speech signals corresponding to the labels of the UC-Merced dataset. Superior performance with respect to the current state of the art is observed in all cases. (C) 2020 Elsevier B.V. All rights reserved.
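The core idea of the abstract — projecting two heterogeneous modalities into a single shared feature space and retrieving by nearest-neighbor search there — can be sketched minimally as follows. This is not the paper's actual architecture: the linear projections `W_pan` and `W_ms` are hypothetical stand-ins for the trained deep encoder branches, and the data is random, purely to illustrate the retrieval mechanics.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical linear "encoders" mapping each modality into a shared
# 16-dimensional space (placeholders for the paper's deep network branches).
D_PAN, D_MS, D_SHARED = 64, 32, 16
W_pan = rng.normal(size=(D_PAN, D_SHARED))
W_ms = rng.normal(size=(D_MS, D_SHARED))

def embed(x, W):
    """Project features into the shared space and L2-normalize,
    so that a dot product equals cosine similarity."""
    z = x @ W
    return z / np.linalg.norm(z, axis=-1, keepdims=True)

def retrieve(query_emb, gallery_emb, k=5):
    """Return indices of the k most similar gallery items."""
    sims = gallery_emb @ query_emb
    return np.argsort(-sims)[:k]

# Toy query: one PAN feature vector against 100 multispectral gallery items.
pan_query = rng.normal(size=D_PAN)
ms_gallery = rng.normal(size=(100, D_MS))

q = embed(pan_query, W_pan)
g = embed(ms_gallery, W_ms)
top5 = retrieve(q, g)
print(top5)  # indices of the 5 nearest multispectral images
```

In the paper's setting, the projections would be learned so that semantically matching PAN/multispectral pairs (or VHR images and their spoken labels) land close together in the shared space, making this cross-modal nearest-neighbor lookup semantically meaningful.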


