IEEE Transactions on Neural Networks and Learning Systems

Deep Coattention-Based Comparator for Relative Representation Learning in Person Re-Identification


Abstract

Person re-identification (re-ID) requires representations that remain discriminative on unseen shots in order to recognize identities across disjoint camera views. Effective methods have been developed via pair-wise similarity learning that detect a fixed set of region features, which are then mapped to a similarity value. However, the relevant parts of each image are detected independently, without reference to correlated regions in the other image. Moreover, region-based methods rely on spatially positioning local features to align their similarities. In this article, we introduce the deep coattention-based comparator (DCC), which fuses codependent representations of paired images so as to correlate their most relevant parts and produce relative representations accordingly. The proposed approach mimics human foveation: it detects distinct regions concurrently across both images and alternately attends to fuse them into the similarity learning. Our comparator learns representations relative to a test shot and is well suited to re-identifying pedestrians in surveillance footage. We perform extensive experiments to provide insights and demonstrate the state-of-the-art results achieved by our method on benchmark data sets: gains of 1.2 and 2.5 points in mean average precision (mAP) on DukeMTMC-reID and Market-1501, respectively.
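The coattention fusion described above can be illustrated with a minimal sketch. This is not the paper's implementation: the function and variable names are hypothetical, the affinity is a plain dot product, and mean-pooling plus cosine similarity stand in for the learned similarity head; only the cross-image attention pattern reflects the idea in the abstract.

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax along the given axis.
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def coattention_similarity(feat_a, feat_b):
    """Toy coattention-based comparison of two images' region features.

    feat_a: (N, d) region features of image A
    feat_b: (M, d) region features of image B
    Returns a scalar similarity score in [-1, 1].
    """
    # Affinity between every region pair across the two images.
    affinity = feat_a @ feat_b.T                 # (N, M)
    # Each image attends to the most relevant regions of the other.
    attn_a_to_b = softmax(affinity, axis=1)      # rows: A's regions over B
    attn_b_to_a = softmax(affinity.T, axis=1)    # rows: B's regions over A
    # Relative representations: each image expressed through the other's regions.
    b_given_a = attn_a_to_b @ feat_b             # (N, d)
    a_given_b = attn_b_to_a @ feat_a             # (M, d)
    # Fuse codependent representations; cosine similarity of the pooled
    # vectors stands in for the learned similarity head.
    va = np.concatenate([feat_a.mean(axis=0), b_given_a.mean(axis=0)])
    vb = np.concatenate([feat_b.mean(axis=0), a_given_b.mean(axis=0)])
    return float(va @ vb / (np.linalg.norm(va) * np.linalg.norm(vb) + 1e-8))
```

Comparing an image with itself yields a score of 1.0, while unrelated feature sets score lower; in the actual model the attention and the similarity head would be learned end to end rather than fixed as here.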
