Pattern Recognition: The Journal of the Pattern Recognition Society

Deep-Person: Learning discriminative deep features for person Re-Identification


Abstract

Person re-identification (Re-ID) requires discriminative features that focus on the full person to cope with inaccurate person bounding-box detection, background clutter, and occlusion. Many recent person Re-ID methods attempt to learn such features, describing full-person details, via part-based feature representation. However, the spatial context between these parts is ignored because an independent extractor operates on each separate part. In this paper, we propose to apply Long Short-Term Memory (LSTM) in an end-to-end way to model the pedestrian as a sequence of body parts from head to foot. Integrating this contextual information strengthens the discriminative ability of the local features, aligning them better with the full person. We also leverage the complementary information between local and global features. Furthermore, we integrate both an identification task and a ranking task in one network, where a discriminative embedding and a similarity measurement are learned concurrently. This results in a novel three-branch framework named Deep-Person, which learns highly discriminative features for person Re-ID. Experimental results demonstrate that Deep-Person outperforms the state-of-the-art methods by a large margin on three challenging datasets: Market-1501, CUHK03, and DukeMTMC-reID. (C) 2019 Elsevier Ltd. All rights reserved.
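The head-to-foot part-sequence modeling described in the abstract can be sketched in pure Python as follows. This is a minimal illustration, not the paper's implementation: the feature dimensions, the random weight initialization, averaging the part hidden states as the local branch, and fusing local and global branches by concatenation are all assumptions made here for demonstration.

```python
import math
import random

random.seed(0)

HIDDEN = 8   # hypothetical LSTM hidden size
FEAT = 16    # hypothetical per-part feature size
PARTS = 6    # body-part strips, ordered head to foot

def init_weights(rows, cols):
    return [[random.uniform(-0.1, 0.1) for _ in range(cols)] for _ in range(rows)]

def matvec(W, v):
    return [sum(w * x for w, x in zip(row, v)) for row in W]

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

class LSTMCell:
    """Minimal LSTM cell: four gates computed from the concatenated [x; h] vector."""
    def __init__(self, in_dim, hid):
        self.hid = hid
        self.W = init_weights(4 * hid, in_dim + hid)  # i, f, o, g gates stacked
        self.b = [0.0] * (4 * hid)

    def step(self, x, h, c):
        z = [zi + bi for zi, bi in zip(matvec(self.W, x + h), self.b)]
        H = self.hid
        i = [sigmoid(v) for v in z[0:H]]          # input gate
        f = [sigmoid(v) for v in z[H:2*H]]        # forget gate
        o = [sigmoid(v) for v in z[2*H:3*H]]      # output gate
        g = [math.tanh(v) for v in z[3*H:4*H]]    # candidate cell state
        c_new = [fi*ci + ii*gi for fi, ci, ii, gi in zip(f, c, i, g)]
        h_new = [oi*math.tanh(ci) for oi, ci in zip(o, c_new)]
        return h_new, c_new

def part_sequence_descriptor(parts, global_feat, cell):
    # Run the LSTM over the part sequence so each part's feature
    # absorbs spatial context from the parts above it.
    h, c = [0.0]*cell.hid, [0.0]*cell.hid
    hiddens = []
    for p in parts:
        h, c = cell.step(p, h, c)
        hiddens.append(h)
    # Local branch: average of the context-aware part states.
    local = [sum(hs[k] for hs in hiddens) / len(hiddens) for k in range(cell.hid)]
    # Fuse the complementary local and global branches by concatenation.
    return local + global_feat

parts = [[random.uniform(-1, 1) for _ in range(FEAT)] for _ in range(PARTS)]
global_feat = [random.uniform(-1, 1) for _ in range(FEAT)]
cell = LSTMCell(FEAT, HIDDEN)
desc = part_sequence_descriptor(parts, global_feat, cell)
print(len(desc))  # HIDDEN + FEAT
```

In the actual framework the part features would come from a shared CNN backbone and the weights would be trained end-to-end; here random weights merely show the data flow.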
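The abstract's joint objective, an identification (classification) task and a ranking task learned concurrently, can be sketched as a combined loss. The softmax cross-entropy stands in for the identification branch and a margin-based triplet loss for the ranking branch; the margin value and the toy feature vectors below are illustrative assumptions, not values from the paper.

```python
import math

def softmax_ce(logits, label):
    """Identification branch: cross-entropy over identity logits."""
    m = max(logits)                       # subtract max for numerical stability
    exps = [math.exp(z - m) for z in logits]
    return -math.log(exps[label] / sum(exps))

def euclidean(a, b):
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def triplet_loss(anchor, positive, negative, margin=0.3):
    """Ranking branch: pull same-identity pairs closer than cross-identity
    pairs by at least `margin` (hinge on the distance difference)."""
    return max(0.0, euclidean(anchor, positive) - euclidean(anchor, negative) + margin)

# Toy example: logits for 3 identities and three embedded samples.
logits = [2.0, 0.5, -1.0]
anchor, pos, neg = [0.0, 1.0], [0.1, 0.9], [1.0, 0.0]

# Combined objective, as in training both tasks in one network.
loss = softmax_ce(logits, 0) + triplet_loss(anchor, pos, neg)
print(round(loss, 4))
```

For this well-separated triplet the ranking term is already zero, so the identification term dominates; during training both terms shape the shared embedding simultaneously.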
