IEEE/CVF Conference on Computer Vision and Pattern Recognition

Relation-Aware Global Attention for Person Re-Identification


Abstract

For person re-identification (re-id), attention mechanisms have become attractive as they aim at strengthening discriminative features and suppressing irrelevant ones, which matches well the key of re-id, i.e., discriminative feature learning. Previous approaches typically learn attention using local convolutions, ignoring the mining of knowledge from global structure patterns. Intuitively, the affinities among spatial positions/nodes in the feature map provide clustering-like information and are helpful for inferring semantics and thus attention, especially for person images where the feasible human poses are constrained. In this work, we propose an effective Relation-Aware Global Attention (RGA) module which captures the global structural information for better attention learning. Specifically, for each feature position, in order to compactly grasp the structural information of global scope and local appearance information, we propose to stack the relations, i.e., its pairwise correlations/affinities with all the feature positions (e.g., in raster scan order), together with the feature itself, and learn the attention with a shallow convolutional model. Extensive ablation studies demonstrate that our RGA can significantly enhance the feature representation power and help achieve state-of-the-art performance on several popular benchmarks. The source code is available at https://github.com/microsoft/Relation-Aware-Global-Attention-Networks.
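The relation-stacking step described in the abstract can be sketched as follows. This is a minimal NumPy illustration, not the authors' implementation: it computes pairwise affinities as raw dot products (the paper uses learned embeddings, which makes the relations asymmetric) and omits the shallow convolutional model that maps each stacked descriptor to an attention value. The function name and shapes are illustrative assumptions.

```python
import numpy as np

def relation_aware_descriptor(feat):
    """Sketch of RGA's relation stacking for spatial attention.

    feat: (H, W, C) feature map. For each of the N = H * W positions,
    its pairwise affinities with all positions (in raster-scan order)
    are stacked, in both directions, together with the feature itself,
    giving an (N, C + 2N) descriptor per position. A shallow conv
    model (omitted here) would then map each descriptor to a scalar
    attention value.
    """
    H, W, C = feat.shape
    x = feat.reshape(-1, C)            # (N, C), raster-scan order
    affinity = x @ x.T                 # (N, N) pairwise relations
    # Stack outgoing and incoming relations with the feature itself.
    # Note: with raw dot products the two directions coincide; in the
    # paper, embedded relations r(i, j) and r(j, i) differ.
    return np.concatenate([x, affinity, affinity.T], axis=1)  # (N, C + 2N)
```

With a toy 4x3 feature map of 8 channels, each of the 12 positions gets a descriptor of length 8 + 2 * 12 = 32, compactly encoding both its local appearance and its global structural relations.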
