
Person Re-identification Based on Bidirectional Relationship Similarity Function Learning


Abstract

Current person re-identification methods based on metric learning typically learn a Mahalanobis Similarity Function (MSF) to estimate the similarity of a pair of person images. However, the MSF only projects a pair of persons into the feature difference space and ignores the appearance of each individual. To address this problem, this paper proposes to learn a Bidirectional Relationship Similarity Function (BRSF) to compute the similarity of a pair of person images, which greatly strengthens the modeling ability of the similarity function. BRSF not only represents the cross-correlation relationship of a pair of persons, but also describes their auto-correlation relationship. The similarity function is learned following the idea of the Keep It Simple and Straightforward Metric (KISSME) algorithm: the auto-correlation and cross-correlation relationships of a pair of sample features are modeled by Gaussian distributions, and the ratio of these Gaussian distributions is then converted into the BRSF form, yielding a similarity function that is robust to changes in background, viewpoint, and posture. The proposed method is evaluated on two public person re-identification benchmarks, VIPeR and QMUL GRID, and the experimental results show that it achieves higher re-identification rates than comparable algorithms. In particular, on the VIPeR dataset with half of the data sampled as training, the proposed method reaches 53.21% at Rank-1 (the rate of correctly matched pairs at the first rank).
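The derivation is not reproduced on this page, but a KISSME-style construction that produces both auto-correlation and cross-correlation terms can be sketched as follows. This is a minimal sketch, assuming the pair of features is modeled by joint zero-mean Gaussians over the stacked vector under the similar-pair and dissimilar-pair hypotheses; the block matrices A, B, C are illustrative names, not notation taken from the paper.

\[
z = \begin{bmatrix} x \\ y \end{bmatrix}, \qquad
f(x, y) \;=\; \log \frac{p(z \mid \mathcal{H}_S)}{p(z \mid \mathcal{H}_D)}
\;\propto\; z^{\top}\!\left( \Sigma_D^{-1} - \Sigma_S^{-1} \right) z,
\]

\[
\Sigma_D^{-1} - \Sigma_S^{-1} =
\begin{bmatrix} A & C \\ C^{\top} & B \end{bmatrix}
\;\;\Longrightarrow\;\;
f(x, y) = x^{\top} A x + y^{\top} B y + 2\, x^{\top} C y + \text{const},
\]

where \(\Sigma_S\) and \(\Sigma_D\) denote the covariances of stacked similar and dissimilar pairs, \(x^{\top} A x\) and \(y^{\top} B y\) are the auto-correlation terms, and \(x^{\top} C y\) is the cross-correlation term, matching the bidirectional form described in the abstract.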
