
An Efficient Person ReID Method Based on Knowledge Distillation

Abstract

Existing Person ReID (person re-identification) approaches often consider only how to improve the model's generalization performance, which unfortunately comes at the cost of massive computation and storage consumption. Although such large neural networks are powerful and achieve excellent results on many tasks, they are too large to deploy on edge devices such as smartphones and security cameras. Recently, [5] (Hinton, Vinyals, and Dean 2014) showed that the dark knowledge within a powerful, massive neural network (the teacher) can significantly help the training of a smaller, faster network (the student) and improve the student's performance. In this work, we investigate a knowledge distillation strategy for training small Person ReID networks with the help of large ones. To this end, we present an Efficient Person ReID model, obtained by effectively transferring the dark knowledge of a powerful teacher network. Extensive experiments on the standard Person ReID dataset DukeMTMC substantiate the effectiveness of our proposed approach. In particular, our efficient ReID model achieves Rank-1/mAP = 86.8%/72.9% on DukeMTMC with only a 256-dimensional feature vector and only 3.4M parameters.
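The distillation objective referenced above follows the temperature-scaled soft-target loss of Hinton, Vinyals, and Dean (2014): the teacher's logits are softened with a temperature T, and the student is trained to match the resulting distribution via a KL-divergence term scaled by T². The sketch below is a minimal, generic illustration of that loss; the function names are illustrative and the paper's actual training pipeline (networks, hard-label term, weighting) is not shown.

```python
import math

def softmax(logits, T=1.0):
    # Temperature-scaled softmax: higher T yields a softer distribution,
    # exposing the teacher's "dark knowledge" about non-target classes.
    exps = [math.exp(z / T) for z in logits]
    total = sum(exps)
    return [e / total for e in exps]

def distillation_loss(student_logits, teacher_logits, T=4.0):
    # KL(teacher || student) on temperature-softened distributions,
    # scaled by T^2 so gradients keep a consistent magnitude across T
    # (as in Hinton et al. 2014). Returns 0 when the two match exactly.
    p = softmax(teacher_logits, T)  # soft teacher targets
    q = softmax(student_logits, T)  # soft student predictions
    kl = sum(pi * math.log(pi / qi) for pi, qi in zip(p, q))
    return (T ** 2) * kl

# A matching student incurs zero loss; any mismatch gives a positive loss.
same = distillation_loss([1.0, 2.0, 3.0], [1.0, 2.0, 3.0])
diff = distillation_loss([3.0, 2.0, 1.0], [1.0, 2.0, 3.0])
```

In practice this soft-target term is combined with the usual cross-entropy on ground-truth labels, with a weight balancing the two.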
