International Conference on Pattern Recognition

PHNet: Parasite-Host Network for Video Crowd Counting



Abstract

Crowd counting plays an increasingly important role in public security. Recently, many crowd counting methods for a single image have been proposed, but few studies have focused on using temporal information from video image sequences to improve prediction performance. In existing video-based crowd estimation methods, temporal features and spatial features are modeled jointly for the prediction, which makes the model less efficient at extracting spatiotemporal features and limits prediction performance. To solve these problems, this paper proposes a Parasite-Host Network (PHNet), composed of a Parasite branch and a Host branch that extract temporal features and spatial features respectively. To specifically extract the transform features in the time domain, we propose a novel architecture termed the "Relational Extractor" (RE), which models the multiplicative interaction features of adjacent frames. In addition, the Host branch extracts the spatial features from the current frame and can be replaced with any model that uses a single image for the prediction. We evaluated PHNet on four video crowd counting benchmarks: Venice, UCSD, FDST and CrowdFlow. Experimental results show that PHNet achieves performance superior to state-of-the-art methods on all four datasets. The code is at https://github.com/LeeJAJA/PHNet-pytorch.
