International Conference on Control, Automation, Robotics Vision

Batch-Normalization-based Soft Filter Pruning for Deep Convolutional Neural Networks



Abstract

As convolutional neural networks contain many redundant parameters, many methods have been developed to compress networks and accelerate inference. Among these, network pruning, a widely used approach, can effectively decrease memory capacity and reduce computation cost. Herein, we propose a competitive pruning approach based on Soft Filter Pruning (SFP) that uses the scaling factors γ of Batch Normalization (BN) layers as the criterion for the filter selection strategy. During the soft pruning procedure, in each epoch only the γ values of BN layers that fall below a threshold are set to zero, instead of setting the weights of the selected filters in convolutional layers to zero. Compared to existing approaches, the proposed method achieves notably higher accuracy on image recognition. In particular, on CIFAR-10 with ResNet-110, the proposed method reduces FLOPs by the same 40.8% as SFP while improving top-1 accuracy by 0.87%.
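The per-epoch selection step described above can be sketched as follows. This is an illustrative simplification, not the authors' implementation: it takes the list of per-channel BN scaling factors γ for one layer and zeroes the smallest-magnitude fraction of them, while keeping every channel in place so pruned filters can recover in later epochs (the "soft" aspect of SFP). The function name and pruning ratio are assumptions for illustration.

```python
# Hedged sketch of BN-based soft filter pruning for one layer (illustrative only).
# gammas: per-channel BN scaling factors; prune_ratio: fraction to prune per epoch.
def soft_prune_bn(gammas, prune_ratio):
    """Zero the smallest-|gamma| channels; list length (channel count) is unchanged."""
    n_prune = int(len(gammas) * prune_ratio)
    if n_prune == 0:
        return list(gammas)
    # Threshold = magnitude of the n_prune-th smallest |gamma| in this layer.
    threshold = sorted(abs(g) for g in gammas)[n_prune - 1]
    # Soft pruning: gamma is zeroed, but the filter itself remains in the network,
    # so gradient updates in the next epoch may revive the channel.
    return [0.0 if abs(g) <= threshold else g for g in gammas]

# Example: pruning 40% of 5 channels zeroes the 2 smallest-magnitude gammas.
gammas = [0.9, 0.05, 0.7, -0.02, 0.3]
pruned = soft_prune_bn(gammas, 0.4)
print(pruned)  # [0.9, 0.0, 0.7, 0.0, 0.3]
```

Because only γ is zeroed (the corresponding convolutional filter's weights are untouched), the forward pass output of those channels becomes zero while the network's structure, and the filter's ability to be re-selected later, is preserved.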
