Channel pruning based on mean gradient for accelerating Convolutional Neural Networks


Abstract

Convolutional Neural Networks (CNNs) are getting deeper and wider to improve their performance, which in turn increases their computational complexity. We apply channel pruning to accelerate CNNs and reduce their computational cost. A new pruning criterion is proposed based on the mean gradient of convolutional kernels. To significantly reduce the Floating-Point Operations (FLOPs) of CNNs, a hierarchical global pruning strategy is introduced. In each pruning step, the importance of convolutional kernels is evaluated by the mean gradient criterion, and the hierarchical global pruning strategy removes the less important kernels, yielding a smaller CNN model. Finally, we fine-tune the pruned model to restore network performance. Experimental results show that a VGG-16 network pruned on CIFAR-10 achieves a 5.64× reduction in FLOPs with less than 1% decrease in accuracy. Meanwhile, a ResNet-110 network pruned on CIFAR-10 achieves a 2.48× reduction in FLOPs and parameters with only a 0.08% decrease in accuracy. (C) 2018 Elsevier B.V. All rights reserved.
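The abstract does not give the exact formula for the mean-gradient criterion or the hierarchical grouping, so the following is a minimal PyTorch sketch of one plausible reading: each convolutional kernel is scored by the mean absolute gradient of its weights over a few batches, and the globally lowest-scoring channels are marked for removal. The function names (`mean_gradient_scores`, `channels_to_prune`) and the single global threshold, used here in place of the paper's hierarchical strategy, are illustrative assumptions, not the authors' implementation.

```python
import torch
import torch.nn as nn

def mean_gradient_scores(model, data_loader, loss_fn, device="cpu"):
    """Score each Conv2d output kernel by the mean absolute gradient of its weights."""
    model.to(device).train()
    model.zero_grad()
    for inputs, targets in data_loader:
        inputs, targets = inputs.to(device), targets.to(device)
        loss_fn(model(inputs), targets).backward()  # gradients accumulate across batches
    scores = {}
    for name, module in model.named_modules():
        if isinstance(module, nn.Conv2d) and module.weight.grad is not None:
            g = module.weight.grad.abs()          # shape: (out_ch, in_ch, kH, kW)
            scores[name] = g.mean(dim=(1, 2, 3))  # one importance score per kernel
    return scores

def channels_to_prune(scores, ratio=0.1):
    """Mark the globally lowest-scoring fraction `ratio` of channels for removal.

    Note: the paper uses a hierarchical global strategy; a single global
    threshold is an assumed simplification for illustration.
    """
    flat = torch.cat(list(scores.values()))
    k = max(1, int(ratio * flat.numel()))
    threshold = flat.kthvalue(k).values
    return {name: (s <= threshold).nonzero(as_tuple=True)[0].tolist()
            for name, s in scores.items()}
```

In the pipeline the abstract describes, the marked channels would then be physically removed (with the next layer's input channels adjusted accordingly) and the smaller network fine-tuned to recover accuracy.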
