
Slimming ResNet by Slimming Shortcut


Abstract

Conventional network pruning methods for convolutional neural networks (CNNs) reduce the number of input or output channels of convolution layers. With these approaches, the channels of a plain network can be pruned without any restrictions. However, in ResNet-based networks, which contain shortcuts (skip connections), the channel slimming of existing pruning methods is limited to the inside of each residual block. Since the numbers of FLOPs and parameters are also strongly related to the number of channels in the shortcuts, further investigation of pruning shortcut channels is required. In this paper, we propose a novel pruning method, Slimming Shortcut Pruning (SSPruning), for pruning channels in the shortcuts of ResNet-based networks. First, we separate the long shortcut into individual regions that can be pruned independently, without considering its long-range connections. Then, by applying our Importance Learning Gate (ILG), which learns the importance of channels globally regardless of channel type and location (i.e., in the shortcut or inside a block), we obtain an optimally pruned model. Through various experiments, we have confirmed that our method yields outstanding results when the shortcuts and the insides of the blocks are pruned together.
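The global selection step described above can be sketched in a few lines: given per-layer channel-importance scores (such as those a learned gate like the ILG would produce), channels are kept by comparing every score against a single global threshold rather than a per-layer one. This is a minimal illustrative sketch, not the paper's implementation; the function and variable names are hypothetical.

```python
import numpy as np

def prune_globally(scores_per_layer, keep_ratio):
    """Keep the top `keep_ratio` fraction of channels globally.

    `scores_per_layer` is a list of 1-D arrays of channel-importance
    scores, one array per prunable layer (shortcut channels and
    in-block channels are treated alike, as in global pruning).
    Returns a boolean keep-mask per layer.
    """
    all_scores = np.concatenate(scores_per_layer)
    k = max(1, int(round(keep_ratio * all_scores.size)))
    # Global threshold: the k-th largest score across all layers.
    threshold = np.sort(all_scores)[::-1][k - 1]
    return [s >= threshold for s in scores_per_layer]

# Example: two layers with 4 channels each; keep half of all channels.
scores = [np.array([0.9, 0.1, 0.5, 0.05]),
          np.array([0.8, 0.2, 0.6, 0.3])]
masks = prune_globally(scores, keep_ratio=0.5)
```

Because the threshold is shared across layers, a layer whose channels are uniformly unimportant loses more channels than one whose channels all score highly, which is the point of learning importance globally rather than per block.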
