Neurocomputing
Consecutive layer collaborative filter similarity for differentiable neural network pruning

Abstract

Filter pruning has proven to be an effective strategy for model compression. However, convolutional filter pruning methods usually evaluate filter importance within a single layer, ignoring the collaborative relationship with the corresponding filters of the next layer. In this paper, we propose a novel consecutive layer collaborative filter similarity (CLCS) to make full use of the complete filter information and learn binary selection vectors that prune redundant filters automatically. With the learned selection vectors, the pruning ratio of each layer can be determined, and the FLOPs of the candidate pruned network at the current stage can be calculated. Under the accuracy constraint and the FLOPs constraint, the selection vectors of each layer are optimized to achieve a better trade-off between accuracy and efficiency. Extensive experiments on CIFAR-10 and ImageNet with multiple networks demonstrate the effectiveness of our proposed method. Specifically, we obtain 54.29% and 67.33% FLOPs reduction with 0.01% and 0.09% accuracy improvement for ResNet-56 and ResNet-110 on CIFAR-10, respectively. On ImageNet, we reduce FLOPs by nearly half compared to the ResNet-50 baseline with almost no loss of accuracy. Compared with state-of-the-art filter pruning methods, our approach also achieves superior results. (c) 2023 Elsevier B.V. All rights reserved.
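The abstract only summarizes the method at a high level. As a rough illustration of the consecutive-layer idea, the sketch below pairs each filter of layer l with the input-channel slice of layer l+1 that consumes its output, measures pairwise similarity over these joint descriptors, and attaches a relaxed selection vector with a FLOPs penalty. The helper names, the cosine-similarity metric, the sigmoid relaxation, and the per-layer FLOPs estimate are all illustrative assumptions, not the paper's exact formulation.

```python
# Minimal sketch of consecutive-layer filter descriptors and a relaxed
# selection vector under a FLOPs budget. Assumptions: cosine similarity,
# sigmoid relaxation, and a simplified keep-ratio FLOPs estimate.
import torch
import torch.nn as nn
import torch.nn.functional as F


def consecutive_layer_descriptors(conv_l: nn.Conv2d, conv_next: nn.Conv2d) -> torch.Tensor:
    """One descriptor per output filter of `conv_l`: the filter itself
    concatenated with the slice of `conv_next` that reads its output channel."""
    w_l = conv_l.weight.detach()        # (C_out, C_in, k, k)
    w_next = conv_next.weight.detach()  # (C_next, C_out, k', k')
    c_out = w_l.shape[0]
    own = w_l.reshape(c_out, -1)                        # filter weights at layer l
    collab = w_next.transpose(0, 1).reshape(c_out, -1)  # matching input-channel slices at layer l+1
    return torch.cat([own, collab], dim=1)              # (C_out, D)


def pairwise_similarity(desc: torch.Tensor) -> torch.Tensor:
    """Cosine similarity between all pairs of joint descriptors;
    highly similar filters are candidates for pruning."""
    normed = F.normalize(desc, dim=1)
    return normed @ normed.t()                          # (C_out, C_out)


class FilterSelector(nn.Module):
    """Relaxed binary selection vector for one layer (hypothetical sigmoid relaxation)."""

    def __init__(self, num_filters: int):
        super().__init__()
        self.logits = nn.Parameter(torch.zeros(num_filters))

    def forward(self) -> torch.Tensor:
        return torch.sigmoid(self.logits)                  # soft mask in (0, 1) during training

    def hard_mask(self) -> torch.Tensor:
        return (torch.sigmoid(self.logits) > 0.5).float()  # binary mask when pruning


def flops_penalty(selectors, layer_flops, flops_budget):
    """Penalize the expected FLOPs of the candidate pruned network above a budget.
    `layer_flops[i]` is the unpruned FLOPs of layer i, scaled by its keep ratio."""
    expected = sum(f * s().mean() for f, s in zip(layer_flops, selectors))
    return torch.relu(expected - flops_budget)
```

In such a setup, the FLOPs penalty would be added to the task loss so that the selection vectors are optimized jointly for accuracy and efficiency, and the hard masks would determine each layer's final pruning ratio.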
