
Least-Squares Based Layerwise Pruning Of Convolutional Neural Networks



Abstract

In this paper, we propose a new layerwise pruning method to reduce the number of computations needed to evaluate convolutional neural networks (CNNs) after training. This least-squares (LS) based pruning method improves on state-of-the-art pruning methods because it solves two problems jointly: how to select the feature maps to be pruned and how to adapt the remaining parameters in the kernel tensor to compensate for the introduced pruning errors. Our method therefore exploits both the correlations between the input feature maps and the structure of the kernel tensor. In experiments, we show that our pruning method achieves high reduction rates with only a small performance degradation and that it performs significantly better than low-rank factorization methods.
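To make the idea concrete, the following is a minimal numpy sketch of the general least-squares layerwise pruning idea described in the abstract: one layer is viewed as a linear map on unrolled activations, a subset of input feature maps is selected to keep, and the remaining weights are refit by least squares so the pruned layer reproduces the original output. The shapes, the greedy forward-selection rule, and all names are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

# Illustrative sketch (not the paper's algorithm): prune input channels of one
# layer and re-fit the surviving kernel weights by least squares so that the
# layer's original output Y is preserved as well as possible.

rng = np.random.default_rng(0)

n_samples, n_in, n_out = 1024, 32, 16        # unrolled activations x channels (assumed sizes)
X = rng.standard_normal((n_samples, n_in))   # input feature maps, flattened per sample
W = rng.standard_normal((n_in, n_out))       # original kernel, flattened linear view
Y = X @ W                                    # original layer output to preserve


def prune_and_refit(X, Y, n_keep):
    """Greedily pick input channels to keep and refit weights via least squares.

    The greedy selection rule is one possible choice; the selection criterion
    and the joint formulation in the paper may differ.
    """
    n_in = X.shape[1]
    keep, remaining = [], list(range(n_in))
    for _ in range(n_keep):
        best_c, best_err = None, np.inf
        for c in remaining:
            cols = keep + [c]
            # LS refit of the remaining weights for this candidate channel set.
            W_ls, *_ = np.linalg.lstsq(X[:, cols], Y, rcond=None)
            err = np.linalg.norm(X[:, cols] @ W_ls - Y)
            if err < best_err:
                best_c, best_err = c, err
        keep.append(best_c)
        remaining.remove(best_c)
    # Final least-squares refit on the selected channels.
    W_new, *_ = np.linalg.lstsq(X[:, keep], Y, rcond=None)
    return keep, W_new


keep, W_new = prune_and_refit(X, Y, n_keep=24)
rel_err = np.linalg.norm(X[:, keep] @ W_new - Y) / np.linalg.norm(Y)
print(f"kept {len(keep)}/{X.shape[1]} channels, relative output error {rel_err:.3f}")
```

Because the refit uses the correlations in X, pruned channels whose information is carried by the kept ones can be compensated, which is the intuition behind combining selection and least-squares adaptation in one step.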
