
Progressive principle component analysis for compressing deep convolutional neural networks

Abstract

In this work, we propose a progressive principal component analysis (PPCA) method for compressing deep convolutional neural networks. The proposed method starts with a prespecified layer and progressively moves on to the final output layer. For each target layer, PPCA conducts kernel principal component analysis for the estimated kernel weights. This leads to a significant reduction in the number of kernels in the current layer. As a consequence, the channels used for the next layer are also reduced substantially. This is because the number of kernels used in the current layer determines the number of channels for the next layer. For convenience, we refer to this as a progressive effect. As a consequence, the entire model structure can be substantially compressed, and both the number of parameters and the inference costs can be substantially reduced. Meanwhile, the prediction accuracy remains very competitive with respect to that of the baseline model. The effectiveness of the proposed method is evaluated on a number of classical CNNs (AlexNet, VGGNet, ResNet and MobileNet) and benchmark datasets. The empirical findings are very encouraging. The code is available at https://github.com/zhoujing89/ppca. (c) 2021 Published by Elsevier B.V.
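The authors' implementation is at the repository linked above; the following is only a minimal sketch of the kernel-wise reduction idea the abstract describes, not the paper's code. It assumes a PyTorch `nn.Conv2d` with `groups=1`, treats each of the layer's output kernels as one sample, applies uncentered PCA (plain SVD) to the flattened kernels, and keeps the smallest number of components reaching a chosen energy threshold. The function name `pca_compress_conv` and the `energy` parameter are hypothetical.

```python
import numpy as np
import torch
import torch.nn as nn

def pca_compress_conv(conv: nn.Conv2d, energy: float = 0.95) -> nn.Sequential:
    """Approximate `conv` by a conv with fewer kernels plus a 1x1 projection.

    Each of the C_out kernels (flattened to C_in*kH*kW values) is one PCA
    sample; uncentered PCA via SVD keeps the smallest k whose singular
    values capture `energy` of the total spectral energy. Assumes groups=1.
    """
    W = conv.weight.detach().cpu().numpy()            # (C_out, C_in, kH, kW)
    c_out = W.shape[0]
    flat = W.reshape(c_out, -1)                       # one row per kernel
    U, S, Vt = np.linalg.svd(flat, full_matrices=False)
    ratio = np.cumsum(S ** 2) / np.sum(S ** 2)
    k = min(int(np.searchsorted(ratio, energy)) + 1, len(S))

    # Reduced layer: its k kernels are the leading right singular vectors.
    reduced = nn.Conv2d(conv.in_channels, k, conv.kernel_size,
                        stride=conv.stride, padding=conv.padding,
                        dilation=conv.dilation, bias=False)
    reduced.weight.data = torch.from_numpy(
        Vt[:k].reshape(k, *W.shape[1:]).astype(np.float32))

    # Convolution is linear in the kernel, so W_c ~ sum_j coef[c, j] * V_j
    # implies y_c ~ sum_j coef[c, j] * (V_j * x): a 1x1 conv restores the
    # original C_out feature maps from the k reduced responses.
    coef = (U[:, :k] * S[:k]).astype(np.float32)      # (C_out, k)
    project = nn.Conv2d(k, c_out, kernel_size=1, bias=conv.bias is not None)
    project.weight.data = torch.from_numpy(coef.reshape(c_out, k, 1, 1))
    if conv.bias is not None:
        project.bias.data = conv.bias.detach().clone()
    return nn.Sequential(reduced, project)
```

In this standalone form the 1x1 projection makes the pair a drop-in replacement for the original layer, e.g. `pca_compress_conv(nn.Conv2d(64, 128, 3, padding=1))`. In the progressive scheme the abstract describes, the projection would instead be absorbed into the next layer during fine-tuning, so that layer genuinely sees only k input channels and the reduction propagates toward the output layer.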

Bibliographic Details

  • Source
    Neurocomputing | 2021, Issue 14 | pp. 197-206 | 10 pages
  • Author Affiliations

    Renmin Univ China, Sch Stat, Ctr Appl Stat, Beijing 100872, Peoples R China;

    Peking Univ, Guanghua Sch Management, Beijing 100871, Peoples R China;

    Peking Univ, Guanghua Sch Management, Beijing 100871, Peoples R China;

    Peking Univ, Guanghua Sch Management, Beijing 100871, Peoples R China;

  • Indexed In: Science Citation Index (SCI); Engineering Index (EI)
  • Original Format: PDF
  • Language: English
  • Keywords

    CNN compression; model acceleration; progressive PCA; kernel-wise reduction

