Journal of Supercomputing

Entropy-based pruning method for convolutional neural networks

Abstract

Various compression approaches, including pruning techniques, have been developed to reduce the computational complexity of neural networks. Most pruning techniques determine the threshold for pruning weights or input features from a statistical analysis of the weight values after training is complete. Their compression performance is limited because they do not account for the contribution of the weights to the output during training. To address this problem, we propose an entropy-based pruning technique that determines the threshold by considering the average amount of information carried from the weights to the output during training. In the experiment section, we demonstrate and analyze our method on a convolutional neural network image classifier trained on the MNIST (Modified National Institute of Standards and Technology) image data. The experimental results show that, compared to a well-known pruning technique, overall compression performance improves by more than 28% and pruning speed improves by 14%.
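
The abstract does not spell out the algorithm, so the following is only a minimal NumPy sketch of one plausible reading of entropy-based pruning: estimate each filter's Shannon entropy from a histogram of its activations on a training batch, then prune filters whose entropy falls below the mean across filters, echoing the abstract's "average amount of information" threshold. The function names, the histogram estimator, and the toy data are illustrative assumptions, not the paper's exact method.

import numpy as np

def filter_entropy(activations, num_bins=32):
    # Shannon entropy (bits) of one filter's activation values,
    # estimated from a histogram; this estimator is an assumption
    # for illustration, not taken from the paper.
    hist, _ = np.histogram(activations, bins=num_bins)
    p = hist / hist.sum()
    p = p[p > 0]  # drop empty bins so log2 is defined
    return -np.sum(p * np.log2(p))

def entropy_prune_mask(per_filter_activations):
    # Keep a filter only if its entropy reaches the mean entropy
    # over all filters (the "average information" threshold).
    entropies = np.array([filter_entropy(a) for a in per_filter_activations])
    return entropies >= entropies.mean()

# Toy usage: 8 hypothetical filters, each with 1024 activation
# samples from one training batch; low-variance filters carry
# less information and are more likely to be pruned.
rng = np.random.default_rng(0)
acts = [rng.normal(scale=s, size=1024) for s in np.linspace(0.1, 2.0, 8)]
mask = entropy_prune_mask(acts)
print("kept filters:", np.nonzero(mask)[0])

In a full training loop the mask would presumably be recomputed as training proceeds and applied to the convolution kernels, so that the threshold tracks the weights' information contribution during training rather than being fixed after training, as in the statistical post-training approaches the abstract contrasts against.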