International Conference on Computational Performance Evaluation

An Incremental Pruning Strategy for Fast Training of CNN Models



Abstract

Deep neural networks have advanced significantly over the past few years and continue to grow larger and more capable. As a result, these over-parameterized networks are increasingly expensive to compute and to store. Pruning is a technique that reduces the parameter count, yielding faster inference, smaller model size, and lower computational cost. In this paper, we explore a new pruning strategy based on incremental pruning that requires less pre-training. Compared with previous related work, it achieves better accuracy in less computation time on the MNIST, CIFAR-10 and CIFAR-100 datasets, at the cost of a small decrease in compression rate. On these datasets, the proposed technique prunes 10x faster than conventional approaches at similar accuracy.
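The abstract does not specify the authors' exact procedure, so the following is only a minimal PyTorch sketch of the general incremental (iterative) magnitude-pruning idea it refers to: instead of pruning a large fraction of weights once after full pre-training, a small fraction of the remaining weights is pruned at intervals during training. The model, the pruning interval, and the 10% per-step amount are illustrative assumptions, not the paper's settings.

```python
# A minimal sketch of incremental magnitude pruning interleaved with training.
# Assumptions (not from the paper): a toy model, pruning every 2 epochs,
# and removing 10% of the still-unpruned weights at each pruning step.
import torch
import torch.nn as nn
import torch.nn.utils.prune as prune

model = nn.Sequential(
    nn.Conv2d(1, 16, 3, padding=1), nn.ReLU(),
    nn.Flatten(), nn.Linear(16 * 28 * 28, 10),
)
optimizer = torch.optim.SGD(model.parameters(), lr=0.01)
loss_fn = nn.CrossEntropyLoss()

# Collect the layers whose weights will be pruned.
prunable = [(m, "weight") for m in model.modules()
            if isinstance(m, (nn.Conv2d, nn.Linear))]

for epoch in range(10):
    # Placeholder batch standing in for one pass over the training data.
    x, y = torch.randn(8, 1, 28, 28), torch.randint(0, 10, (8,))
    optimizer.zero_grad()
    loss = loss_fn(model(x), y)
    loss.backward()
    optimizer.step()

    # Incremental step: every 2 epochs, mask out the 10% smallest-magnitude
    # weights among those still remaining. Repeated l1_unstructured calls
    # compound, so sparsity grows gradually rather than all at once.
    if epoch % 2 == 1:
        for module, name in prunable:
            prune.l1_unstructured(module, name=name, amount=0.1)
```

Pruning gradually in this way lets the network recover between steps, which is what allows the schedule to start earlier in training (i.e., with less pre-training) than one-shot pruning.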
