International Conference on Computational Science

Retrain or Not Retrain?-Efficient Pruning Methods of Deep CNN Networks


Abstract

Nowadays, convolutional neural networks (CNNs) play a major role in image processing tasks such as image classification, object detection, and semantic segmentation. CNN networks often have from several to hundreds of stacked layers, with weights occupying several megabytes of memory. One possible technique for reducing complexity and memory footprint is pruning. Pruning is the process of removing weights that connect neurons in two adjacent layers of the network. Finding a near-optimal solution with a specified, acceptable drop in accuracy becomes harder as the number of convolutional layers in the DL model grows. In this paper, several approaches, both with and without retraining, are described and compared.
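
To make the contrast between the two regimes concrete, below is a minimal sketch in PyTorch. The function names (prune_by_magnitude, retrain), the magnitude-based pruning criterion, and all parameters are illustrative assumptions, not the paper's specific algorithms.

    import torch
    import torch.nn as nn

    def prune_by_magnitude(model: nn.Module, sparsity: float) -> dict:
        # Zero out the smallest-magnitude weights in every convolutional
        # layer and return the binary masks, so the "retrain" variant can
        # keep pruned weights at zero during fine-tuning.
        masks = {}
        for name, module in model.named_modules():
            if isinstance(module, nn.Conv2d):
                w = module.weight.data
                k = int(w.numel() * sparsity)   # number of weights to drop
                if k == 0:
                    continue
                threshold = w.abs().flatten().kthvalue(k).values
                mask = (w.abs() > threshold).float()
                module.weight.data.mul_(mask)   # "no retrain": stop here
                masks[name] = mask
        return masks

    def retrain(model, masks, loader, epochs=1, lr=1e-3):
        # "Retrain" variant: fine-tune the pruned network, re-applying the
        # masks after every step so removed weights stay at zero.
        opt = torch.optim.SGD(model.parameters(), lr=lr)
        loss_fn = nn.CrossEntropyLoss()
        for _ in range(epochs):
            for x, y in loader:
                opt.zero_grad()
                loss_fn(model(x), y).backward()
                opt.step()
                for name, module in model.named_modules():
                    if name in masks:
                        module.weight.data.mul_(masks[name])

In the no-retrain setting, accuracy is measured directly after prune_by_magnitude; in the retrain setting, a few fine-tuning epochs typically recover part of the accuracy lost at higher sparsity, at the cost of extra training time.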