International conference on neural information processing

Comparing Adaptive and Non-Adaptive Connection Pruning With Pure Early Stopping



Abstract

Neural network pruning methods on the level of individual network parameters (e.g. connection weights) can improve generalization, as this empirical study shows. However, an open problem in the pruning methods known today (OBD, OBS, autoprune, epsiprune) is the selection of the number of parameters to be removed in each pruning step (the pruning strength). This work presents a pruning method, lprune, that automatically adapts the pruning strength to the evolution of weights and loss of generalization during training. The method requires no algorithm parameter adjustment by the user. Results of statistical comparisons are given, based on extensive experimentation with 14 different problems. The results indicate that training with pruning is often superior to training without pruning. Furthermore, lprune is often superior to autoprune (which is superior to OBD) on diagnosis tasks, unless severe pruning early in the training process is required.
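The abstract contrasts a fixed (non-adaptive) pruning strength with one that adapts to the training trajectory, but gives no algorithmic detail. The sketch below is only illustrative: `prune_smallest` and `adaptive_strength` are hypothetical names, and the adaptation rule shown (prune many connections while validation loss still improves, back off once it degrades) is an assumption in the spirit of the description, not the actual lprune criterion.

```python
def prune_smallest(weights, k):
    """Zero out the k surviving connections with the smallest magnitude
    (a simple stand-in for a saliency-based pruning criterion)."""
    w = list(weights)
    alive = [i for i, v in enumerate(w) if v != 0.0]
    for i in sorted(alive, key=lambda i: abs(w[i]))[:k]:
        w[i] = 0.0
    return w

def adaptive_strength(val_loss_history, k_max, k_min=1):
    """Hypothetical adaptive rule: choose a large pruning strength while
    generalization (validation loss) is still improving, a small one
    once it starts to degrade."""
    if len(val_loss_history) < 2 or val_loss_history[-1] < val_loss_history[-2]:
        return k_max  # still improving: prune aggressively
    return k_min      # degrading: prune gently
```

A non-adaptive method would call `prune_smallest` with a constant `k` at every pruning step; the adaptive variant recomputes `k` from the validation-loss history before each step.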
