IEEE Transactions on Neural Networks

A simple procedure for pruning back-propagation trained neural networks

Abstract

The sensitivity of the global error (cost) function to the inclusion or exclusion of each synapse in the artificial neural network is estimated. Shadow arrays are introduced that keep track of the incremental changes to the synaptic weights during a single pass of back-propagation learning. The synapses are then ordered by decreasing sensitivity, so that the network can be efficiently pruned by discarding the last items of the sorted list. Unlike previous approaches, this simple procedure requires no modification of the cost function, does not interfere with the learning process, and incurs negligible computational overhead.
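
To make the procedure concrete, here is a minimal NumPy sketch of the accumulate-and-rank idea the abstract describes: a shadow array accumulates the squared weight increments during training, sensitivities are estimated from the accumulated increments together with the initial and final weights, and the least sensitive synapses are discarded. The learning rate `eta`, the toy `grad_fn` standing in for true back-propagation, and `prune_fraction` are illustrative assumptions, not details taken from the paper.

```python
import numpy as np

# Sketch of shadow-array sensitivity pruning, shown for a single
# weight matrix W of a back-propagation trained network.
rng = np.random.default_rng(0)
eta = 0.1                        # learning rate (assumed value)
W = rng.normal(0.0, 0.5, (4, 3)) # synaptic weights, initial state
W_initial = W.copy()
shadow = np.zeros_like(W)        # shadow array: accumulates squared increments

def grad_fn(W):
    """Placeholder gradient of the global error w.r.t. W.
    In a real network this would come from back-propagation."""
    return W - 0.2               # toy cost with optimum at 0.2 everywhere

for _ in range(200):             # one accumulation per back-propagation pass
    dW = -eta * grad_fn(W)       # ordinary gradient-descent weight update
    shadow += dW ** 2            # track incremental changes as they happen
    W += dW                      # the learning process itself is untouched

# Sensitivity estimate S_ij ~ |(1/eta) * shadow_ij * w_f / (w_f - w_i)|:
# the approximate increase in the cost if synapse (i, j) is removed.
denom = W - W_initial
safe = np.where(np.abs(denom) > 1e-12, denom, 1e-12)  # guard unmoved weights
S = np.abs(shadow / eta * W / safe)

# Order synapses by sensitivity and discard the least sensitive tail.
prune_fraction = 0.25                      # assumed pruning budget
order = np.argsort(S, axis=None)           # flat indices, ascending sensitivity
n_prune = int(prune_fraction * S.size)
mask = np.ones(S.size, dtype=bool)
mask[order[:n_prune]] = False              # drop the lowest-sensitivity synapses
W_pruned = (W.ravel() * mask).reshape(W.shape)
print(f"pruned {n_prune} of {S.size} synapses")
```

The shadow array costs one extra accumulation per weight per update and leaves the training loop itself unchanged, which is consistent with the abstract's claims of negligible overhead and no interference with learning.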