Neural Computing & Applications

Batch gradient training method with smoothing ℓ0 regularization for feedforward neural networks


Abstract

This paper considers a batch gradient method with smoothing ℓ0 regularization (BGSL0) for training and pruning feedforward neural networks. We show why BGSL0 can produce sparse weights, which are crucial for pruning networks. We prove both the weak and strong convergence of BGSL0 under mild conditions, and we establish the monotone decrease of the error function during training. Two examples are given to substantiate the theoretical analysis and to show that BGSL0 achieves better sparsity than three typical regularization methods.
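The abstract does not spell out the smoothing function, so the sketch below is illustrative only: it trains a one-hidden-layer network by batch gradient descent on the squared error plus a smooth surrogate of the ℓ0 penalty. The surrogate 1 - exp(-w²/σ²), the XOR task, and all hyperparameters are assumptions for demonstration, not the paper's exact construction.

```python
# A minimal sketch of batch gradient descent with a smoothed l0 penalty,
# in the spirit of BGSL0. The surrogate 1 - exp(-w^2/sigma^2), the XOR
# task, and all hyperparameters are illustrative assumptions; the paper's
# exact smoothing function and settings may differ.
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def smoothed_l0(W, sigma):
    # Smooth approximation of ||W||_0: each term is ~0 at w = 0
    # and approaches 1 as |w| grows.
    return np.sum(1.0 - np.exp(-W**2 / sigma**2))

def smoothed_l0_grad(W, sigma):
    # d/dw [1 - exp(-w^2/sigma^2)] = (2w/sigma^2) exp(-w^2/sigma^2):
    # acts like weight decay near zero, vanishes for large |w|.
    return (2.0 * W / sigma**2) * np.exp(-W**2 / sigma**2)

# Toy batch: the XOR problem, augmented with a bias column of ones.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)
Xb = np.hstack([X, np.ones((4, 1))])

W1 = rng.normal(scale=0.5, size=(3, 8))   # input (+bias) -> hidden
W2 = rng.normal(scale=0.5, size=(9, 1))   # hidden (+bias) -> output

eta, lam, sigma = 0.5, 1e-3, 0.5
for epoch in range(20000):
    # Forward pass over the whole training set at once ("batch gradient").
    H = sigmoid(Xb @ W1)
    Hb = np.hstack([H, np.ones((len(H), 1))])
    out = sigmoid(Hb @ W2)

    # Backpropagate the batch gradient of the squared error.
    d_out = (out - y) * out * (1 - out)
    gW2 = Hb.T @ d_out
    d_hid = (d_out @ W2[:-1].T) * H * (1 - H)
    gW1 = Xb.T @ d_hid

    # Descend on error + lambda * smoothed-l0 penalty.
    W1 -= eta * (gW1 + lam * smoothed_l0_grad(W1, sigma))
    W2 -= eta * (gW2 + lam * smoothed_l0_grad(W2, sigma))

mse = float(np.mean((out - y) ** 2))
n_small = int(np.sum(np.abs(W1) < 1e-2) + np.sum(np.abs(W2) < 1e-2))
print(f"final MSE: {mse:.4f}, "
      f"penalty: {smoothed_l0(W1, sigma) + smoothed_l0(W2, sigma):.2f}, "
      f"near-zero weights: {n_small}")
```

Because the surrogate's gradient vanishes for large |w| but acts like weight decay near zero, weights the error term does not need are driven toward zero while the important weights are left mostly untouched. This is the sparsity mechanism the abstract refers to, and it is what makes post-training pruning straightforward.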
