
A pruning method for the recursive least squared algorithm.


Abstract

The recursive least squares (RLS) algorithm is an effective online training method for neural networks. However, its combination with weight decay and pruning has not been well studied. This paper elucidates how generalization ability can be improved by selecting an appropriate initial value of the error covariance matrix in the RLS algorithm. Moreover, we investigate how the pruning of neural networks can benefit from the final value of the error covariance matrix. Our study found that the RLS algorithm is implicitly a weight decay method, where the weight decay effect is controlled by the initial value of the error covariance matrix, and that the inverse of the error covariance matrix is approximately equal to the Hessian matrix of the trained network. We propose that neural networks first be trained by the RLS algorithm, and that some unimportant weights then be removed based on the approximate Hessian matrix. Simulation results show that our approach is an effective training and pruning method for neural networks.
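The train-then-prune recipe the abstract describes can be made concrete. The following is a minimal sketch, not the authors' code: it assumes a linear model so the plain RLS recursion applies directly (for a multilayer network, the input vector x would be replaced by the gradient of the network output with respect to the weights, as in extended-Kalman-filter training), and it uses an OBS-style saliency w_i^2 / (2 P_ii), which follows from the abstract's observation that inv(P) approximates the Hessian. The function names, the delta parameter, and the toy data are all illustrative.

# Minimal sketch: RLS training followed by covariance-based pruning.
# Assumes a linear model; variable names and parameters are illustrative.
import numpy as np

def rls_train(X, d, delta=1e2, lam=1.0):
    """Recursive least squares.

    delta: initial error covariance scale, P(0) = delta * I. Per the
           abstract, this choice acts as an implicit weight-decay
           strength (smaller delta -> stronger decay).
    lam:   forgetting factor.
    """
    n = X.shape[1]
    w = np.zeros(n)
    P = delta * np.eye(n)              # error covariance matrix
    for x, target in zip(X, d):
        Px = P @ x
        k = Px / (lam + x @ Px)        # gain vector
        w += k * (target - x @ w)      # weight update
        P = (P - np.outer(k, Px)) / lam
    return w, P

def prune(w, P, n_remove):
    """Remove the weights with smallest saliency.

    Since inv(P) approximates the Hessian after training, an
    OBS-style saliency is w_i**2 / (2 * P_ii).
    """
    saliency = w**2 / (2.0 * np.diag(P))
    idx = np.argsort(saliency)[:n_remove]
    w_pruned = w.copy()
    w_pruned[idx] = 0.0
    return w_pruned, idx

# Toy usage: fit y = 2*x0 - x2; inputs x1 and x3 are irrelevant,
# so their weights should receive the lowest saliency and be pruned.
rng = np.random.default_rng(0)
X = rng.normal(size=(500, 4))
d = 2.0 * X[:, 0] - X[:, 2] + 0.01 * rng.normal(size=500)
w, P = rls_train(X, d)
w_pruned, removed = prune(w, P, n_remove=2)
print("trained weights:", np.round(w, 3))
print("pruned indices:", removed)

In this sketch, the same matrix P serves double duty, exactly as the abstract suggests: its initial value sets the implicit regularization during training, and its final value supplies the curvature information used to rank weights for removal, so no separate Hessian computation is needed.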
