Cognitive Neurodynamics

Deterministic convergence of chaos injection-based gradient method for training feedforward neural networks


Abstract

It has been shown that, by adding a chaotic sequence to the weight update during the training of neural networks, the chaos injection-based gradient method (CIBGM) is superior to the standard backpropagation algorithm. This paper presents a theoretical convergence analysis of CIBGM for training feedforward neural networks, covering both batch learning and online learning. Under mild conditions, we prove weak convergence, i.e., the training error tends to a constant and the gradient of the error function tends to zero. Moreover, strong convergence of CIBGM is obtained under one additional condition. The theoretical results are substantiated by a simulation example.
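The training scheme the abstract describes, a standard gradient step plus a chaotic perturbation whose amplitude decays over the iterations (a typical requirement in such convergence analyses), can be sketched as below. This is a minimal illustration only: the network size, the logistic-map chaos source, the learning rate, and the 1/(1+m) amplitude schedule are assumptions for the sketch, not the paper's exact configuration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy batch-learning regression task: fit y = sin(x)
X = np.linspace(-np.pi, np.pi, 64).reshape(-1, 1)
Y = np.sin(X)

# One-hidden-layer feedforward network: 1 -> 8 -> 1, tanh hidden units
W1 = rng.normal(0, 0.5, (1, 8)); b1 = np.zeros(8)
W2 = rng.normal(0, 0.5, (8, 1)); b2 = np.zeros(1)

def forward(X):
    H = np.tanh(X @ W1 + b1)
    return H, H @ W2 + b2

def loss():
    _, out = forward(X)
    return 0.5 * np.mean((out - Y) ** 2)

eta = 0.05          # learning rate (illustrative)
c = 0.3             # chaotic state, seeded in (0, 1)
initial_loss = loss()

for m in range(2000):
    # Batch gradients via backpropagation
    H, out = forward(X)
    d_out = (out - Y) / len(X)
    gW2 = H.T @ d_out; gb2 = d_out.sum(0)
    d_hid = (d_out @ W2.T) * (1 - H ** 2)
    gW1 = X.T @ d_hid; gb1 = d_hid.sum(0)

    # Chaos injection: logistic map c <- 4c(1-c), centered, with an
    # amplitude beta_m -> 0 as m grows (assumed decay schedule)
    c = 4.0 * c * (1.0 - c)
    beta = 0.5 / (1 + m)
    kick = beta * (c - 0.5)

    # Gradient step plus chaotic perturbation on every weight
    W1 += -eta * gW1 + kick; b1 += -eta * gb1 + kick
    W2 += -eta * gW2 + kick; b2 += -eta * gb2 + kick

final_loss = loss()
```

With the decaying injection amplitude, the perturbation helps early exploration but vanishes asymptotically, which is consistent with the error tending to a constant and the gradient tending to zero.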


