Conference: Artificial neural nets and genetic algorithms

Rates of Learning in Gradient and Genetic Training of Recurrent Neural Networks



Abstract

In this paper, gradient descent and genetic techniques are used for on-line training of recurrent neural networks. A singular perturbation model for gradient learning of fixed points introduces the problem of the rate of learning, formulated as the relative speed of evolution of the network and the adaptation process, and motivates an analogous study when genetic training is used. Bounds on the rate of learning that guarantee convergence are obtained for both gradient and genetic training. Computer simulations confirm the theoretical predictions.
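To make the two-time-scale idea concrete, the following is a minimal, hypothetical sketch (not the paper's algorithm): a small recurrent network whose state relaxes under dx/dt = -x + tanh(Wx + b) while the weights and biases are adapted on-line by gradient descent toward a desired fixed point. The network size, target vector, step size dt, and learning rate eta are all assumed for illustration; the point illustrated is that eta must stay small relative to the speed of the state dynamics, echoing the bounded rate of learning discussed in the abstract.

```python
import numpy as np

# Illustrative sketch only: a recurrent network with fast state dynamics
# and slow on-line weight adaptation. All names and values are hypothetical.
rng = np.random.default_rng(0)
n = 3
W = 0.1 * rng.standard_normal((n, n))   # recurrent weights (slow variables)
b = np.zeros(n)                         # biases (slow variables)
x = 0.1 * rng.standard_normal(n)        # network state (fast variable)
x_target = np.array([0.5, -0.2, 0.3])   # desired fixed point

dt = 0.1     # step of the fast network dynamics
eta = 0.02   # learning rate; keeping eta small relative to 1/dt mimics the
             # bounded "rate of learning" needed for convergence

for step in range(20000):
    y = np.tanh(W @ x + b)
    # Fast dynamics: relax the state toward tanh(W x + b).
    x = x + dt * (-x + y)
    # Slow adaptation: gradient of 0.5*||y - x_target||^2 w.r.t. W and b,
    # holding the current state x fixed (the implicit dependence of the
    # fixed point on the weights is ignored in this crude on-line rule).
    delta = (y - x_target) * (1.0 - y ** 2)
    W -= eta * np.outer(delta, x)
    b -= eta * delta

print("reached state:", np.round(x, 3))
print("target point :", x_target)
```

In this sketch the separation of time scales is enforced simply by choosing eta much smaller than 1/dt; increasing eta toward the speed of the state dynamics is exactly the regime in which convergence can fail, which is the situation the paper's bounds are meant to characterize.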
