
An enhanced training algorithm for multilayer neural networks based on reference output of hidden layer



Abstract

In this paper, the authors propose a new training algorithm that relies not only on the training samples but also on the output of the hidden layer. Both the connecting weights and the outputs of the hidden layer are adjusted based on the Least Square Backpropagation (LSB) algorithm. A set of 'required' outputs of the hidden layer is added to the input sets through a feedback path to accelerate convergence. Numerical simulation results demonstrate that the algorithm outperforms the conventional BP, Quasi-Newton BFGS (an alternative to the conjugate gradient methods for fast optimisation), and LSB algorithms in terms of convergence speed and training error. The proposed method also avoids the drawback of the LSB algorithm, whose training error cannot be reduced further after three iterations.
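The abstract describes the method only at a high level. As an illustration of the general least-squares, layer-wise idea it builds on, the following minimal NumPy sketch shows an LSB-style iteration: alternately solving linear least-squares problems for the output weights, for a 'required' hidden-layer output, and for the input weights. It is a sketch under assumed details (toy sin(x) data, tanh units, the clipping and arctanh step), not the authors' exact algorithm; in particular it omits the paper's feedback path that adds the required hidden outputs to the input sets.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy regression problem (hypothetical): approximate y = sin(x) on [-pi, pi].
X = rng.uniform(-np.pi, np.pi, size=(200, 1))
Y = np.sin(X)

n_hidden = 20
W1 = rng.normal(scale=0.5, size=(X.shape[1] + 1, n_hidden))  # input -> hidden weights (bias row included)
W2 = rng.normal(scale=0.5, size=(n_hidden + 1, Y.shape[1]))  # hidden -> output weights (bias row included)

def add_bias(A):
    """Append a constant column so biases are absorbed into the weight matrices."""
    return np.hstack([A, np.ones((A.shape[0], 1))])

for it in range(10):
    H = np.tanh(add_bias(X) @ W1)  # actual hidden-layer output

    # 1) Output weights: ordinary linear least squares with H held fixed.
    W2, *_ = np.linalg.lstsq(add_bias(H), Y, rcond=None)

    # 2) A 'required' hidden-layer output: the least-squares solution of
    #    H_req @ W2[:-1] + W2[-1] ~= Y, clipped to stay inside tanh's range.
    H_req, *_ = np.linalg.lstsq(W2[:-1].T, (Y - W2[-1]).T, rcond=None)
    H_req = np.clip(H_req.T, -0.99, 0.99)

    # 3) Input weights: least squares against the pre-activations of the required output.
    W1, *_ = np.linalg.lstsq(add_bias(X), np.arctanh(H_req), rcond=None)

    mse = np.mean((add_bias(np.tanh(add_bias(X) @ W1)) @ W2 - Y) ** 2)
    print(f"iteration {it}: training mse = {mse:.5f}")
```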

Bibliographic details

  • Authors

    Li, Yan; Rad, A. B.; Wen, Peng;

  • Affiliation
  • Year: 1999
  • Total pages
  • Original format: PDF
  • Language
  • Chinese Library Classification
  • Date added: 2022-08-20 20:30:07
