IEEE Transactions on Signal Processing

Fast learning process of multilayer neural networks using recursive least squares method



Abstract

A new approach to training multilayer perceptron neural networks with a recursive least squares (RLS) type algorithm is proposed. The method iteratively minimizes the global sum of squared errors between the actual and desired output values. The network weights are updated on the arrival of each new training sample by solving a system of normal equations recursively. To determine the desired targets in the hidden layers, an analog of the back-propagation strategy used in conventional learning algorithms is developed, which allows the learning procedure to be applied to all layers. Simulation results on 4-bit parity-checker and multiplexer networks indicate a significant reduction in the total number of iterations compared with both the conventional and the accelerated back-propagation algorithms.
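The per-sample weight update described in the abstract is the standard RLS recursion: maintain an inverse-correlation matrix P, compute a gain vector from the new input, and correct the weights in proportion to the a priori error. The paper applies this per layer with back-propagated hidden-layer targets; as a minimal sketch, the recursion for a single linear unit (function and variable names are illustrative, not from the paper) looks like:

```python
import numpy as np

def rls_update(w, P, x, d, lam=1.0):
    """One recursive-least-squares step for a single linear unit.

    w   : current weight vector
    P   : inverse input-correlation matrix estimate
    x   : new input sample
    d   : desired (target) output for this sample
    lam : forgetting factor (1.0 = no forgetting)
    """
    Px = P @ x
    k = Px / (lam + x @ Px)            # gain vector
    e = d - w @ x                      # a priori output error
    w = w + k * e                      # recursive weight correction
    P = (P - np.outer(k, Px)) / lam    # update inverse correlation matrix
    return w, P

# Recover w_true = [2, -1] from streaming noiseless samples.
rng = np.random.default_rng(0)
w_true = np.array([2.0, -1.0])
w = np.zeros(2)
P = 1e3 * np.eye(2)                    # large initial P = weak prior on w
for _ in range(50):
    x = rng.standard_normal(2)
    w, P = rls_update(w, P, x, w_true @ x)
print(np.round(w, 3))                  # prints approximately [ 2. -1.]
```

Each step costs O(n^2) for n weights, with no matrix inversion, which is what makes the per-sample normal-equation solution practical; in the paper this recursion replaces the gradient step of back-propagation at every layer.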


