IEEE Transactions on Neural Networks

The layer-wise method and the backpropagation hybrid approach to learning a feedforward neural network



Abstract

Feedforward neural networks (FNNs) have been proposed to solve complex problems in pattern recognition, classification, and function approximation. Despite the general success of learning methods for FNNs, such as the backpropagation (BP) algorithm, second-order optimization algorithms, and layer-wise learning algorithms, several drawbacks remain to be overcome. In particular, two major drawbacks are convergence to local minima and long learning time. We propose an efficient learning method for an FNN that combines the BP strategy with layer-by-layer optimization. More precisely, we construct the layer-wise optimization method using the Taylor series expansion of the nonlinear operators describing an FNN, and propose to update the weights of each layer by a BP-based Kaczmarz iterative procedure. The experimental results show that the new learning algorithm is stable, reduces the learning time, and improves generalization in comparison with other well-known methods.
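The abstract's key ingredient is the Kaczmarz iterative procedure, which solves a linear system row by row via successive projections; in the paper it is applied to the linearized (Taylor-expanded) per-layer weight equations. The sketch below shows only the classic cyclic Kaczmarz iteration on a generic consistent system `A x = b` (the function name `kaczmarz` and the toy data are illustrative assumptions, not the paper's exact BP-based variant):

```python
import numpy as np

def kaczmarz(A, b, n_sweeps=200):
    """Cyclic Kaczmarz iteration for a consistent linear system A x = b.

    Each step projects the current iterate onto the hyperplane defined by
    one row of A; sweeping over all rows repeatedly converges to a solution.
    Illustrative sketch of the building block, not the paper's full method.
    """
    m, n = A.shape
    x = np.zeros(n)
    for _ in range(n_sweeps):
        for i in range(m):
            a = A[i]
            # Project x onto the hyperplane {x : a . x = b[i]}
            x += (b[i] - a @ x) / (a @ a) * a
    return x

# Toy consistent system standing in for one layer's linearized weight update.
rng = np.random.default_rng(0)
A = rng.standard_normal((20, 5))
x_true = rng.standard_normal(5)
b = A @ x_true
x_hat = kaczmarz(A, b)
print(np.linalg.norm(A @ x_hat - b))  # residual shrinks toward zero
```

In the hybrid scheme described above, the right-hand side `b` would come from BP-computed error signals for the layer, so each layer's weights are refined by such projections rather than by a plain gradient step.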
