Canadian Conference on Electrical and Computer Engineering, 1993

Modified backpropagation algorithms for training the multilayer feedforward neural networks with hard-limiting neurons



Abstract

This paper introduces modified backpropagation algorithms for training multilayer feedforward neural networks with hard-limiting neurons. Transforming neuron activation functions are used in all the hidden layers; these are modified continuous sigmoidal functions with an adaptive steepness factor. In the training process, this steepness factor varies from a small positive value to infinity as the sum-square error decreases. Thus, a multilayer feedforward neural network can be trained so that the resulting architecture is composed only of hard-limiting neurons. The learning algorithm is similar to the conventional backpropagation algorithm; only the derivatives of the hidden neuron activation functions are modified. Extensive numerical simulations are presented to show the feasibility of the proposed algorithm. In addition, the numerical properties of the proposed algorithm are discussed and comparisons with other algorithms are made.
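
The abstract describes hidden-layer sigmoids whose steepness factor grows as the sum-square error shrinks, so that the trained network can ultimately be realized with hard-limiting neurons. The following is a minimal, illustrative sketch of that idea in Python/NumPy, not the authors' implementation: the network size, learning rate, toy task, and the schedule beta = beta0 / MSE are assumptions made for illustration and are not taken from the paper.

import numpy as np

# Minimal sketch (not the authors' code): a one-hidden-layer network whose hidden
# units use a sigmoid with an adaptive steepness factor beta. As the sum-square
# error falls, beta is raised, so the hidden activations approach hard limiters.
# The schedule beta = beta0 / MSE and all hyperparameters are illustrative assumptions.

def sigmoid(x, beta):
    z = np.clip(beta * x, -60.0, 60.0)      # clip to keep exp() from overflowing
    return 1.0 / (1.0 + np.exp(-z))

def sigmoid_deriv(x, beta):
    s = sigmoid(x, beta)
    return beta * s * (1.0 - s)             # derivative carries the steepness factor

rng = np.random.default_rng(0)
X = rng.uniform(-1.0, 1.0, size=(200, 2))
y = (X[:, 0] * X[:, 1] > 0).astype(float).reshape(-1, 1)   # XOR-like toy task

n_hidden, lr, beta0 = 8, 0.5, 1.0
W1 = rng.normal(0.0, 0.5, (2, n_hidden)); b1 = np.zeros(n_hidden)
W2 = rng.normal(0.0, 0.5, (n_hidden, 1)); b2 = np.zeros(1)
beta = beta0

for epoch in range(2000):
    # Forward pass: adaptive-steepness sigmoid in the hidden layer only.
    h_in = X @ W1 + b1
    h = sigmoid(h_in, beta)
    out = sigmoid(h @ W2 + b2, 1.0)         # output layer keeps an ordinary sigmoid

    err = out - y
    sse = float(np.sum(err ** 2))

    # Backward pass: conventional backprop; only the hidden derivative is modified.
    d_out = err * out * (1.0 - out)
    d_h = (d_out @ W2.T) * sigmoid_deriv(h_in, beta)

    W2 -= lr * h.T @ d_out / len(X); b2 -= lr * d_out.mean(axis=0)
    W1 -= lr * X.T @ d_h / len(X);   b1 -= lr * d_h.mean(axis=0)

    # Raise the steepness as the error shrinks (illustrative schedule, capped at 1000).
    beta = beta0 / max(sse / len(X), 1e-3)

# After training, replace the hidden sigmoids with hard-limiting units.
hard_h = (X @ W1 + b1 > 0).astype(float)
hard_out = sigmoid(hard_h @ W2 + b2, 1.0) > 0.5
print("accuracy with hard-limiting hidden units:", float((hard_out == (y > 0.5)).mean()))

The point this sketch tries to mirror from the abstract: the forward and backward passes are ordinary backpropagation, and only the hidden-layer activation derivative carries the steepness factor; once that factor is large, the hidden sigmoids can be swapped for hard limiters with little change to the network's outputs.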
