Electrical and Computer Engineering, 1993. Canadian Conference on

Modified backpropagation algorithms for training the multilayer feedforward neural networks with hard-limiting neurons



Abstract

This paper introduces modified backpropagation algorithms for training multilayer feedforward neural networks with hard-limiting neurons. In all the hidden layers, the neuron activation functions are modified continuous sigmoidal functions with an adaptive steepness factor. During training, this steepness factor grows from a small positive value toward infinity as the sum of squared errors decreases, so a multilayer feedforward neural network can be trained whose resulting architecture is composed solely of hard-limiting neurons. The learning algorithm is similar to conventional backpropagation; only the derivatives of the hidden-neuron activation functions are modified. Extensive numerical simulations are presented to show the feasibility of the proposed algorithm. In addition, the numerical properties of the proposed algorithm are discussed and comparisons with other algorithms are made.
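The idea in the abstract can be sketched in code: train with a steep sigmoid whose steepness factor β grows as the sum of squared errors (SSE) falls, then deploy the network with pure hard limiters. The sketch below is an illustration under assumptions, not the paper's implementation: the XOR dataset, the network sizes, the learning rate, the cap on β, and the inverse-SSE schedule for β are all choices made here for the example; the paper's exact schedule and experiments may differ.

```python
import numpy as np

def sigmoid(x, beta):
    # Modified sigmoid with steepness factor beta; as beta -> infinity
    # it approaches the hard-limiting (unit-step) activation.
    return 1.0 / (1.0 + np.exp(-beta * x))

def hard_limit(x):
    # Hard-limiting neuron: unit step.
    return (x >= 0).astype(float)

# Illustrative dataset (XOR); not from the paper.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
T = np.array([[0], [1], [1], [0]], dtype=float)

rng = np.random.default_rng(0)
W1 = rng.normal(0.0, 1.0, (2, 4)); b1 = np.zeros(4)   # hidden layer
W2 = rng.normal(0.0, 1.0, (4, 1)); b2 = np.zeros(1)   # output layer

beta, lr = 0.5, 0.5          # small initial steepness, fixed step size
sse_first = None
for epoch in range(20000):
    # Forward pass: steep sigmoid in the hidden layer.
    h = sigmoid(X @ W1 + b1, beta)
    o = sigmoid(h @ W2 + b2, 1.0)        # plain sigmoid at the output
    err = o - T
    sse = float((err ** 2).sum())
    if sse_first is None:
        sse_first = sse
    # Backward pass: standard backprop; only the hidden-layer
    # derivative is modified (it carries the extra factor beta).
    d_o = err * o * (1.0 - o)
    d_h = (d_o @ W2.T) * beta * h * (1.0 - h)
    W2 -= lr * h.T @ d_o; b2 -= lr * d_o.sum(0)
    W1 -= lr * X.T @ d_h; b1 -= lr * d_h.sum(0)
    # Adaptive steepness: grow beta as the SSE shrinks (capped here
    # for numerical stability; the paper lets it tend to infinity).
    beta = min(20.0, 0.5 / max(sse, 0.025))

# After training, run the network with pure hard-limiting neurons.
h_hard = hard_limit(X @ W1 + b1)
pred = hard_limit(h_hard @ W2 + b2)
```

The key point is the last line of the loop: because the trained sigmoids have been driven toward saturation, replacing them with unit steps at deployment changes the network's outputs very little, which is what makes an all-hard-limiting final architecture possible.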


