This paper introduces modified backpropagation algorithms for training multilayer feedforward neural networks with hard-limiting neurons. Transforming neuron activation functions, which are modified continuous sigmoidal functions with an adaptive steepness factor, are used in all the hidden layers. During training, this steepness factor varies from a small positive value toward infinity as the sum squared error decreases. Thus, a multilayer feedforward neural network can be trained so that the resulting architecture is composed only of hard-limiting neurons. The learning algorithm is similar to the conventional backpropagation algorithm; only the derivatives of the hidden-neuron activation functions are modified. Extensive numerical simulations are presented to show the feasibility of the proposed algorithm. In addition, the numerical properties of the proposed algorithm are discussed, and comparisons with other algorithms are made.
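A minimal sketch in Python may help fix the idea of an adaptive-steepness sigmoid converging to a hard limiter. The schedule `steepness`, which grows the factor as the error shrinks, is a hypothetical illustration and not the paper's actual update rule; the activation and its derivative follow the standard sigmoid form the abstract alludes to.

```python
import numpy as np

def sigmoid(x, beta):
    """Continuous sigmoidal activation with steepness factor beta.
    As beta -> infinity, this approaches a hard-limiting (step) neuron."""
    return 1.0 / (1.0 + np.exp(-beta * x))

def sigmoid_deriv(x, beta):
    """Derivative used in the modified backpropagation step for hidden neurons."""
    s = sigmoid(x, beta)
    return beta * s * (1.0 - s)

def steepness(sse, beta0=1.0, eps=1e-8):
    """Hypothetical schedule: steepness grows as the sum squared error shrinks."""
    return beta0 / (sse + eps)

x = np.array([-0.5, 0.0, 0.5])
print(sigmoid(x, beta=1.0))    # smooth response early in training
print(sigmoid(x, beta=50.0))   # nearly hard-limiting late in training
```

In this reading, backpropagation proceeds as usual, except that the hidden-layer gradients are computed with `sigmoid_deriv` at the current steepness; once the error is small enough, the smooth units are effectively indistinguishable from hard limiters.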