In this paper, a new adaptive learning-rate algorithm for training a single-hidden-layer neural network is proposed. The adaptive learning rate is derived by differentiating the linear and nonlinear errors together with the functional constraints: a weight-decay term at the hidden layer and a penalty term at the output layer. Because the learning-rate calculation involves the first-order derivatives of the linear and nonlinear errors and the second-order derivatives of the functional constraints, the proposed algorithm converges quickly. Simulation results demonstrate the advantages of the proposed algorithm.
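The abstract does not give the paper's derivation, so the following is only a minimal sketch of the general idea: a single-hidden-layer network with weight decay on the hidden layer and a penalty on the output layer, where the step size is recomputed at every iteration from the current error and gradient rather than fixed in advance. The network sizes, the decay coefficient, and the backtracking (Armijo) step-size rule used here are all illustrative assumptions, not the paper's actual update.

```python
import numpy as np

rng = np.random.default_rng(0)

# Tiny synthetic regression task (assumed data, for illustration only).
X = rng.standard_normal((64, 3))
y = np.tanh(X @ rng.standard_normal((3, 1)))

# Single hidden layer: input -> tanh hidden (nonlinear) -> linear output.
W1 = rng.standard_normal((3, 8)) * 0.5
W2 = rng.standard_normal((8, 1)) * 0.5
decay = 1e-3  # coefficient for the weight-decay / penalty terms (assumed value)

def forward(W1, W2):
    H = np.tanh(X @ W1)   # hidden-layer activations (nonlinear error path)
    return H, H @ W2      # linear output layer

def loss(W1, W2):
    _, out = forward(W1, W2)
    err = out - y
    # MSE plus a weight-decay term on the hidden layer and a penalty term
    # on the output layer, mirroring the constraints named in the abstract.
    return 0.5 * np.mean(err ** 2) + 0.5 * decay * (np.sum(W1 ** 2) + np.sum(W2 ** 2))

losses = [loss(W1, W2)]
for _ in range(200):
    H, out = forward(W1, W2)
    err = (out - y) / len(X)
    # First-order derivatives of the regularized loss w.r.t. each layer.
    g2 = H.T @ err + decay * W2
    g1 = X.T @ ((err @ W2.T) * (1.0 - H ** 2)) + decay * W1
    gnorm2 = np.sum(g1 ** 2) + np.sum(g2 ** 2)
    # Adaptive learning rate per iteration: backtracking (Armijo) line search
    # along the negative gradient -- a stand-in for the paper's closed-form rate.
    eta, l0 = 1.0, losses[-1]
    while eta > 1e-8 and loss(W1 - eta * g1, W2 - eta * g2) > l0 - 0.5 * eta * gnorm2:
        eta *= 0.5
    W1 -= eta * g1
    W2 -= eta * g2
    losses.append(loss(W1, W2))

print(float(losses[0]), float(losses[-1]))
```

Because the step size is chosen fresh at each iteration to guarantee sufficient decrease, the training loss falls monotonically without any hand-tuned fixed learning rate.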