This paper proposes IBPLSM, a novel fast training algorithm for multilayer perceptrons that integrates the back-propagation (BP) algorithm with the least-squares method (LSM). BP is currently the most widely used learning algorithm for artificial neural networks. With proper selection of the feed-forward network architecture and parameters, it can approximate most problems with high accuracy and good generalization. However, BP suffers from two well-known drawbacks in many applications: slow convergence and a tendency to become trapped in local minima. The least-squares method, by contrast, is an efficient method for training multilayer perceptrons because of its fast convergence, but it is not suitable for training large-scale neural networks. To overcome the slow convergence of BP and to extend the least-squares method to large-scale networks, we propose an integrated training algorithm. The new algorithm has been implemented and tested on several real-world problems. The experimental results show that the proposed method converges faster than the original BP algorithm and several enhanced learning algorithms, and can escape local minima to reach a lower convergence error. The proposed method also works well for training large-scale neural networks.
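Since the abstract does not give implementation details of IBPLSM, the following is only a minimal sketch of one plausible BP/least-squares hybrid for a single-hidden-layer perceptron: gradient-descent (BP) updates for the hidden-layer weights combined with a closed-form linear least-squares solve for the output-layer weights. The function name, architecture, and hyperparameters are illustrative assumptions, not the authors' actual algorithm.

```python
import numpy as np

def train_hybrid(X, y, n_hidden=16, epochs=200, lr=0.05, seed=0):
    """Hybrid sketch: BP on hidden weights, least squares on output weights.

    X: (N, n_in) inputs, y: (N,) regression targets. Hypothetical example,
    not the IBPLSM algorithm from the paper.
    """
    rng = np.random.default_rng(seed)
    W1 = rng.normal(0.0, 0.5, (X.shape[1], n_hidden))  # input -> hidden weights
    b1 = np.zeros(n_hidden)
    for _ in range(epochs):
        # Forward pass through a sigmoid hidden layer.
        H = 1.0 / (1.0 + np.exp(-(X @ W1 + b1)))
        # Least-squares solve for the output weights given current hidden features;
        # this is the "fast convergence" LSM component, kept small by restricting
        # it to the final linear layer.
        Hb = np.hstack([H, np.ones((H.shape[0], 1))])   # append bias column
        w_out, *_ = np.linalg.lstsq(Hb, y, rcond=None)
        W2, b2 = w_out[:-1], w_out[-1]
        # Back-propagate the squared error to the hidden layer only (BP component).
        err = Hb @ w_out - y                            # (N,) residuals
        dH = np.outer(err, W2) * H * (1.0 - H)          # chain rule through sigmoid
        W1 -= lr * X.T @ dH / len(X)
        b1 -= lr * dH.mean(axis=0)
    return W1, b1, W2, b2

# Toy usage: fit y = sin(x) on random samples.
X = np.random.default_rng(1).uniform(-3, 3, (200, 1))
y = np.sin(X[:, 0])
W1, b1, W2, b2 = train_hybrid(X, y)
```

The design intuition, under these assumptions, is that solving the output layer exactly at every step gives the least-squares component its speed, while confining the solve to the final layer keeps its cost manageable as the network grows, which is one way to reconcile LSM with large-scale networks as the abstract suggests.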