The authors examined how a limited number of training patterns can be used to improve the generalization ability of a backpropagation neural network (BPNN). First, they explain the shortcoming of the conventional learning technique, in which only the mean summed squared error (MSSE) is observed as the stopping criterion for BPNN learning. The proposed well-balanced learning (WBL) technique observes not only the MSSE but also the individual summed squared error of each training pattern. The BPNN is thereby trained with a smaller deviation among the patterns' errors than in conventional learning, which improves the network's generalization ability. The effectiveness of WBL is demonstrated through evaluation experiments.
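The core idea can be illustrated with a minimal sketch of such a stopping rule. The abstract does not give the exact WBL criterion, so the function below is a hypothetical interpretation: training stops only when both the mean SSE and every individual pattern's SSE fall below their tolerances, whereas the conventional criterion checks the mean alone. All names and thresholds here are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def per_pattern_sse(targets, outputs):
    """Summed squared error of each training pattern (one row per pattern)."""
    return np.sum((targets - outputs) ** 2, axis=1)

def conventional_stop(targets, outputs, mean_tol):
    """Conventional criterion: stop when the mean SSE alone is small enough."""
    return per_pattern_sse(targets, outputs).mean() <= mean_tol

def well_balanced_stop(targets, outputs, mean_tol, pattern_tol):
    """Hypothetical WBL-style criterion: stop only when the mean SSE AND the
    worst individual pattern's SSE are both below tolerance, keeping the
    deviation of errors across patterns small."""
    sse = per_pattern_sse(targets, outputs)
    return sse.mean() <= mean_tol and sse.max() <= pattern_tol

# Two patterns: one fitted well, one fitted poorly.
targets = np.array([[1.0], [0.0]])
outputs = np.array([[1.0], [0.3]])  # per-pattern SSE: [0.00, 0.09]

# The conventional rule stops (mean SSE = 0.045 <= 0.05), even though one
# pattern still has a large error; the WBL-style rule keeps training.
print(conventional_stop(targets, outputs, mean_tol=0.05))                       # True
print(well_balanced_stop(targets, outputs, mean_tol=0.05, pattern_tol=0.05))    # False
```

The contrast in the last two lines is the point of the abstract: averaging can hide a pattern that the network has not yet learned, while monitoring each pattern's error forces a more even fit.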