In an earlier paper, we introduced an ensemble model called ESGNN (ensemble self-generating neural networks), which can be used to reduce the error in classification and chaotic time series prediction. Although this model achieves higher accuracy than a single SGNN, its computational cost increases in proportion to the number of SGNNs in the ensemble. In this paper, we propose a new pruning algorithm for SGNN that reduces the memory requirement for classification. We compared ESGNN with the nearest neighbor classifier on a collection of machine-learning benchmarks. Experimental results show that our method reduced the memory requirement and improved accuracy over that of the nearest neighbor classifier.
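For context, the nearest neighbor baseline used in the comparison can be sketched as a minimal 1-NN classifier in Python. The function name and interface below are illustrative, not the paper's code; the sketch also makes clear why memory is the issue: plain 1-NN must store every training example, which is the cost the proposed pruning targets.

```python
import numpy as np

def nearest_neighbor_predict(X_train, y_train, X_test):
    """Assign each test point the label of its closest training point
    under Euclidean distance (1-NN). Stores the full training set,
    so memory grows linearly with the number of training examples."""
    X_train = np.asarray(X_train, dtype=float)
    X_test = np.asarray(X_test, dtype=float)
    y_train = np.asarray(y_train)
    preds = []
    for x in X_test:
        dists = np.linalg.norm(X_train - x, axis=1)  # distance to every stored example
        preds.append(y_train[np.argmin(dists)])      # label of the nearest one
    return np.array(preds)
```

A pruned classifier would keep only a subset of the stored examples while trying to preserve this decision rule's accuracy.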