SampleBoost is an intelligent multi-class boosting algorithm that combines an error parameter with stratified sampling during training iterations to accommodate multi-class data sets and to avoid problems associated with traditional boosting methods. In this paper we investigate the choice of the error parameter along with the preferred sampling sizes for our method. Experimental results show that lower values of the error parameter can degrade performance, while larger values lead to satisfactory results. The parameter choice has a noticeable effect at low sampling sizes and a smaller effect on data sets with few classes. Varying the sampling size across training iterations yields the lowest variance in error rates. The results also show the improved performance of SampleBoost compared to other methods.
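The abstract does not spell out SampleBoost's sampling step, but a minimal sketch of weight-proportional stratified sampling inside a boosting round might look like the following (the function name `stratified_sample`, the per-class sample size, and all data are hypothetical illustrations, not the paper's actual procedure):

```python
import numpy as np

def stratified_sample(y, weights, per_class, rng):
    """Draw a weight-proportional stratified sample: `per_class` indices
    from every class, so each class is represented in every round."""
    idx = []
    for c in np.unique(y):
        members = np.flatnonzero(y == c)          # indices of this class
        p = weights[members] / weights[members].sum()
        idx.extend(rng.choice(members, size=per_class, replace=True, p=p))
    return np.asarray(idx)

# Toy illustration: 3 classes, 10 points each, uniform boosting weights.
rng = np.random.default_rng(0)
y = np.repeat([0, 1, 2], 10)
w = np.full(30, 1.0 / 30)
sample = stratified_sample(y, w, per_class=5, rng=rng)
counts = np.bincount(y[sample], minlength=3)      # 5 draws per class
```

In a boosting loop, the base learner for each round would be fit on `sample` and the weights `w` updated afterwards, so that hard examples gain influence while every class stays represented.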