We propose a boosting algorithm based on AdaBoost that uses real-valued weak hypotheses, which return the confidences of their classifications as real numbers, together with an approximated upper bound on the training error. The approximated upper bound is derived with Bernoulli's inequality, and it allows us to analytically compute a confidence value that guarantees a reduction of the original upper bound. Experimental results on the Reuters-21578 data set and an Amazon review data set show that our boosting algorithm with the perceptron achieves better accuracy than Support Vector Machines, decision-stump-based boosting algorithms, and a single perceptron.
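For orientation, the family of methods the abstract refers to can be sketched as a generic confidence-rated boosting loop in the style of Schapire and Singer, where the confidence value alpha is computed analytically from the weighted correlation between the weak hypothesis and the labels. This is only an illustrative sketch under that standard framework; the paper's specific Bernoulli-inequality bound and its analytic confidence value are not reproduced here, and the names `confidence_rated_adaboost` and `weak_learner` are placeholders.

```python
import numpy as np

def confidence_rated_adaboost(X, y, weak_learner, n_rounds=50):
    """Generic confidence-rated AdaBoost sketch (Schapire & Singer style).

    y must be in {-1, +1}; weak_learner(X, y, w) returns a hypothesis h
    with h(X) in [-1, 1]. The confidence value alpha is computed
    analytically from the weighted correlation r = sum_i w_i * y_i * h(x_i).
    """
    n = len(y)
    w = np.full(n, 1.0 / n)           # uniform initial example weights
    ensemble = []                     # list of (alpha, hypothesis) pairs
    for _ in range(n_rounds):
        h = weak_learner(X, y, w)
        pred = np.clip(h(X), -1.0, 1.0)
        r = np.sum(w * y * pred)      # weighted correlation in [-1, 1]
        r = np.clip(r, -1.0 + 1e-12, 1.0 - 1e-12)  # avoid division by zero
        alpha = 0.5 * np.log((1.0 + r) / (1.0 - r))
        ensemble.append((alpha, h))
        w *= np.exp(-alpha * y * pred)  # exponential reweighting
        w /= w.sum()                    # renormalize to a distribution
    def strong(Xq):
        # Final classifier: sign of the confidence-weighted vote
        return np.sign(sum(a * np.clip(h(Xq), -1.0, 1.0) for a, h in ensemble))
    return strong
```

In this sketch any weak learner returning values in [-1, 1] can be plugged in (the paper's experiments use a perceptron); the key point shared with the paper is that the weight alpha is obtained in closed form rather than by line search.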