Randomized Ensemble Methods for Classification Trees

Abstract

We propose two methods of constructing ensembles of classifiers. One method directly injects randomness into classification tree algorithms by choosing a split randomly at each node, with probabilities proportional to the measure of goodness for a split; we combine this method with a stopping rule that uses permutation of the outputs. The other method perturbs the outputs and constructs a classifier using the perturbed data. In both methods, the final classifier is given by an unweighted vote of the individual classifiers. These methods are compared with bagging, AdaBoost, and random forests on thirteen commonly used data sets. The results show that our methods perform better than bagging and, on average, comparably to AdaBoost and random forests. Additional computation shows that our perturbation method could improve its performance by perturbing both the inputs and the outputs and by combining a sufficiently large number of trees. Plots of strength and correlation show an interesting relationship. We also explore combining sampling of subsets of the training set with our proposed methods; the results of a few trials show that the performance of the proposed methods could be improved in this way.
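The core ingredients described in the abstract (probability-proportional split selection, output perturbation, and an unweighted majority vote) can be sketched in a few lines of code. The sketch below is only an illustration of those ideas, not the authors' implementation; the function names choose_split, perturb_outputs, and ensemble_predict, the flip_prob value, and the assumption that each fitted tree exposes a predict method are placeholders introduced here.

```python
import numpy as np

rng = np.random.default_rng(0)

def choose_split(goodness):
    """Randomized split selection: pick one of the candidate splits at a node
    with probability proportional to its goodness measure (e.g., impurity
    reduction), instead of always taking the single best split."""
    g = np.asarray(goodness, dtype=float)
    return rng.choice(len(g), p=g / g.sum())

def perturb_outputs(y, classes, flip_prob=0.1):
    """Output perturbation: randomly relabel a fraction of the training
    outputs before fitting each tree (flip_prob is an illustrative value)."""
    y = np.array(y)
    mask = rng.random(len(y)) < flip_prob
    y[mask] = rng.choice(classes, size=int(mask.sum()))
    return y

def ensemble_predict(trees, x):
    """Final classifier: an unweighted majority vote over the individual
    trees, assuming each tree's predict(x) returns one class label for the
    single sample x."""
    votes = np.array([tree.predict(x) for tree in trees])
    values, counts = np.unique(votes, return_counts=True)
    return values[np.argmax(counts)]
```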
