Information Fusion

Ensembles of decision trees based on imprecise probabilities and uncertainty measures

Abstract

In this paper, we present an experimental comparison of different strategies for combining decision trees built by means of imprecise probabilities and uncertainty measures. It has been proven that combining or fusing the information obtained from several classifiers can improve the final classification process. We use the previously developed Bagging and Boosting schemes, along with a new one that varies the root node according to the information-based rank of each feature with respect to the class variable. In addition, we apply two different approaches to deal with missing data and continuous variables. A set of performance tests on the methods analyzed here shows that, with the appropriate approach, the Boosting scheme constitutes an excellent way to combine this type of decision tree. Notably, it provides good results even when compared with a standard Random Forest classifier, a successful procedure that is very commonly used in the literature.
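To make the shape of such a comparison concrete, the following is a minimal sketch in Python with scikit-learn. It is only illustrative: standard entropy-based decision trees stand in for the paper's credal trees built from imprecise probabilities and uncertainty measures, and the dataset, number of estimators, and cross-validation setup are assumptions of this sketch, not details taken from the paper.

from sklearn.datasets import load_breast_cancer
from sklearn.ensemble import AdaBoostClassifier, BaggingClassifier, RandomForestClassifier
from sklearn.model_selection import cross_val_score
from sklearn.tree import DecisionTreeClassifier

# Illustrative dataset; the paper evaluates on its own benchmark collection.
X, y = load_breast_cancer(return_X_y=True)

# Base learner: an entropy-based tree standing in for the credal trees of the paper.
base_tree = DecisionTreeClassifier(criterion="entropy", random_state=0)

ensembles = {
    # Bagging: each tree is fitted on a bootstrap resample of the training data.
    "Bagging": BaggingClassifier(estimator=base_tree, n_estimators=100, random_state=0),
    # Boosting: trees are fitted sequentially, reweighting misclassified instances.
    "AdaBoost": AdaBoostClassifier(estimator=base_tree, n_estimators=100, random_state=0),
    # Standard Random Forest, the reference classifier mentioned in the abstract.
    "Random Forest": RandomForestClassifier(n_estimators=100, random_state=0),
}

# 10-fold cross-validated accuracy for each ensemble
# (the 'estimator' keyword assumes scikit-learn >= 1.2).
for name, clf in ensembles.items():
    scores = cross_val_score(clf, X, y, cv=10)
    print(f"{name:15s} mean accuracy = {scores.mean():.3f}")

Reproducing the paper's actual base classifier would additionally require a custom split criterion (for example, maximum entropy over the credal set induced by an imprecise Dirichlet model), which scikit-learn does not provide out of the box.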