An Experimental Comparison of Three Methods for Constructing Ensembles of Decision Trees: Bagging, Boosting, and Randomization

Abstract

Bagging and boosting are methods that generate a diverse ensemble of classifiers by manipulating the training data given to a "base" learning algorithm. Breiman has pointed out that they rely for their effectiveness on the instability of the base learning algorithm. An alternative approach to generating an ensemble is to randomize the internal decisions made by the base algorithm. This general approach has been studied previously by Ali and Pazzani and by Dietterich and Kong.
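The distinction between manipulating the training data (bagging, boosting) and randomizing the learner's internal decisions can be illustrated with a short sketch. This is not the paper's original setup, which used C4.5 release 8; the scikit-learn estimators, the example dataset, the ensemble size of 50, and the use of per-split feature subsampling as a stand-in for randomized split selection are all illustrative assumptions (parameter names assume scikit-learn 1.2 or later).

```python
# Minimal sketch (assumption: scikit-learn stand-ins, not the paper's C4.5 setup).
# Bagging and boosting perturb the training data fed to the base tree learner;
# randomization instead perturbs the tree learner's internal split decisions.
from sklearn.datasets import load_breast_cancer
from sklearn.ensemble import AdaBoostClassifier, BaggingClassifier
from sklearn.model_selection import cross_val_score
from sklearn.tree import DecisionTreeClassifier

X, y = load_breast_cancer(return_X_y=True)  # illustrative dataset choice
base = DecisionTreeClassifier()

ensembles = {
    # Bagging: each tree is trained on a bootstrap resample of the data.
    "bagging": BaggingClassifier(estimator=base, n_estimators=50),
    # Boosting: each tree sees the data reweighted toward earlier mistakes.
    "boosting": AdaBoostClassifier(estimator=base, n_estimators=50),
    # Randomization: every tree sees the full, unweighted data, but each
    # split is chosen from a random subset of features, so diversity comes
    # from randomizing the algorithm's internal decisions.
    "randomization": BaggingClassifier(
        estimator=DecisionTreeClassifier(max_features="sqrt"),
        n_estimators=50,
        bootstrap=False,
    ),
}

for name, model in ensembles.items():
    scores = cross_val_score(model, X, y, cv=5)
    print(f"{name:>13}: mean accuracy {scores.mean():.3f}")
```

Note that the paper's randomized C4.5 draws uniformly at random among the 20 best candidate splits at each internal node; the per-split feature subsampling above is only a rough stand-in for that idea.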
