Pattern Recognition Letters

RotBoost: A technique for combining Rotation Forest and AdaBoost



Abstract

This paper presents RotBoost, a novel ensemble classifier generation technique constructed by combining Rotation Forest and AdaBoost. Experiments conducted on 36 real-world data sets from the UCI repository, with a classification tree adopted as the base learning algorithm, demonstrate that RotBoost produces ensemble classifiers with significantly lower prediction error than either Rotation Forest or AdaBoost more often than the reverse. RotBoost is also found to perform much better than Bagging and MultiBoost. Employing the bias and variance decomposition of error to gain more insight into the considered classification methods shows that RotBoost simultaneously reduces both the bias and variance terms of a single tree, and that its reduction is much greater than that achieved by the other ensemble methods, which leads RotBoost to perform best among the considered classification procedures. Furthermore, RotBoost has a potential advantage over AdaBoost in being well suited to parallel execution.
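The combination the abstract describes (an outer Rotation Forest-style loop that builds a rotation of the feature space, with an inner AdaBoost run on the rotated data) can be sketched as follows. This is a minimal illustration, not the authors' implementation: it assumes PCA-based rotations on random feature subsets and uses weighted decision stumps as base learners in place of the full classification trees used in the paper.

```python
import numpy as np

def build_rotation(X, rng, n_subsets=2):
    """Rotation Forest-style rotation: split features into random subsets,
    run PCA on each subset, and assemble the principal axes into a
    block-diagonal rotation matrix."""
    n_features = X.shape[1]
    subsets = np.array_split(rng.permutation(n_features), n_subsets)
    R = np.zeros((n_features, n_features))
    for idx in subsets:
        Xs = X[:, idx] - X[:, idx].mean(axis=0)
        cov = np.atleast_2d(np.cov(Xs, rowvar=False))
        _, vecs = np.linalg.eigh(cov)       # eigenvectors = principal axes
        R[np.ix_(idx, idx)] = vecs
    return R

def train_stump(X, y, w):
    """Exhaustively pick the weighted-error-minimizing decision stump
    (feature, threshold, polarity); labels are in {-1, +1}."""
    best = (0, 0.0, 1, np.inf)
    for f in range(X.shape[1]):
        for t in np.unique(X[:, f]):
            for pol in (1, -1):
                pred = np.where(pol * (X[:, f] - t) >= 0, 1, -1)
                err = w[pred != y].sum()
                if err < best[3]:
                    best = (f, t, pol, err)
    return best

def stump_predict(stump, X):
    f, t, pol, _ = stump
    return np.where(pol * (X[:, f] - t) >= 0, 1, -1)

def rotboost_fit(X, y, S=3, T=5, seed=0):
    """RotBoost sketch: S outer rotations, each followed by a
    T-round AdaBoost run on the rotated training data."""
    rng = np.random.default_rng(seed)
    ensemble = []
    for _ in range(S):
        R = build_rotation(X, rng)
        Xr = X @ R
        w = np.full(len(y), 1.0 / len(y))   # AdaBoost sample weights
        members = []
        for _ in range(T):
            stump = train_stump(Xr, y, w)
            pred = stump_predict(stump, Xr)
            err = max(w[pred != y].sum(), 1e-10)
            if err >= 0.5:                  # too weak: reset weights, retry
                w = np.full(len(y), 1.0 / len(y))
                continue
            alpha = 0.5 * np.log((1 - err) / err)
            w *= np.exp(-alpha * y * pred)  # reweight toward mistakes
            w /= w.sum()
            members.append((alpha, stump))
        ensemble.append((R, members))
    return ensemble

def rotboost_predict(ensemble, X):
    """Weighted vote over all stumps of all AdaBoost runs."""
    score = np.zeros(len(X))
    for R, members in ensemble:
        Xr = X @ R
        for alpha, stump in members:
            score += alpha * stump_predict(stump, Xr)
    return np.where(score >= 0, 1, -1)
```

Because each of the S rotation-plus-AdaBoost runs is independent of the others, the outer loop can be executed in parallel, which is the advantage over plain AdaBoost noted in the abstract.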

