International Conference on Intelligent Data Analysis
An Empirical Comparison of Pruning Methods for Ensemble Classifiers
Abstract

Many researchers have shown that ensemble methods such as Boosting and Bagging improve classification accuracy. Boosting and Bagging perform well with unstable learning algorithms such as neural networks or decision trees. Pruning decision tree classifiers is intended to make trees simpler and more comprehensible, and to avoid over-fitting. However, it is known that pruning the individual classifiers of an ensemble does not necessarily lead to improved generalisation. Examples of individual tree pruning methods are Minimum Error Pruning (MEP), Error-based Pruning (EBP), Reduced-Error Pruning (REP), Critical Value Pruning (CVP) and Cost-Complexity Pruning (CCP). In this paper, we report the results of applying Boosting and Bagging with these five pruning methods to eleven datasets.
