Incorporating Bagging into Boosting
Abstract

In classification learning, the classifier group (ensemble) learning approach yields better predictive accuracy. A group of classifiers can be formed by repeatedly applying a single base learning algorithm; the members of the group then make the final classification by voting. Boosting and Bagging are two popular methods of this kind, and both decrease the error rate of decision-tree learning. Boosting is more accurate than Bagging, but the former is more variable than the latter. In this paper, our aim is to review the state of the art on group learning techniques in the framework of imbalanced data sets. We propose a new group learning algorithm called Incorporating Bagging into Boosting (IB), which creates a number of subgroups by incorporating Bagging into Boosting. Experimental results on natural domains show that, on average, IB is more stable than either Bagging or Boosting. These characteristics make IB a good choice among group learning techniques.
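The abstract does not spell out the IB procedure, but one plausible reading of "creates a number of subgroups by incorporating Bagging into Boosting" is: draw several bootstrap samples, boost a committee on each, and combine the committees by majority vote. The sketch below illustrates that idea with a minimal AdaBoost over decision stumps; all function names (`adaboost`, `ib_fit_predict`, etc.) and parameter choices are illustrative assumptions, not the authors' exact algorithm.

```python
import numpy as np

def train_stump(X, y, w):
    # Exhaustive search for the best weighted threshold stump.
    n, d = X.shape
    best = (0, 0.0, 1, np.inf)  # (feature, threshold, polarity, weighted error)
    for j in range(d):
        for t in np.unique(X[:, j]):
            for pol in (1, -1):
                pred = np.where(pol * (X[:, j] - t) >= 0, 1, -1)
                err = w[pred != y].sum()
                if err < best[3]:
                    best = (j, t, pol, err)
    return best

def adaboost(X, y, rounds=10):
    # Standard AdaBoost loop: reweight examples, collect weighted stumps.
    n = len(y)
    w = np.full(n, 1.0 / n)
    committee = []
    for _ in range(rounds):
        j, t, pol, err = train_stump(X, y, w)
        err = max(err, 1e-10)                    # avoid division by zero
        alpha = 0.5 * np.log((1 - err) / err)
        pred = np.where(pol * (X[:, j] - t) >= 0, 1, -1)
        w *= np.exp(-alpha * y * pred)           # up-weight mistakes
        w /= w.sum()
        committee.append((alpha, j, t, pol))
    return committee

def committee_predict(committee, X):
    # Weighted vote of the stumps inside one boosted committee.
    score = np.zeros(len(X))
    for alpha, j, t, pol in committee:
        score += alpha * np.where(pol * (X[:, j] - t) >= 0, 1, -1)
    return np.where(score >= 0, 1, -1)

def ib_fit_predict(X, y, X_test, n_bags=5, rounds=10, seed=0):
    # Bagging wrapped around Boosting: one boosted committee per
    # bootstrap sample (subgroup), combined by unweighted majority vote.
    rng = np.random.default_rng(seed)
    votes = np.zeros(len(X_test))
    for _ in range(n_bags):
        idx = rng.integers(0, len(y), len(y))    # bootstrap resample
        committee = adaboost(X[idx], y[idx], rounds)
        votes += committee_predict(committee, X_test)
    return np.where(votes >= 0, 1, -1)
```

Averaging over bootstrap replicates is what Bagging contributes (variance reduction), which is consistent with the abstract's claim that IB is more stable than Boosting alone.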
