Artificial Intelligence Review: An International Science and Engineering Journal

Combining bagging, boosting, rotation forest and random subspace methods



Abstract

Bagging, boosting, rotation forest and random subspace methods are well-known re-sampling ensemble methods that generate and combine a diversity of learners using the same learning algorithm for the base classifiers. Boosting and rotation forest algorithms are considered stronger than bagging and random subspace methods on noise-free data, but there are strong empirical indications that bagging and random subspace methods are much more robust than boosting and rotation forest in noisy settings. For this reason, in this work we build an ensemble of bagging, boosting, rotation forest and random subspace ensembles, with six sub-classifiers in each, and use a voting methodology for the final prediction. On standard benchmark datasets we compared the proposed technique with simple bagging, boosting, rotation forest and random subspace ensembles of 25 sub-classifiers, as well as with other well-known combining methods; the proposed technique had better accuracy in most cases.

