
Ensembles of Multi-Objective Decision Trees


Abstract

Ensemble methods are able to improve the predictive performance of many base classifiers. Until now, they have been applied to classifiers that predict a single target attribute. Given the non-trivial interactions that may occur among the different targets in multi-objective prediction tasks, it is unclear whether ensemble methods also improve performance in this setting. In this paper, we consider two ensemble learning techniques, bagging and random forests, and apply them to multi-objective decision trees (MODTs), which are decision trees that predict multiple target attributes at once. We empirically investigate the performance of ensembles of MODTs. Our most important conclusions are: (1) ensembles of MODTs yield better predictive performance than single MODTs, and (2) ensembles of MODTs perform as well as, or better than, ensembles of single-objective decision trees, i.e., a separate ensemble for each target. Moreover, ensembles of MODTs have smaller model size and are faster to learn than ensembles of single-objective decision trees.
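The bagging scheme described in the abstract can be sketched in a few lines. This is a minimal illustration, not the authors' implementation: the multi-objective decision tree is stood in for by a hypothetical toy base learner (`MeanPerTarget`) that predicts each target's training mean, and `bagging_multi_target` shows the generic bagging loop — fit each base model on a bootstrap sample, then average the per-target predictions across the ensemble.

```python
import numpy as np

class MeanPerTarget:
    """Toy multi-target base learner: predicts each target's training mean.
    Stands in for a multi-objective decision tree (MODT), which is assumed
    rather than implemented here."""
    def fit(self, X, y):
        self.means_ = y.mean(axis=0)          # one mean per target column
        return self

    def predict(self, X):
        return np.tile(self.means_, (len(X), 1))

def bagging_multi_target(X, y, make_base, n_estimators=10, rng=None):
    """Bagging for multi-target prediction: fit each base learner on a
    bootstrap sample, then average their per-target predictions."""
    rng = np.random.default_rng(rng)
    n = len(X)
    models = []
    for _ in range(n_estimators):
        idx = rng.integers(0, n, size=n)      # bootstrap sample (with replacement)
        models.append(make_base().fit(X[idx], y[idx]))

    def predict(X_new):
        preds = np.stack([m.predict(X_new) for m in models])
        return preds.mean(axis=0)             # average over the ensemble
    return predict

# Usage: one ensemble predicts two targets at once.
X = np.arange(20, dtype=float).reshape(-1, 1)
y = np.column_stack([X[:, 0] * 2.0, X[:, 0] + 1.0])   # two related targets
predict = bagging_multi_target(X, y, MeanPerTarget, n_estimators=25, rng=0)
print(predict(X[:3]).shape)   # (3, 2): one prediction per target
```

The same loop with real multi-output trees (e.g. scikit-learn's `DecisionTreeRegressor`, which accepts a 2-D `y`) gives bagged MODTs; a random forest additionally randomizes the features considered at each split.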

