
Shared Ensemble Learning Using Multi-trees


Abstract

Decision tree learning is a machine learning technique that generates models that are both accurate and comprehensible. Accuracy can be improved further by ensemble methods, which combine the predictions of a set of different trees; however, generating such an ensemble requires a large amount of resources. In this paper, we introduce a new ensemble method that minimises resource usage by sharing the common parts of the ensemble's components. For this purpose, we learn a decision multi-tree instead of a decision tree. We call this new approach shared ensembles. The use of a multi-tree yields an exponential number of hypotheses to be combined, which provides better results than boosting/bagging. We performed several experiments showing that the technique obtains accurate models and improves resource usage with respect to classical ensemble methods.
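The core idea of the abstract can be illustrated with a toy sketch: instead of committing to one split per internal node, a multi-tree keeps several alternative splits, and everything above a node is shared by all of them. Averaging the class distributions of the alternatives at prediction time implicitly combines the exponentially many single trees embedded in the structure. The class names and the recursive-averaging combination rule below are illustrative assumptions, not the paper's actual algorithm or data structures.

```python
# Toy "shared ensemble" sketch (hypothetical, not the paper's code).
# A multi-tree node holds several alternative splits; the path above
# the node is stored once and shared by every alternative.

class Leaf:
    def __init__(self, label):
        self.label = label

    def predict_proba(self, x, classes):
        # A leaf votes with full weight for its own class.
        return {c: 1.0 if c == self.label else 0.0 for c in classes}


class MultiNode:
    # `splits` is a list of (feature_index, threshold, left, right);
    # child subtrees may themselves be MultiNodes or shared Leaves.
    def __init__(self, splits):
        self.splits = splits

    def predict_proba(self, x, classes):
        # Averaging over the alternative splits combines, implicitly,
        # the exponentially many single trees the multi-tree encodes.
        agg = {c: 0.0 for c in classes}
        for feat, thr, left, right in self.splits:
            branch = left if x[feat] <= thr else right
            proba = branch.predict_proba(x, classes)
            for c in classes:
                agg[c] += proba[c] / len(self.splits)
        return agg


def predict(tree, x, classes):
    proba = tree.predict_proba(x, classes)
    return max(classes, key=lambda c: proba[c])


# Two alternative root splits that share the same leaf subtrees:
pos, neg = Leaf("pos"), Leaf("neg")
root = MultiNode([(0, 0.5, neg, pos), (1, 0.3, neg, pos)])
print(predict(root, [0.9, 0.8], ["pos", "neg"]))  # both alternatives agree: pos
```

The resource saving comes from storage and training effort: a classical ensemble of k trees repeats the upper levels k times, whereas the multi-tree stores each shared prefix once and only branches where the alternatives actually differ.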
