IEEE International Conference on Data Mining Workshops

An Extension of Gradient Boosted Decision Tree Incorporating Statistical Tests


Abstract

Gradient boosted decision tree (GBDT) is one of the most popular methods for many machine learning and data mining tasks. In this paper, we address the overfitting problem of existing GBDT algorithms and introduce the idea of statistical significance into the tree construction algorithm to mitigate it. We propose a new algorithm, W-GBDT, which incorporates Welch's t-test as a tree splitting criterion, building on the existing XGBoost algorithm. Our experimental results on real-world datasets show that the proposed method significantly outperforms the original XGBoost in both generalization ability and robustness to the number of iterations.
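The abstract's core idea can be illustrated with a minimal sketch: use Welch's t-test (the unequal-variance two-sample t-test) to check whether the gradient statistics of a candidate split's two children differ significantly, and reject splits that do not pass. This is a hypothetical illustration of the general technique, not the authors' W-GBDT implementation; the function names, the gradient inputs, and the large-sample critical value 1.96 are all assumptions.

```python
import math
from statistics import mean, variance

def welch_t(a, b):
    """Welch's t statistic for two samples with unequal variances:
    t = (mean(a) - mean(b)) / sqrt(var(a)/len(a) + var(b)/len(b))."""
    se = math.sqrt(variance(a) / len(a) + variance(b) / len(b))
    return (mean(a) - mean(b)) / se

def split_is_significant(grad_left, grad_right, t_crit=1.96):
    """Hypothetical splitting filter: accept a candidate split only when
    the mean gradients of the two children differ significantly, i.e.
    |t| exceeds a critical value (1.96 approximates the two-sided 5%
    level for large samples). W-GBDT's exact integration with the
    XGBoost split-gain computation may differ."""
    return abs(welch_t(grad_left, grad_right)) > t_crit

# Clearly separated gradient distributions -> split accepted.
print(split_is_significant([1.0, 1.2, 0.9, 1.1, 1.0],
                           [-1.0, -0.9, -1.1, -1.05, -0.95]))
# Overlapping, same-mean gradients -> split rejected as noise.
print(split_is_significant([0.1, -0.1, 0.05, -0.05, 0.0],
                           [0.08, -0.08, 0.02, -0.02, 0.0]))
```

Filtering out splits whose child statistics are indistinguishable from noise is one plausible way such a criterion reduces overfitting as the number of boosting iterations grows.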
