14th Australian Joint Conference on Artificial Intelligence, Dec 10-14, 2001, Adelaide, Australia

Gradient Descent Style Leveraging of Decision Trees and Stumps for Misclassification Cost Performance



Abstract

This paper investigates the use, for the task of classifier learning in the presence of misclassification costs, of some gradient descent style leveraging approaches to classifier learning: Schapire and Singer's AdaBoost.MH and AdaBoost.MR, and Collins et al.'s multi-class logistic regression method, along with some modifications that retain the gradient descent style approach. Decision trees and stumps are used as the underlying base classifiers, learned from modified versions of Quinlan's C4.5. Experiments are reported comparing the performance, in terms of average cost, of the modified methods to that of the originals, and to the previously suggested "Cost Boosting" methods of Ting and Zheng and of Ting, which also use decision trees based upon modified C4.5 code but do not have an interpretation in the gradient descent framework. While some of the modifications improve upon the originals in terms of cost performance for both trees and stumps, the comparison with tree-based Cost Boosting suggests that, of the methods first experimented with here, it is one based on stumps that has the most promise.
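The abstract turns on two ingredients that are easy to make concrete: decision stumps as weak base classifiers combined by a boosting (gradient descent style leveraging) procedure, and evaluation by average misclassification cost under a cost matrix rather than by plain error rate. The sketch below is a minimal illustration of those two ideas only; it is a generic binary discrete AdaBoost over one-level stumps with an explicit cost matrix, not the paper's AdaBoost.MH/AdaBoost.MR variants, cost-sensitive modifications, or modified C4.5 base learners, and the function names and toy data are assumptions made for the example.

# Minimal sketch: discrete AdaBoost over decision stumps, evaluated by
# average misclassification cost. Illustrative only; not the paper's methods.
import numpy as np

def fit_stump(X, y, w):
    """Pick the weighted-error-minimising one-level stump; y in {-1, +1}."""
    n, d = X.shape
    best = (np.inf, 0, 0.0, 1)          # (error, feature, threshold, polarity)
    for j in range(d):
        for thr in np.unique(X[:, j]):
            for pol in (1, -1):
                pred = np.where(pol * (X[:, j] - thr) >= 0, 1, -1)
                err = np.sum(w[pred != y])
                if err < best[0]:
                    best = (err, j, thr, pol)
    return best

def adaboost_stumps(X, y, rounds=20):
    """Discrete AdaBoost with stumps as base classifiers (binary labels)."""
    n = len(y)
    w = np.full(n, 1.0 / n)
    ensemble = []
    for _ in range(rounds):
        err, j, thr, pol = fit_stump(X, y, w)
        err = max(err, 1e-12)                       # guard against zero error
        alpha = 0.5 * np.log((1.0 - err) / err)
        pred = np.where(pol * (X[:, j] - thr) >= 0, 1, -1)
        w = w * np.exp(-alpha * y * pred)           # reweight hard examples
        w = w / w.sum()
        ensemble.append((alpha, j, thr, pol))
    return ensemble

def predict(ensemble, X):
    score = sum(a * np.where(p * (X[:, j] - t) >= 0, 1, -1)
                for a, j, t, p in ensemble)
    return np.where(score >= 0, 1, -1)

def average_cost(y_true, y_pred, cost):
    """cost[i][j] = cost of predicting class j when the true class is i."""
    idx = {-1: 0, 1: 1}
    return float(np.mean([cost[idx[t]][idx[p]] for t, p in zip(y_true, y_pred)]))

# Toy usage (hypothetical data): a false negative costs five times a false positive.
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 2))
y = np.where(X[:, 0] + X[:, 1] > 0, 1, -1)
model = adaboost_stumps(X, y, rounds=10)
cost_matrix = [[0.0, 1.0],    # true class -1: correct / false positive
               [5.0, 0.0]]    # true class +1: false negative / correct
print("average cost:", average_cost(y, predict(model, X), cost_matrix))

The average-cost quantity computed at the end is the kind of measure the paper uses to compare the modified boosting methods with the originals and with the Cost Boosting methods of Ting and Zheng and of Ting.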
