Annals of Mathematics and Artificial Intelligence

Improvement of boosting algorithm by modifying the weighting rule



Abstract

AdaBoost is a method for improving the classification accuracy of a given learning algorithm by combining hypotheses created by that algorithm. One drawback of AdaBoost is that its performance degrades when the training data include noisy or exceptional examples, collectively called hard examples: AdaBoost assigns excessively high weights to such examples. In this research, we introduce thresholds into the weighting rule of AdaBoost in order to prevent weights from growing too large. During the learning process, we compare the upper bound on the classification error of our method with that of AdaBoost, and we set the thresholds so that the upper bound of our method is superior to that of AdaBoost. Our method shows better performance than AdaBoost.
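The modified weighting rule can be sketched as follows. This is a minimal illustration under stated assumptions, not the paper's exact scheme: the paper derives its thresholds from a comparison of error upper bounds during learning, whereas here a fixed cap `cap` on the normalized example weights stands in as a hypothetical threshold.

```python
import numpy as np

def train_stump(X, y, w):
    """Exhaustively search for the best weighted decision stump."""
    best = (np.inf, 0, 0.0, 1)  # (weighted error, feature, threshold, polarity)
    for j in range(X.shape[1]):
        for t in np.unique(X[:, j]):
            for pol in (1, -1):
                pred = np.where(pol * (X[:, j] - t) >= 0, 1, -1)
                err = w[pred != y].sum()
                if err < best[0]:
                    best = (err, j, t, pol)
    return best

def adaboost_capped(X, y, rounds=10, cap=0.2):
    """AdaBoost with labels in {-1, +1}, where each example's weight is
    clipped at `cap` after the usual exponential update.  The cap is a
    hypothetical stand-in for the thresholds the paper introduces."""
    n = len(y)
    w = np.full(n, 1.0 / n)
    ensemble = []
    for _ in range(rounds):
        err, j, t, pol = train_stump(X, y, w)
        if err >= 0.5:          # no better-than-random stump left
            break
        err = max(err, 1e-10)   # avoid division by zero on perfect stumps
        alpha = 0.5 * np.log((1 - err) / err)
        pred = np.where(pol * (X[:, j] - t) >= 0, 1, -1)
        # Standard AdaBoost exponential re-weighting ...
        w = w * np.exp(-alpha * y * pred)
        w /= w.sum()
        # ... followed by the cap: a hard example cannot hold more than
        # `cap` of the total weight.  (Renormalization can push weights
        # slightly back above the cap; this sketch ignores that.)
        w = np.minimum(w, cap)
        w /= w.sum()
        ensemble.append((alpha, j, t, pol))
    return ensemble

def predict(ensemble, X):
    """Sign of the weighted vote of all stumps."""
    score = np.zeros(len(X))
    for alpha, j, t, pol in ensemble:
        score += alpha * np.where(pol * (X[:, j] - t) >= 0, 1, -1)
    return np.sign(score)
```

Without the cap, a mislabeled example's weight can grow geometrically across rounds until later stumps fit the noise; the clip bounds each example's influence on the distribution, which is the intuition behind thresholding the weighting rule.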
