
Incorporating Posterior Estimates into AdaBoost


Abstract

Although boosting methods [9, 23] for creating compositions of weak hypotheses are among the best methods of machine learning developed so far [4], they are known to degrade performance in the case of noisy data and overlapping classes. In this paper we consider binary classification and propose a reduction of the overlapping classes' classification problem to a deterministic problem. We also devise a new upper generalization bound for weighted averages of weak hypotheses, which uses posterior estimates for training objects and is based on the proposed reduction. If we are given accurate posterior estimates, this bound is lower than the existing bound by Schapire et al. [22]. We design an AdaBoost-like algorithm which optimizes the proposed generalization bound and show that, when incorporated with good posterior estimates, it performs better than standard AdaBoost on real-world data sets.
