Conference: Machine Learning and Data Mining in Pattern Recognition

ODDboost: Incorporating Posterior Estimates into AdaBoost



Abstract

Boosting methods, while among the best classification methods developed so far, are known to degrade in performance on noisy data and overlapping classes. In this paper we propose a new upper generalization bound for weighted averages of hypotheses, which uses posterior estimates for the training objects and is based on reducing the binary classification problem with overlapping classes to a deterministic one. Given accurate posterior estimates, the proposed bound is tighter than the existing bound of Schapire et al. [25]. We design an AdaBoost-like algorithm that optimizes the proposed generalization bound and show that, when combined with good posterior estimates, it outperforms standard AdaBoost on real-world data sets.
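The abstract does not spell out ODDboost's update rules, so the following is only an illustrative sketch of the general idea it describes: an AdaBoost-style ensemble of decision stumps in which posterior estimates for the training objects influence the example weights. Here the posteriors enter through the initial weights via the confidence term |2η(x) − 1|; this particular weighting rule, and all function names, are assumptions for illustration, not the authors' actual algorithm.

```python
import numpy as np

def stump_predict(X, feat, thresh, sign):
    """Decision stump: predict sign if X[:, feat] > thresh, else -sign."""
    return sign * np.where(X[:, feat] > thresh, 1.0, -1.0)

def best_stump(X, y, w):
    """Exhaustive search for the stump with minimum weighted error."""
    best = (0, 0.0, 1, np.inf)
    for feat in range(X.shape[1]):
        for thresh in np.unique(X[:, feat]):
            for sign in (1, -1):
                err = np.sum(w * (stump_predict(X, feat, thresh, sign) != y))
                if err < best[3]:
                    best = (feat, thresh, sign, err)
    return best

def posterior_weighted_adaboost(X, y, posteriors, T=10):
    """AdaBoost variant whose initial example weights are scaled by the
    posterior confidence |2*eta(x) - 1| (a hypothetical rule; ODDboost's
    actual bound-driven update is defined in the paper)."""
    conf = np.abs(2.0 * posteriors - 1.0) + 1e-12
    w = conf / conf.sum()
    ensemble = []
    for _ in range(T):
        feat, thresh, sign, err = best_stump(X, y, w)
        err = max(err / w.sum(), 1e-12)
        if err >= 0.5:          # weak learner no better than chance: stop
            break
        alpha = 0.5 * np.log((1.0 - err) / err)
        pred = stump_predict(X, feat, thresh, sign)
        w = w * np.exp(-alpha * y * pred)   # standard exponential reweighting
        w = w / w.sum()
        ensemble.append((alpha, feat, thresh, sign))
    return ensemble

def ensemble_predict(ensemble, X):
    """Sign of the weighted average of the base hypotheses."""
    F = np.zeros(len(X))
    for alpha, feat, thresh, sign in ensemble:
        F += alpha * stump_predict(X, feat, thresh, sign)
    return np.sign(F)
```

With accurate posteriors, examples whose posterior is close to 0.5 (i.e., objects in the class-overlap region) start with small weight, which is one plausible way posterior information can reduce the sensitivity to overlapping classes that the abstract highlights.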
