International Conference on Algorithmic Learning Theory (ALT 2007); October 1-4, 2007; Sendai, Japan

Multiclass Boosting Algorithms for Shrinkage Estimators of Class Probability



Abstract

Our purpose is to estimate conditional probabilities of output labels in multiclass classification problems. Adaboost provides highly accurate classifiers and has the potential to estimate conditional probabilities. However, the conditional probabilities estimated by Adaboost tend to overfit the training samples. We propose loss functions for boosting that provide shrinkage estimators. The regularization effect is realized by shrinking the estimated probabilities toward the uniform distribution. Numerical experiments indicate that boosting algorithms based on the proposed loss functions give significantly better conditional-probability estimates than existing boosting algorithms.
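
The abstract describes regularization as shrinkage of estimated class probabilities toward the uniform distribution. Below is a minimal sketch of that idea only, assuming a softmax link from boosted class scores and an illustrative shrinkage weight lam; the function name and weight are not from the paper, which instead builds the shrinkage effect into the boosting loss functions themselves.

    import numpy as np

    def shrink_class_probabilities(scores, lam=0.1):
        """Illustrative shrinkage of class-probability estimates.

        scores: (n_samples, n_classes) array of boosted class scores.
        lam: illustrative shrinkage weight toward the uniform distribution
             (not a parameter taken from the paper).
        """
        # Softmax link from boosted class scores to raw probability estimates.
        z = scores - scores.max(axis=1, keepdims=True)  # subtract max for numerical stability
        p = np.exp(z)
        p /= p.sum(axis=1, keepdims=True)
        # Shrink each estimate toward the uniform distribution over K classes.
        K = scores.shape[1]
        return (1.0 - lam) * p + lam / K

    # Example: three samples, four classes.
    scores = np.array([[2.0, 0.5, -1.0, 0.0],
                       [0.1, 0.2, 0.3, 0.4],
                       [-2.0, 3.0, 0.0, 1.0]])
    print(shrink_class_probabilities(scores, lam=0.2))

With lam = 0 the raw softmax estimates are returned unchanged; as lam grows toward 1, every estimate is pulled toward 1/K, which is the overfitting-reducing effect the abstract attributes to the proposed loss functions.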
