Annual Conference on Computational Learning Theory

Multiclass Learning, Boosting, and Error-Correcting Codes



Abstract

We focus on methods to solve multiclass learning problems by using only simple and efficient binary learners. We investigate the approach of Dietterich and Bakiri based on error-correcting codes (which we call ECC). We distill error correlation as one of the key parameters influencing the performance of the ECC approach, and prove upper and lower bounds on the training error of the final hypothesis in terms of the error correlation between the various binary hypotheses. Boosting is a powerful and well-studied learning technique that appears to annul the disadvantages of error correlation by cleverly weighting training examples and hypotheses. An interesting algorithm called AdaBoost.OC combines boosting with the ECC approach, yielding a method that has the performance advantages of boosting while relying only on simple binary weak learners. We propose a variant of this algorithm, which we call AdaBoost.ECC, that, by using a different weighting of the votes of the weak hypotheses, is able to improve on the performance of AdaBoost.OC, both theoretically and experimentally, and in addition is arguably a more direct reduction of multiclass learning to binary learning problems than previous multiclass boosting algorithms.
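The ECC reduction described above can be made concrete with a short output-coding sketch. This is a minimal illustration under assumed choices, not the paper's algorithm: the 5-bit code matrix, the decision-stump weak learner (scikit-learn's DecisionTreeClassifier), and the synthetic data are all placeholders. Each column of the code matrix defines a binary relabeling of the training set; prediction picks the class whose codeword is nearest in Hamming distance to the concatenated binary outputs.

```python
import numpy as np
from sklearn.tree import DecisionTreeClassifier


def train_ecoc(X, y, code_matrix,
               make_learner=lambda: DecisionTreeClassifier(max_depth=1)):
    """Train one binary learner per column (bit) of the code matrix.

    code_matrix[k, j] in {0, 1} is bit j of class k's codeword.
    """
    learners = []
    for j in range(code_matrix.shape[1]):
        # Relabel every example by bit j of its class's codeword,
        # turning the multiclass problem into a binary one.
        binary_labels = code_matrix[y, j]
        learners.append(make_learner().fit(X, binary_labels))
    return learners


def predict_ecoc(X, learners, code_matrix):
    """Predict the class whose codeword is closest in Hamming distance
    to the vector of binary predictions."""
    bits = np.column_stack([h.predict(X) for h in learners])            # (n, L)
    dists = np.abs(bits[:, None, :] - code_matrix[None, :, :]).sum(2)   # (n, K)
    return dists.argmin(axis=1)


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    # Three well-separated Gaussian blobs, 100 points per class.
    centers = np.array([[0.0, 0.0], [4.0, 0.0], [0.0, 4.0]])
    y = np.repeat(np.arange(3), 100)
    X = rng.normal(size=(300, 2)) + centers[y]
    # A hypothetical 5-bit code for 3 classes; a real ECC design would be
    # chosen to maximize the Hamming separation between the rows.
    code = np.array([[0, 0, 1, 1, 0],
                     [1, 0, 0, 1, 1],
                     [0, 1, 1, 0, 1]])
    learners = train_ecoc(X, y, code)
    print("training accuracy:", (predict_ecoc(X, learners, code) == y).mean())
```

In this sketch the binary learners are trained independently; the boosting-based variants discussed in the abstract (AdaBoost.OC and AdaBoost.ECC) instead reweight training examples across rounds and weight the votes of the weak hypotheses, which is how they counteract error correlation among the binary problems.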


