Pattern Recognition Letters

Unifying multi-class AdaBoost algorithms with binary base learners under the margin framework

Abstract

The multi-class AdaBoost algorithms AdaBoost.MO, AdaBoost.ECC and AdaBoost.OC have received considerable attention in the literature, but the relationships among them have not been fully examined to date. In this paper we present a novel interpretation of the three algorithms, showing that MO and ECC perform stage-wise functional gradient descent on a cost function defined over margin values, and that OC is a shrinkage version of ECC. This interpretation allows us to rigorously explain properties of ECC and OC that had only been observed empirically in prior work. It also leads us to introduce shrinkage as regularization in MO and ECC, yielding two new algorithms: SMO and SECC. Experiments on diverse databases demonstrate the effectiveness of the proposed algorithms and validate our theoretical findings.
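To make the idea concrete, the sketch below illustrates output-coding boosting with binary base learners and a shrinkage factor, loosely in the spirit of AdaBoost.ECC and its shrunk variant. It is a minimal illustration under simplifying assumptions (random per-round class colorings, decision stumps as base learners, an exponential-loss weight update), not the authors' exact SMO/SECC procedure; all function and parameter names are hypothetical.

```python
# Minimal sketch: ECC-style multi-class boosting with binary base learners,
# plus a shrinkage factor (shrinkage < 1 gives an SECC-like regularized variant).
# The coloring scheme and weight update are simplified assumptions.
import numpy as np
from sklearn.tree import DecisionTreeClassifier


def ecc_boost(X, y, n_classes, n_rounds=100, shrinkage=1.0, seed=None):
    """Train an ECC-style ensemble; y holds integer class labels 0..n_classes-1."""
    rng = np.random.default_rng(seed)
    n = X.shape[0]
    w = np.full(n, 1.0 / n)                      # sample weights
    ensemble = []                                # (coloring, stump, alpha) triples
    for _ in range(n_rounds):
        # Random coloring: map each class to -1/+1 (one output-code column).
        coloring = rng.choice([-1.0, 1.0], size=n_classes)
        z = coloring[y]                          # binary relabeling of the data
        stump = DecisionTreeClassifier(max_depth=1)
        stump.fit(X, z, sample_weight=w)
        pred = stump.predict(X)
        err = np.clip(np.sum(w * (pred != z)), 1e-10, 1 - 1e-10)
        # Shrunk step size: this is where regularization by shrinkage enters.
        alpha = shrinkage * 0.5 * np.log((1 - err) / err)
        # Exponential-loss weight update on the induced binary problem.
        w *= np.exp(-alpha * z * pred)
        w /= w.sum()
        ensemble.append((coloring, stump, alpha))
    return ensemble


def ecc_predict(ensemble, X, n_classes):
    """Accumulate per-class margin scores and predict the arg-max class."""
    scores = np.zeros((X.shape[0], n_classes))
    for coloring, stump, alpha in ensemble:
        scores += alpha * np.outer(stump.predict(X), coloring)
    return scores.argmax(axis=1)
```

In this reading, each round adds a shrunk step along the functional gradient of a margin-based cost; setting shrinkage to 1 recovers the unregularized ECC-style update.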
