12th Annual Conference on Computational Learning Theory, Jul 6-9, 1999, Santa Cruz, California

Additive Models, Boosting, and Inference for Generalized Divergences



Abstract

We present a framework for designing incremental learning algorithms derived from generalized entropy functionals. Our approach is based on the use of Bregman divergences together with the associated class of additive models constructed using the Legendre transform. A particular one-parameter family of Bregman divergences is shown to yield a family of loss functions that includes the log-likelihood criterion of logistic regression as a special case, and that closely approximates the exponential loss criterion used in the AdaBoost algorithms of Schapire et al., as the natural parameter of the family varies. We also show how the quadratic approximation of the gain in Bregman divergence results in a weighted least-squares criterion. This leads to a family of incremental learning algorithms that builds upon and extends the recent interpretation of boosting in terms of additive models proposed by Friedman, Hastie, and Tibshirani.
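The interpolation between the logistic and exponential criteria described above can be illustrated numerically. The sketch below uses a simple one-parameter family, L_eta(m) = log(1 + eta * exp(-m)) / eta, which equals the logistic loss at eta = 1 and tends to the exponential loss as eta -> 0. This particular parameterization is an illustrative assumption, not necessarily the family constructed in the paper.

```python
import numpy as np

def logistic_loss(m):
    # Per-example negative log-likelihood criterion of logistic regression.
    return np.log1p(np.exp(-m))

def exponential_loss(m):
    # Criterion minimized by AdaBoost.
    return np.exp(-m)

def family_loss(m, eta):
    # Illustrative one-parameter family (an assumption, not the paper's
    # exact construction): eta = 1 recovers the logistic loss, and
    # eta -> 0 recovers the exponential loss, since log(1 + eta*x)/eta -> x.
    return np.log1p(eta * np.exp(-m)) / eta

margins = np.linspace(-2.0, 3.0, 6)
for eta in (1.0, 0.5, 0.1, 0.01):
    print(f"eta={eta:<5} {np.round(family_loss(margins, eta), 4)}")
print(f"logistic    {np.round(logistic_loss(margins), 4)}")
print(f"exponential {np.round(exponential_loss(margins), 4)}")
```

Running this shows the family matching the logistic loss exactly at eta = 1 and approaching the exponential loss as eta shrinks.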
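The claim that a quadratic approximation of the Bregman gain yields a weighted least-squares criterion can likewise be checked numerically: a second-order Taylor expansion of D_F(p, q) in p around q gives (1/2)(p - q)^T Hess F(q) (p - q), a least-squares criterion whose weights come from the Hessian of F at the current model. The sketch below uses negative entropy as an illustrative convex F (an assumption; the paper's entropy functionals may differ), for which the weights are 1/q_i.

```python
import numpy as np

def F(p):
    # Negative entropy: one convex choice, for illustration only.
    return np.sum(p * np.log(p))

def grad_F(p):
    return np.log(p) + 1.0

def hess_F(p):
    # Diagonal Hessian diag(1 / p_i): these become the least-squares weights.
    return np.diag(1.0 / p)

def bregman(p, q):
    # D_F(p, q) = F(p) - F(q) - <grad F(q), p - q>
    return F(p) - F(q) - grad_F(q) @ (p - q)

q = np.array([0.5, 0.3, 0.2])             # current model
p = q + np.array([0.01, -0.005, -0.005])  # small update, still sums to 1

exact = bregman(p, q)
weighted_ls = 0.5 * (p - q) @ hess_F(q) @ (p - q)  # (1/2) sum_i (p_i - q_i)^2 / q_i
print(f"Bregman divergence:     {exact:.6e}")
print(f"weighted least squares: {weighted_ls:.6e}")
```

For this choice of F the Bregman divergence is the KL divergence, and the quadratic approximation is half the chi-squared distance; the two printed values agree closely for small updates.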

