Statistics and Computing

Stochastic boosting algorithms



Abstract

In this article we develop a class of stochastic boosting (SB) algorithms, which build upon the work of Holmes and Pintore (Bayesian Stat. 8, Oxford University Press, Oxford, 2007). They introduce boosting algorithms which correspond to standard boosting (e.g. Buhlmann and Hothorn, Stat. Sci. 22:477-505, 2007) except that the optimization algorithms are randomized; this idea is placed within a Bayesian framework. We show that the inferential procedure in Holmes and Pintore (Bayesian Stat. 8, Oxford University Press, Oxford, 2007) is incorrect and further develop interpretational, computational and theoretical results which allow one to assess SB's potential for classification and regression problems. To use SB, sequential Monte Carlo (SMC) methods are applied. As a result, it is found that SB can provide better predictions for classification problems than the corresponding boosting algorithm. A theoretical result is also given, which shows that the predictions of SB are not significantly worse than boosting, when the latter provides the best prediction. We also investigate the method on a real case study from machine learning.
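The abstract describes stochastic boosting as standard boosting with the greedy optimization step randomized. The paper's actual procedure (and its SMC implementation) is not given here, so the following is only a minimal illustrative sketch of that idea: gradient boosting with decision stumps in which, each round, a stump is *sampled* with probability proportional to `exp(-loss / temperature)` rather than chosen greedily. All names (`stochastic_boost`, `stump_predictions`, `temperature`, the learning rate `nu`) are hypothetical and not from the paper.

```python
import math
import random

random.seed(0)

def stump_predictions(x, threshold):
    # A decision stump base learner: +1 if x > threshold, else -1.
    return [1.0 if xi > threshold else -1.0 for xi in x]

def stochastic_boost(x, y, n_rounds=200, nu=0.1, temperature=0.1):
    """Toy L2-boosting with stumps. Instead of greedily picking the
    best-fitting stump each round (standard boosting), a stump is sampled
    with probability proportional to exp(-loss / temperature) -- the
    randomized optimization step that characterizes stochastic boosting.
    As temperature -> 0 this recovers the greedy (deterministic) algorithm.
    """
    thresholds = sorted(set(x))
    F = [0.0] * len(y)  # current ensemble prediction
    for _ in range(n_rounds):
        residuals = [yi - Fi for yi, Fi in zip(y, F)]
        # Score every candidate stump on the residuals, with its
        # least-squares coefficient (h_i in {-1, +1}, so sum h_i^2 = n).
        candidates = []
        for t in thresholds:
            h = stump_predictions(x, t)
            coef = sum(hi * ri for hi, ri in zip(h, residuals)) / len(y)
            loss = sum((ri - coef * hi) ** 2 for ri, hi in zip(residuals, h))
            candidates.append((t, coef, loss))
        # Randomized selection: softmax over negative losses
        # (shifted by the minimum loss for numerical stability).
        m = min(l for _, _, l in candidates)
        weights = [math.exp(-(l - m) / temperature) for _, _, l in candidates]
        t, coef, _ = random.choices(candidates, weights=weights, k=1)[0]
        h = stump_predictions(x, t)
        F = [Fi + nu * coef * hi for Fi, hi in zip(F, h)]
    return F
```

Running this on a simple step-function target (`y = -1` for the first half of the inputs, `+1` for the second) drives the squared residuals down over the rounds, since the fitted coefficient means even a suboptimally sampled stump never increases the training loss. Averaging the ensembles from repeated randomized runs is one way such a scheme can outperform a single greedy fit.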
