International Scholarly Research Notices

Online Boosting Algorithm Based on Two-Phase SVM Training



Abstract

We describe and analyze a simple and effective two-step online boosting algorithm that lets us use efficient gradient descent-based methods developed for online SVM training without the need to fine-tune the kernel parameters, and we demonstrate its efficiency in several experiments. Our method is similar to AdaBoost in that it trains additional classifiers according to the weights provided by previously trained classifiers, but unlike AdaBoost, we use the hinge loss rather than the exponential loss and modify the algorithm for the online setting, allowing for a varying number of classifiers. We show that our theoretical convergence bounds are similar to those of earlier algorithms, while allowing for greater flexibility. Our approach may also easily incorporate additional nonlinearity in the form of Mercer kernels, although our experiments show that this is not necessary for most situations. The pre-training of the additional classifiers in our algorithm allows for greater accuracy while reducing the training times associated with the usual kernel-based approaches. We compare our algorithm to other online training algorithms, and we show that, for most cases with unknown kernel parameters, our algorithm outperforms other algorithms in both runtime and convergence speed.
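The abstract gives no code, so the following is only a minimal Python sketch of the general scheme it describes: an ensemble of linear classifiers trained online by SGD on the hinge loss, where each example is reweighted for later classifiers based on the margins of earlier ones, in the spirit of AdaBoost-style reweighting. The class name OnlineHingeBooster, the particular reweighting rule, and all hyperparameters (lr, reg, the weight cap) are illustrative assumptions and not the paper's actual two-phase algorithm.

```python
import numpy as np

class OnlineHingeBooster:
    """Illustrative sketch (not the paper's exact method): an online
    ensemble of linear hinge-loss classifiers, where each example's
    weight for classifier m grows with the margin error of
    classifiers 1..m-1, analogous to AdaBoost reweighting."""

    def __init__(self, n_features, n_classifiers=10, lr=0.01, reg=1e-4):
        self.W = np.zeros((n_classifiers, n_features))  # one weight vector per weak SVM
        self.b = np.zeros(n_classifiers)                # one bias per weak SVM
        self.lr = lr    # SGD step size (assumed hyperparameter)
        self.reg = reg  # L2 regularization strength (assumed hyperparameter)

    def partial_fit(self, x, y):
        """One online update; x is a feature vector, y is in {-1, +1}."""
        weight = 1.0  # running importance of this example for the next classifier
        for m in range(self.W.shape[0]):
            margin = y * (self.W[m] @ x + self.b[m])
            if margin < 1.0:
                # Hinge loss is active: subgradient step scaled by the example weight.
                self.W[m] += self.lr * (weight * y * x - self.reg * self.W[m])
                self.b[m] += self.lr * weight * y
            else:
                # Only the regularizer contributes when the margin is satisfied.
                self.W[m] -= self.lr * self.reg * self.W[m]
            # Reweight for the next classifier: low-margin or misclassified
            # examples gain influence (a hinge-based analogue of AdaBoost).
            weight *= np.exp(max(0.0, 1.0 - margin))
            weight = min(weight, 1e3)  # cap to keep updates numerically stable

    def predict(self, X):
        """Unweighted majority vote of the weak classifiers (ties map to 0)."""
        votes = np.sign(X @ self.W.T + self.b)
        return np.sign(votes.sum(axis=1))
```

A hypothetical usage on a toy stream, just to show the online interface:

```python
rng = np.random.default_rng(0)
model = OnlineHingeBooster(n_features=5, n_classifiers=8)
for _ in range(1000):
    x = rng.normal(size=5)
    y = 1 if x[0] + 0.5 * x[1] > 0 else -1
    model.partial_fit(x, y)
```

Note that this sketch processes every classifier on each example; the paper's two-phase pre-training of additional classifiers, its specific weighting scheme, and the optional Mercer-kernel extension are not reproduced here.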
