Neural Processing Letters > First and Second Order SMO Algorithms for LS-SVM Classifiers

First and Second Order SMO Algorithms for LS-SVM Classifiers



Abstract

Least squares support vector machine (LS-SVM) classifiers have traditionally been trained with conjugate gradient algorithms. In this work, completing the study by Keerthi et al., we explore the applicability of the SMO algorithm to the LS-SVM problem by comparing First Order and Second Order working set selections, concentrating on the RBF kernel, which is the most common choice in practice. It turns out that, over the full range of possible hyperparameter values, Second Order working set selection is altogether more convenient than First Order. In any case, whichever selection scheme is used, the number of kernel operations performed by SMO appears to scale quadratically with the number of patterns. Moreover, asymptotic convergence to the optimum is proved, and the rate of convergence is shown to be linear for both selections.
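To make the two selection schemes concrete, below is a minimal NumPy sketch of SMO for the LS-SVM dual with an RBF kernel. It is an illustration based on the standard LS-SVM dual (the regularized kernel matrix K + I/γ with the single equality constraint Σ y_i α_i = 0), not the authors' reference implementation; all function names, the stopping tolerance, and the toy parameters are assumptions. First Order selection picks the maximally violating pair (largest and smallest F values); Second Order picks the second index by maximizing the analytic decrease of the objective, as the abstract describes.

```python
import numpy as np

def rbf_kernel(X, Z, sigma=1.0):
    # Pairwise squared distances via ||x-z||^2 = ||x||^2 + ||z||^2 - 2 x.z
    d2 = (np.sum(X**2, axis=1)[:, None] + np.sum(Z**2, axis=1)[None, :]
          - 2.0 * X @ Z.T)
    return np.exp(-d2 / (2.0 * sigma**2))

def smo_lssvm(X, y, gamma=10.0, sigma=1.0, tol=1e-6, max_iter=10000,
              second_order=True):
    """SMO sketch for the LS-SVM dual: min 0.5 u^T Kt u - sum(alpha),
    with u = y*alpha, Kt = K + I/gamma, subject to sum(y*alpha) = 0."""
    n = len(y)
    Kt = rbf_kernel(X, X, sigma) + np.eye(n) / gamma   # effective kernel
    alpha = np.zeros(n)
    # F_k = sum_l y_l alpha_l Kt_kl - y_k; with alpha = 0 this is just -y
    F = -y.astype(float)
    for _ in range(max_iter):
        i = int(np.argmax(F))
        j_lo = int(np.argmin(F))
        if F[i] - F[j_lo] < tol:           # KKT: all F_k equal at the optimum
            break
        if second_order:
            # Second Order: choose j maximizing the gain (F_i - F_j)^2 / (2 eta)
            eta = Kt[i, i] + np.diag(Kt) - 2.0 * Kt[i, :]
            eta = np.maximum(eta, 1e-12)
            diff = F[i] - F
            gain = np.where(diff > 0, diff**2 / (2.0 * eta), -np.inf)
            gain[i] = -np.inf
            j = int(np.argmax(gain))
        else:
            j = j_lo                        # First Order: maximal violating pair
        eta_ij = Kt[i, i] + Kt[j, j] - 2.0 * Kt[i, j]
        delta = (F[j] - F[i]) / eta_ij      # exact minimizer along the direction
        alpha[i] += y[i] * delta            # update keeps sum(y*alpha) = 0
        alpha[j] -= y[j] * delta
        F += delta * (Kt[:, i] - Kt[:, j])  # rank-2 gradient update
    b = -0.5 * (F.max() + F.min())
    return alpha, b

def predict(X_train, y, alpha, b, X_test, sigma=1.0):
    # Decision function uses the plain kernel; 1/gamma only enters training.
    return np.sign(rbf_kernel(X_test, X_train, sigma) @ (alpha * y) + b)
```

On a small problem (e.g. the XOR points with an RBF kernel) both selections reach the same solution; the practical difference reported in the abstract is in how many kernel evaluations each scheme needs over the hyperparameter range.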


