IEEE Transactions on Neural Networks and Learning Systems

Simple Proof of Convergence of the SMO Algorithm for Different SVM Variants


Abstract

In this brief, we give a new proof of the asymptotic convergence of the sequential minimal optimization (SMO) algorithm for both the most violating pair and second-order rules to select the pair of coefficients to be updated. The proof is more self-contained, shorter, and simpler than previous ones and has a different flavor, partially building upon Gilbert's original convergence proof of his algorithm to solve the minimum norm problem for convex hulls. It is valid for both support vector classification (SVC) and support vector regression, which are formulated under a general problem that encompasses them. Moreover, this general problem can be further extended to also cover other support vector machine (SVM)-related problems such as $\nu$-SVC or one-class SVMs, while the convergence proof of the slight variant of SMO needed for them remains basically unchanged.
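For readers unfamiliar with the algorithm the abstract discusses, the following is a minimal illustrative sketch (not the paper's own code) of SMO with the most-violating-pair working-set selection for the standard SVC dual, using a linear kernel and a hypothetical function name `smo_svc`:

```python
import numpy as np

def smo_svc(X, y, C=1.0, tol=1e-3, max_iter=10_000):
    """Minimal SMO for the SVC dual with most-violating-pair selection.

    Dual problem: min (1/2) a'Qa - e'a  s.t.  y'a = 0, 0 <= a <= C,
    where Q_ij = y_i * y_j * <x_i, x_j> (linear kernel for simplicity).
    """
    n = len(y)
    K = X @ X.T                           # linear kernel matrix
    Q = (y[:, None] * y[None, :]) * K
    a = np.zeros(n)                       # a = 0 is dual feasible
    grad = -np.ones(n)                    # gradient of the dual: Q a - e
    for _ in range(max_iter):
        yg = -y * grad                    # violation scores
        # index sets along which a coordinate may still move
        up = ((y > 0) & (a < C)) | ((y < 0) & (a > 0))
        low = ((y > 0) & (a > 0)) | ((y < 0) & (a < C))
        i = np.where(up)[0][np.argmax(yg[up])]    # most violating pair:
        j = np.where(low)[0][np.argmin(yg[low])]  # max over up, min over low
        if yg[i] <= yg[j] + tol:          # KKT conditions hold up to tol
            break
        # largest feasible step keeping both coordinates in [0, C]
        t_max_i = C - a[i] if y[i] > 0 else a[i]
        t_max_j = a[j] if y[j] > 0 else C - a[j]
        quad = Q[i, i] + Q[j, j] - 2 * y[i] * y[j] * Q[i, j]
        t = min((yg[i] - yg[j]) / max(quad, 1e-12), t_max_i, t_max_j)
        a[i] += y[i] * t                  # update preserves y'a = 0
        a[j] -= y[j] * t
        grad += Q[:, i] * (y[i] * t) - Q[:, j] * (y[j] * t)
    b = (yg[i] + yg[j]) / 2               # bias from the final pair
    return a, b
```

The stopping test `yg[i] <= yg[j] + tol` is exactly the "no violating pair remains" condition whose limit behavior the convergence proofs discussed above analyze; the second-order rule mentioned in the abstract would replace the `argmin` choice of `j` with one that also weighs the curvature term `quad`.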

