International Conference on Neural Information Processing

A Novel Sequential Minimal Optimization Algorithm for Support Vector Regression


Abstract

A novel sequential minimal optimization (SMO) algorithm for support vector regression is proposed. The algorithm is based on Flake and Lawrence's SMO, which solves convex optimization problems in l variables instead of the standard quadratic programming problem in 2l variables, where l is the number of training samples; its working-set selection strategy, however, is quite different. Experimental results show that the proposed algorithm is much faster than Flake and Lawrence's SMO and comparable in speed to the fastest conventional SMO.
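
For context, here is a sketch of the two dual formulations the abstract contrasts, assuming the usual epsilon-insensitive SVR setup with kernel K, regularization parameter C, targets y_i, and the substitution \beta_i = \alpha_i - \alpha_i^* commonly associated with Flake and Lawrence's reformulation; the exact notation used in the paper may differ.

Standard SVR dual in 2l variables (\alpha_i, \alpha_i^*):
\[
\max_{\alpha,\alpha^*} \; -\frac{1}{2}\sum_{i,j=1}^{l}(\alpha_i-\alpha_i^*)(\alpha_j-\alpha_j^*)K(x_i,x_j)
\;-\; \varepsilon\sum_{i=1}^{l}(\alpha_i+\alpha_i^*) \;+\; \sum_{i=1}^{l} y_i(\alpha_i-\alpha_i^*)
\]
\[
\text{subject to } \sum_{i=1}^{l}(\alpha_i-\alpha_i^*)=0, \qquad 0 \le \alpha_i,\alpha_i^* \le C.
\]

Reformulation in l variables via \beta_i = \alpha_i - \alpha_i^*:
\[
\min_{\beta} \; \frac{1}{2}\sum_{i,j=1}^{l}\beta_i\beta_j K(x_i,x_j) \;-\; \sum_{i=1}^{l} y_i\beta_i \;+\; \varepsilon\sum_{i=1}^{l}|\beta_i|
\]
\[
\text{subject to } \sum_{i=1}^{l}\beta_i = 0, \qquad -C \le \beta_i \le C.
\]

The substitution halves the number of variables, but the \varepsilon\sum_i|\beta_i| term is piecewise linear, so the subproblems are convex rather than standard quadratic programs, which matches the distinction drawn in the abstract.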
