International Conference on Machine Learning and Cybernetics

Single sequential minimal optimization: an improved SVMs training algorithm



Abstract

We introduce homogeneous coordinates to represent support vector machines (SVMs) and develop a corresponding training algorithm: single sequential minimal optimization (SSMO). With this simple trick (the homogeneous-coordinates representation), no linear constraint appears in the quadratic programming (QP) optimization problem. Thus, unlike the most widely used SVM training algorithm, sequential minimal optimization (SMO), which must solve a QP subproblem containing at least two Lagrange multipliers, SSMO can analytically update a single Lagrange multiplier at each step. By avoiding the double loop that SMO uses to heuristically choose its two Lagrange multipliers, both CPU time and iteration count are reduced greatly. Experiments on the MNIST database, under a mild KKT-condition accuracy requirement, show that SSMO can be more than 2 times faster than SMO.
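The key idea can be illustrated with a minimal sketch (not the paper's exact implementation): appending a constant-1 feature folds the bias into the weight vector, so the dual problem keeps only box constraints `0 <= alpha_i <= C` and loses the equality constraint `sum(alpha_i * y_i) = 0`. Each multiplier can then be updated analytically on its own by a clipped one-dimensional Newton step. The function name and toy data below are illustrative assumptions.

```python
import numpy as np

def ssmo_train(X, y, C=1.0, tol=1e-3, max_passes=50):
    """Coordinate-wise dual ascent for a bias-free (homogeneous) linear SVM.

    With the constant-1 feature appended, the dual has no linear equality
    constraint, so each alpha_i is updated analytically in isolation.
    """
    Xh = np.hstack([X, np.ones((X.shape[0], 1))])  # homogeneous coordinates
    K = Xh @ Xh.T                                   # linear-kernel Gram matrix
    n = X.shape[0]
    alpha = np.zeros(n)
    for _ in range(max_passes):
        changed = False
        for i in range(n):
            # f(x_i) = sum_j alpha_j y_j K(x_j, x_i)
            f_i = (alpha * y) @ K[:, i]
            # 1-D Newton step on the dual objective, then clip to the box
            new_ai = alpha[i] + (1.0 - y[i] * f_i) / K[i, i]
            new_ai = min(max(new_ai, 0.0), C)
            if abs(new_ai - alpha[i]) > tol:
                alpha[i] = new_ai
                changed = True
        if not changed:  # all KKT conditions met within tolerance
            break
    # Recover the augmented weight vector (last entry plays the role of b)
    return (alpha * y) @ Xh

# Toy usage on two linearly separable clusters
X = np.array([[2.0, 2.0], [3.0, 3.0], [-2.0, -2.0], [-3.0, -3.0]])
y = np.array([1.0, 1.0, -1.0, -1.0])
w = ssmo_train(X, y)
preds = np.sign(np.hstack([X, np.ones((4, 1))]) @ w)
```

Note that the update touches exactly one multiplier per step; SMO, by contrast, must move a pair jointly to stay on the equality-constraint hyperplane, which is what forces its heuristic double loop.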
