International Conference on Neural Information Processing

Stochastic Sequential Minimal Optimization for Large-Scale Linear SVM

Abstract

Linear support vector machine (SVM) is a popular tool in machine learning. Compared with nonlinear SVM, linear SVM delivers competitive performance and is more efficient at tackling large-scale, high-dimensional tasks. Various algorithms have been developed to speed up its training, such as Liblinear, SVM-perf, and Pegasos. In this paper, we propose a new fast algorithm for linear SVMs based on stochastic sequential minimal optimization (SSMO). There are two main differences between our algorithm and other linear SVM solvers: our algorithm updates two variables simultaneously rather than a single variable, and it retains the bias term b in the discriminant function. Experiments indicate that the proposed algorithm is much faster than state-of-the-art solvers such as Liblinear, and achieves higher classification accuracy.
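
The abstract only outlines the method, so the sketch below is an illustration rather than the authors' exact SSMO procedure: it applies the classical two-variable SMO pair update with a linear kernel and uniformly random pair selection, while maintaining the weight vector w and the bias term b explicitly, which matches the two properties the abstract highlights. The function name ssmo_epoch, the hyperparameters (C, number of epochs), and the toy data are assumptions made for illustration; the paper's actual working-set selection and stopping criterion are not described in the abstract.

```python
# Minimal sketch of a stochastic two-variable SMO-style update for a linear SVM
# with an explicit bias term b. NOT the authors' exact SSMO algorithm: this is the
# standard SMO pair update (linear kernel) with random pair selection, shown here
# only to make the "update two variables, keep b" idea concrete.
import numpy as np

def ssmo_epoch(X, y, alpha, w, b, C=1.0, tol=1e-12, rng=None):
    """One stochastic pass: pick random pairs (i, j) and solve the two-variable
    subproblem analytically, keeping sum_k alpha_k * y_k = 0 satisfied."""
    rng = np.random.default_rng() if rng is None else rng
    n = X.shape[0]
    for _ in range(n):
        i, j = rng.choice(n, size=2, replace=False)
        E_i = X[i] @ w + b - y[i]          # prediction errors on the pair
        E_j = X[j] @ w + b - y[j]
        eta = X[i] @ X[i] + X[j] @ X[j] - 2.0 * (X[i] @ X[j])
        if eta < tol:                      # degenerate pair, skip it
            continue
        # Box bounds that keep the equality constraint satisfied.
        if y[i] != y[j]:
            L, H = max(0.0, alpha[j] - alpha[i]), min(C, C + alpha[j] - alpha[i])
        else:
            L, H = max(0.0, alpha[i] + alpha[j] - C), min(C, alpha[i] + alpha[j])
        if L >= H:
            continue
        aj_new = np.clip(alpha[j] + y[j] * (E_i - E_j) / eta, L, H)
        ai_new = alpha[i] + y[i] * y[j] * (alpha[j] - aj_new)
        di, dj = ai_new - alpha[i], aj_new - alpha[j]
        # Maintain w explicitly so each pair update costs O(d).
        w += y[i] * di * X[i] + y[j] * dj * X[j]
        # Update the bias term b from the KKT conditions of the pair.
        b1 = b - E_i - y[i] * di * (X[i] @ X[i]) - y[j] * dj * (X[i] @ X[j])
        b2 = b - E_j - y[i] * di * (X[i] @ X[j]) - y[j] * dj * (X[j] @ X[j])
        if 0.0 < ai_new < C:
            b = b1
        elif 0.0 < aj_new < C:
            b = b2
        else:
            b = 0.5 * (b1 + b2)
        alpha[i], alpha[j] = ai_new, aj_new
    return alpha, w, b

# Toy usage on synthetic linearly separable data (labels in {-1, +1}).
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 5))
y = np.where(X[:, 0] + 0.5 * X[:, 1] > 0, 1.0, -1.0)
alpha, w, b = np.zeros(len(y)), np.zeros(X.shape[1]), 0.0
for _ in range(20):
    alpha, w, b = ssmo_epoch(X, y, alpha, w, b, C=1.0, rng=rng)
print("train accuracy:", np.mean(np.sign(X @ w + b) == y))
```

Keeping w up to date makes each pair update O(d) in the feature dimension, independent of the number of training examples, which is the property that makes SMO-style dual updates viable at large scale.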