International Journal of Pattern Recognition and Artificial Intelligence

TRAINING SUPPORT VECTOR MACHINES USING FRANK-WOLFE OPTIMIZATION METHODS

Abstract

Training a support vector machine (SVM) requires the solution of a quadratic programming (QP) problem whose computational cost becomes prohibitive for large-scale datasets. Traditional optimization methods cannot be applied directly in these cases, mainly due to memory restrictions. By adopting a slightly different objective function and under mild conditions on the kernel used within the model, efficient algorithms to train SVMs have been devised under the name of core vector machines (CVMs). This framework exploits the equivalence of the resulting learning problem with the task of building a minimal enclosing ball (MEB) in a feature space, where data is implicitly embedded by a kernel function. In this paper, we improve on the CVM approach by proposing two novel methods to build SVMs based on the Frank-Wolfe algorithm, recently revisited as a fast method to approximate the solution of an MEB problem. In contrast to CVMs, our algorithms do not require computing the solutions of a sequence of increasingly complex QPs and are defined using only analytic optimization steps. Experiments on a large collection of datasets show that our methods scale better than CVMs in most cases, sometimes at the price of slightly lower accuracy. Like CVMs, the proposed methods can be easily extended to machine learning problems other than binary classification. However, effective classifiers are also obtained using kernels which do not satisfy the condition required by CVMs, so our methods apply to a wider set of problems.
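The abstract's core idea — approximating an MEB with analytic Frank-Wolfe steps instead of solving a sequence of QPs — can be illustrated with a minimal sketch in the plain Euclidean (linear-kernel) case. The classical Badoiu-Clarkson update shown below is one well-known Frank-Wolfe instance for the MEB problem; the function name and the 1/(k+2) step-size schedule are illustrative and are not the paper's exact algorithms.

```python
import numpy as np

def meb_frank_wolfe(X, iters=2000):
    """Approximate the minimal enclosing ball of the rows of X using
    the Badoiu-Clarkson variant of the Frank-Wolfe algorithm.

    Each iteration linearizes the MEB objective: the maximizing vertex
    of the simplex corresponds to the point furthest from the current
    center, and the center moves toward it with step size 1/(k+2).
    """
    c = X[0].copy()  # initialize the center at an arbitrary data point
    for k in range(iters):
        d = np.linalg.norm(X - c, axis=1)
        j = np.argmax(d)            # furthest point = Frank-Wolfe vertex
        gamma = 1.0 / (k + 2)       # standard diminishing step size
        c = (1.0 - gamma) * c + gamma * X[j]
    r = np.linalg.norm(X - c, axis=1).max()
    return c, r
```

Note that every step is analytic — a distance computation, an argmax, and a convex combination — which is exactly the property the abstract contrasts with CVMs' inner QP solves; in the kernelized setting the distances would be evaluated through the kernel matrix instead.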
