Journal: Pattern Analysis and Applications

Proximal gradient method for huberized support vector machine



Abstract

The support vector machine (SVM) has been used in a wide variety of classification problems. The original SVM uses the hinge loss function, which is non-differentiable and makes the problem difficult to solve, particularly for regularized SVMs such as those with ℓ1-regularization. This paper considers the huberized SVM (HSVM), which uses a differentiable approximation of the hinge loss function. We first explore the use of the proximal gradient (PG) method for solving the binary-class HSVM (B-HSVM) and then generalize it to the multi-class HSVM (M-HSVM). Under strong convexity assumptions, we show that our algorithm converges linearly. In addition, we give a finite convergence result about the support of the solution, based on which we further accelerate the algorithm by a two-stage method. We present extensive numerical experiments on both synthetic and real datasets which demonstrate the superiority of our methods over some state-of-the-art methods for both binary- and multi-class SVMs.
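The abstract's core idea — replacing the non-differentiable hinge loss with a smooth huberized surrogate and handling the ℓ1 term via a proximal step — can be sketched as a basic ISTA-style proximal gradient loop. This is a minimal illustration, not the paper's actual algorithm (which adds a two-stage acceleration); the function names, the smoothing parameter `delta`, and the regularization weights `lam1`/`lam2` are choices made here for the example.

```python
import numpy as np

def huber_hinge_grad(w, b, X, y, delta=0.5):
    """Mean huberized hinge loss and its gradient (smooth hinge surrogate).

    phi(t) = 0 if t > 1; (1-t)^2/(2*delta) if 1-delta < t <= 1;
             1 - t - delta/2 if t <= 1 - delta, with t = y*(Xw + b).
    """
    t = y * (X @ w + b)
    loss = np.where(t > 1, 0.0,
           np.where(t < 1 - delta, 1 - t - delta / 2, (1 - t) ** 2 / (2 * delta)))
    # Derivative of phi with respect to t on each piece.
    dt = np.where(t > 1, 0.0,
         np.where(t < 1 - delta, -1.0, -(1 - t) / delta))
    gw = X.T @ (dt * y) / len(y)
    gb = np.sum(dt * y) / len(y)
    return loss.mean(), gw, gb

def soft_threshold(v, tau):
    """Proximal operator of tau * ||.||_1 (componentwise shrinkage)."""
    return np.sign(v) * np.maximum(np.abs(v) - tau, 0.0)

def pg_bhsvm(X, y, lam1=0.01, lam2=0.01, delta=0.5, step=None, iters=500):
    """Proximal gradient for an elastic-net-regularized binary huberized SVM."""
    n, p = X.shape
    w, b = np.zeros(p), 0.0
    if step is None:
        # Lipschitz constant estimate for the smooth part's gradient:
        # phi'' <= 1/delta, plus the smooth l2 term.
        L = np.linalg.norm(X, 2) ** 2 / (n * delta) + lam2
        step = 1.0 / L
    for _ in range(iters):
        _, gw, gb = huber_hinge_grad(w, b, X, y, delta)
        gw += lam2 * w                                   # smooth l2 penalty
        w = soft_threshold(w - step * gw, step * lam1)   # prox of the l1 penalty
        b = b - step * gb                                # bias is unregularized
    return w, b
```

The soft-thresholding step is exactly where the ℓ1 non-smoothness is handled; because the huberized loss is differentiable with a 1/δ-Lipschitz gradient, the fixed step size 1/L guarantees descent, and (per the paper's claim) strong convexity of the smooth part yields linear convergence.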

