Neural Processing Letters

An Efficient SMO Algorithm for Solving Non-smooth Problem Arising in ε-Insensitive Support Vector Regression

Abstract

Classical support vector regression (C-SVR) is a powerful function approximation method that is robust against noise and generalizes well, since it is formulated via a regularized error function employing the ε-insensitivity property. To exploit the kernel trick, C-SVR generally solves the Lagrangian dual problem. In this paper, an efficient sequential minimal optimization (SMO) algorithm is proposed for solving a convex non-smooth dual optimization problem, obtained by reformulating the dual problem of C-SVR with the ℓ2 error loss function, which is equivalent to the ε-insensitive version of LSSVR; the algorithm uses a novel, easy-to-compute working set selection (WSS) based on minimizing an upper bound on the difference between consecutive loss function values. The asymptotic convergence of the proposed SMO algorithm to the optimum is also proved. The proposed SMO algorithm for the non-smooth problem subsumes the SMO algorithms for solving both LSSVR and C-SVR; indeed, it becomes equivalent to the SMO algorithm with second-order WSS for solving LSSVR when ε = 0. The proposed algorithm has the advantage of dealing with half as many optimization variables as C-SVR, which results in fewer kernel-related matrix evaluations than the standard SMO algorithm developed for C-SVR and increases the probability that the required matrix outputs have already been precomputed and cached. Therefore, the proposed SMO algorithm achieves better training times than the standard SMO algorithm for solving C-SVR, especially when caching is used. Moreover, the superiority of the proposed WSS over its first-order counterpart for solving the non-smooth optimization problem is demonstrated.
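To make the construction concrete, below is a minimal, self-contained Python sketch of a coordinate-wise SMO on the half-variable non-smooth dual, assuming the bias-free ℓ2-loss formulation in which the reformulated dual takes the form min_β ½βᵀ(K + I/C)β − yᵀβ + ε‖β‖₁. This is an illustration under stated assumptions, not the paper's implementation: the names smo_eps_l2_svr and rbf_kernel are hypothetical, and the greedy rule that picks the coordinate with the largest exact one-step decrease merely stands in for the paper's upper-bound-based second-order WSS.

```python
import numpy as np

def rbf_kernel(X, gamma=1.0):
    # Gram matrix K[i, j] = exp(-gamma * ||x_i - x_j||^2)
    sq = np.sum(X**2, axis=1)
    d2 = sq[:, None] + sq[None, :] - 2.0 * X @ X.T
    return np.exp(-gamma * np.maximum(d2, 0.0))

def soft_threshold(z, lam):
    # Closed-form minimizer of 0.5*(u - z)^2 * (1/1) + lam*|u| scaled form
    return np.sign(z) * np.maximum(np.abs(z) - lam, 0.0)

def smo_eps_l2_svr(K, y, C=10.0, eps=0.1, max_iter=1000, tol=1e-6):
    """Coordinate-wise SMO sketch (assumed formulation) for
        min_beta 0.5 beta^T (K + I/C) beta - y^T beta + eps * ||beta||_1,
    the half-variable non-smooth dual described in the abstract.
    WSS here: greedily pick the coordinate with the largest exact
    one-step decrease -- an illustrative stand-in for the paper's
    second-order, upper-bound-based selection rule."""
    n = len(y)
    Q = K + np.eye(n) / C           # regularized Gram matrix (positive definite)
    d = np.diag(Q)                  # Q_ii > 0, so 1-D subproblems are well posed
    beta = np.zeros(n)
    grad = -y.copy()                # gradient of the smooth part: Q beta - y
    for _ in range(max_iter):
        # Exact single-coordinate minimizers via soft-thresholding
        u = soft_threshold(beta - grad / d, eps / d)
        t = u - beta                # candidate step for each coordinate
        # Exact objective decrease obtained by each candidate step
        dec = -(grad * t + 0.5 * d * t**2
                + eps * (np.abs(u) - np.abs(beta)))
        i = int(np.argmax(dec))
        if dec[i] < tol:            # no coordinate improves: (near-)optimal
            break
        beta[i] = u[i]
        grad += Q[:, i] * t[i]      # rank-one gradient update, O(n) per step
    return beta

# Toy usage: fit a noisy sine; predict via f(x) = sum_i beta_i k(x_i, x)
rng = np.random.default_rng(0)
X = rng.uniform(-3, 3, size=(80, 1))
y = np.sin(X).ravel() + 0.05 * rng.standard_normal(80)
K = rbf_kernel(X, gamma=0.5)
beta = smo_eps_l2_svr(K, y, C=10.0, eps=0.05)
print("support vectors (beta != 0):", int(np.sum(beta != 0)))
print("train RMSE:", float(np.sqrt(np.mean((K @ beta - y) ** 2))))
```

Without a bias term the equality constraint of the standard SVR dual disappears, so single-coordinate updates suffice; the soft-thresholding step is the closed-form minimizer of the one-dimensional piecewise-quadratic subproblem, and at ε = 0 the update reduces to plain coordinate descent on the LSSVR system (K + I/C)β = y, mirroring the equivalence stated in the abstract.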
