Communications in Statistics

Iteratively reweighted least square for asymmetric L_2-Loss support vector regression


Abstract

In the support vector regression (SVR) model, using the squared ε-insensitive loss function makes the objective function of the optimization problem strictly convex and yields a more concise solution. However, the formulation leads to a quadratic program that is expensive to solve. This paper reformulates the optimization problem by absorbing the constraints into the objective function; the new formulation resembles a weighted least squares regression problem. Based on this formulation, we propose an iteratively reweighted least squares approach to train the L_2-loss SVR, for both linear and nonlinear models. The proposed approach is easy to implement, requiring no computing package beyond basic linear algebra operations. Numerical studies on real-world datasets show that, compared to the alternatives, the proposed approach achieves similar prediction accuracy with substantially higher time efficiency.
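The abstract's recipe (absorb the constraints into the objective, then alternate between identifying points outside the ε-tube and solving a regularized least squares problem on them) can be sketched as follows for the linear model. This is a minimal illustration, not the paper's exact algorithm: the function name, parameters, stopping rule, and the bias-absorbing trick are all assumptions made for the example.

```python
import numpy as np

def irls_l2_svr(X, y, C=1.0, eps=0.1, max_iter=200, tol=1e-8):
    """Illustrative IRLS sketch for linear L2-loss SVR.

    Minimizes 0.5*||w||^2 + C * sum_i max(0, |y_i - x_i.w - b| - eps)^2,
    i.e. the squared eps-insensitive loss with the constraints absorbed
    into the objective. Names and defaults here are hypothetical.
    """
    n, d = X.shape
    Xb = np.hstack([X, np.ones((n, 1))])   # absorb the bias b as an extra weight
    w = np.zeros(d + 1)
    for _ in range(max_iter):
        r = y - Xb @ w
        active = np.abs(r) > eps           # points outside the eps-tube
        if not active.any():               # every residual inside the tube
            break
        # shift each active target onto the nearer tube boundary
        yt = y[active] - eps * np.sign(r[active])
        A = Xb[active]
        R = np.eye(d + 1)
        R[d, d] = 0.0                      # do not regularize the bias term
        # normal equations of the weighted least squares subproblem:
        # (R + 2C * A^T A) w = 2C * A^T yt
        w_new = np.linalg.solve(R + 2.0 * C * (A.T @ A), 2.0 * C * (A.T @ yt))
        if np.linalg.norm(w_new - w) < tol * (1.0 + np.linalg.norm(w)):
            w = w_new
            break
        w = w_new
    return w[:d], w[d]
```

Each iteration only needs a dense linear solve, which is consistent with the abstract's claim that nothing beyond basic linear algebra operations is required; the active set typically stabilizes after a few passes on well-conditioned data.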


