Journal: Soft computing: A fusion of foundations, methodologies and applications

A robust algorithm of support vector regression with a trimmed Huber loss function in the primal



Abstract

Support vector machine for regression (SVR) is an efficient tool for solving function estimation problems. However, it is sensitive to outliers because its loss function is unbounded. To reduce the effect of outliers, we propose a robust SVR with a trimmed Huber loss function (SVRT) in this paper. Synthetic and benchmark datasets were employed to assess the performance of SVRT, and its results were compared with those of SVR, least-squares SVR (LS-SVR) and a weighted LS-SVR. The numerical tests show that when training samples are subject to normally distributed errors, SVRT is slightly less accurate than SVR and LS-SVR, yet more accurate than the weighted LS-SVR. However, when training samples are contaminated by outliers, SVRT performs better than the other methods. Furthermore, SVRT is faster than the weighted LS-SVR. Experiments on eight benchmark datasets show that SVRT is, on average, more accurate than the other methods when sample points are contaminated by outliers. In conclusion, SVRT can be considered an alternative robust method for modeling contaminated sample points.
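The robustness argument in the abstract rests on bounding the loss contributed by any single residual. As a minimal sketch of that idea, the snippet below implements the standard Huber loss and a "trimmed" variant obtained by truncating it at a ceiling; the parameter names `delta` and `cap` and the specific truncation are illustrative assumptions, not the paper's exact formulation.

```python
import numpy as np

def huber(r, delta=1.0):
    """Standard Huber loss: quadratic near zero, linear in the tails (unbounded)."""
    a = np.abs(r)
    return np.where(a <= delta, 0.5 * r**2, delta * (a - 0.5 * delta))

def trimmed_huber(r, delta=1.0, cap=2.0):
    """Huber loss truncated at `cap`, so an outlier's contribution is bounded.

    `delta` and `cap` are illustrative hyperparameters chosen here for the
    sketch; the paper's trimming rule may differ in detail.
    """
    return np.minimum(huber(r, delta), cap)

residuals = np.array([0.1, 0.5, 1.5, 10.0])  # last entry mimics an outlier
print(huber(residuals))           # loss grows linearly with the outlier
print(trimmed_huber(residuals))   # outlier contribution is capped
```

With the unbounded Huber loss the residual 10.0 contributes 9.5 to the objective, while the trimmed version caps it at 2.0, which is why a fit under the trimmed loss is pulled far less by contaminated sample points.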
