
A robust weighted least squares support vector regression based on least trimmed squares



Abstract

In order to improve the robustness of the classical LSSVM when dealing with sample points in the presence of outliers, we have developed a robust weighted LSSVM (reweighted LSSVM) based on the least trimmed squares (LTS) technique. The procedure of the reweighted LSSVM consists of two stages, used respectively to increase the robustness and the statistical efficiency of the estimator. In the first stage, an LTS-based LSSVM (LSSVM-LTS) with C-steps was adopted to obtain robust results at the cost of losing statistical efficiency to some extent. Therefore, in the second stage, the results computed in the first stage were refined with a weighted LSSVM to improve efficiency. Two groups of examples, including numerical tests and real-world benchmark examples, were employed to compare the robustness of the reweighted LSSVM with those of the classical LSSVM, the weighted LSSVM and LSSVM-LTS. The numerical tests indicate that the reweighted LSSVM is comparable to the weighted LSSVM and more accurate than the classical LSSVM and LSSVM-LTS when the contamination proportion is small (i.e. 0.1 and 0.2), whereas as the contamination proportion increases, the reweighted LSSVM performs much better than the other methods. The real-world example of regressing seven benchmark datasets demonstrates that the reweighted LSSVM is always more accurate than the other versions of LSSVM. In conclusion, the newly developed method can be considered as an alternative for function estimation, especially for sample points in the presence of outliers. (C) 2015 Elsevier B.V. All rights reserved.
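The abstract describes the two-stage procedure only at a high level. The Python sketch below illustrates the general idea under assumptions not stated in the abstract: an RBF kernel, a trimming fraction of 0.7, a capped number of C-steps, and Suykens-style weights computed from robust residuals. It is an illustrative sketch, not the authors' implementation; all function names and parameter values are hypothetical.

    import numpy as np

    def rbf_kernel(A, B, sigma=1.0):
        # Gaussian RBF kernel matrix between the rows of A and B
        d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
        return np.exp(-d2 / (2.0 * sigma ** 2))

    def lssvm_fit(X, y, gamma=10.0, sigma=1.0, weights=None):
        # Solve the (weighted) LSSVM linear system for the bias b and coefficients alpha
        n = len(y)
        v = np.ones(n) if weights is None else weights
        A = np.zeros((n + 1, n + 1))
        A[0, 1:] = 1.0
        A[1:, 0] = 1.0
        A[1:, 1:] = rbf_kernel(X, X, sigma) + np.diag(1.0 / (gamma * v))
        sol = np.linalg.solve(A, np.concatenate(([0.0], y)))
        return sol[0], sol[1:]  # b, alpha

    def lssvm_predict(X_train, alpha, b, X_new, sigma=1.0):
        return rbf_kernel(X_new, X_train, sigma) @ alpha + b

    def reweighted_lssvm(X, y, trim_frac=0.7, gamma=10.0, sigma=1.0, max_csteps=20, seed=0):
        n = len(y)
        h = int(np.ceil(trim_frac * n))            # size of the trimmed subset
        rng = np.random.default_rng(seed)
        idx = rng.choice(n, h, replace=False)      # random initial subset
        # Stage 1: LTS-style C-steps -- refit on the h points with the smallest residuals
        for _ in range(max_csteps):
            b, alpha = lssvm_fit(X[idx], y[idx], gamma, sigma)
            resid = y - lssvm_predict(X[idx], alpha, b, X, sigma)
            new_idx = np.argsort(np.abs(resid))[:h]
            if set(new_idx) == set(idx):
                break
            idx = new_idx
        # Stage 2: weight every sample from the robust residuals and refit on all data
        s = 1.4826 * np.median(np.abs(resid - np.median(resid)))  # robust scale (MAD)
        r = np.abs(resid) / s
        w = np.where(r <= 2.5, 1.0, np.where(r <= 3.0, (3.0 - r) / 0.5, 1e-4))
        b, alpha = lssvm_fit(X, y, gamma, sigma, weights=w)
        return alpha, b

    # Toy usage: sinc regression with 20% of the targets contaminated by outliers
    rng = np.random.default_rng(1)
    X = np.linspace(-5.0, 5.0, 100)[:, None]
    y = np.sinc(X[:, 0]) + 0.05 * rng.standard_normal(100)
    y[rng.choice(100, 20, replace=False)] += 2.0
    alpha, b = reweighted_lssvm(X, y)
    y_hat = lssvm_predict(X, alpha, b, X)

The C-step loop mirrors the usual LTS idea of iterating "fit, rank residuals, keep the best h points" until the subset stabilizes, while the second stage recovers efficiency by reusing all samples with downweighted outliers.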

Bibliographic Record

  • Source
    Neurocomputing | 2015, Issue 30 | pp. 941-946 | 6 pages
  • Author Affiliations

    Shandong Univ Sci & Technol, Shandong Prov & Minist Sci & Technol, State Key Lab Min Disaster Prevent & Control, Qingdao 266590, Peoples R China|Shandong Univ Sci & Technol, Coll Geomat, Qingdao 266590, Peoples R China;

    Shandong Univ Sci & Technol, Dept Informat Engn, Tai An 271019, Shandong, Peoples R China;

    Wuhan Univ, Sch Geodesy & Geomat, Wuhan 430072, Peoples R China;

  • Indexed in: Science Citation Index (SCI), USA; Engineering Index (EI), USA
  • Format: PDF
  • Language: eng
  • CLC Classification
  • Keywords

    Outlier; Least squares support vector regression; Least trimmed squares; Robust; Efficiency;

