
A novel online adaptive kernel method with kernel centers determined by a support vector regression approach


Abstract

The optimality of the kernel number and kernel centers plays a significant role in determining the approximation power of nearly all kernel methods. However, choosing optimal kernels is usually formulated as a global optimization task, which is hard to accomplish. Recently, an improved algorithm called recursive reduced least squares support vector regression (IRR-LSSVR) was proposed for establishing a global nonparametric offline model; it demonstrates a significant advantage over other methods in selecting representative support vectors. Inspired by the IRR-LSSVR, a new online adaptive parametric kernel method called Weights Varying Least Squares Support Vector Regression (WV-LSSVR) is proposed in this paper, using the same kernel type and the same centers as the IRR-LSSVR. Furthermore, inspired by multikernel semiparametric support vector regression, the effect of kernel extension is investigated in a recursive regression framework, and a recursive kernel method called Gaussian Process Kernel Least Squares Support Vector Regression (GPK-LSSVR) is proposed using a compound kernel of the type recommended for Gaussian process regression. Numerical experiments on benchmark data sets confirm the validity and effectiveness of the presented algorithms. The WV-LSSVR algorithm achieves higher approximation accuracy than a recursive parametric kernel method whose centers are computed by k-means clustering. The extended recursive kernel method (i.e., GPK-LSSVR) shows no advantage in global approximation accuracy when the test data set is evaluated without real-time updates, but it can increase modeling accuracy when real-time identification is involved.

Bibliographic record

  • Source
    Neurocomputing | 2014, No. 26 | pp. 111-119 | 9 pages
  • Author affiliation

    Delft University of Technology, Control and Simulation Division, Faculty of Aerospace Engineering, Delft 2600GB, The Netherlands;

  • Indexed in: Science Citation Index (SCI); Engineering Index (EI)
  • Format: PDF
  • Language: English
  • Keywords

    Support vector machine; Multikernel; Recursive nonlinear identification; Adaptive global model; Kernel basis function;

