2010 International Conference on Intelligent Computation Technology and Automation

Improved Sparse Least Square Support Vector Machines for the Function Estimation

Abstract

Least squares support vector machines (LS-SVMs) are regarded as effective methods for classification and function estimation. Compared with standard support vector machines (SVMs), a drawback is that sparseness is lost in LS-SVMs. Sparseness is imposed by omitting the less important data points during training and retraining on the remaining data. Iterative retraining requires more intensive computation than training a non-sparse LS-SVM. In this paper, we describe a new pruning algorithm for LS-SVMs: the width of an ε-insensitive zone is introduced into the training process; in addition, the number of pruned points is adjusted according to the training performance rather than being fixed at a set percentage of the training data; furthermore, cross training is applied during training. The performance of the improved LS-SVM pruning algorithm, in terms of computational cost and regression accuracy, is demonstrated by several experiments on the same chaotic time series data sets.
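The iterative prune-and-retrain scheme that the abstract builds on can be sketched as follows. This is a minimal illustration of the baseline approach only: train an LS-SVM regressor by solving its linear system, drop the points with the smallest |α_i|, and retrain on the rest. The paper's refinements (the ε-insensitive zone, the performance-adaptive pruning amount, and cross training) are not reproduced here, and all function names and parameter values are our own assumptions, not the authors' implementation.

```python
import numpy as np

def rbf_kernel(X1, X2, sigma=1.0):
    """Gaussian RBF kernel matrix between two sample sets."""
    d2 = ((X1[:, None, :] - X2[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2.0 * sigma ** 2))

def lssvm_train(X, y, gamma=10.0, sigma=1.0):
    """Solve the LS-SVM regression dual linear system:
       [[0, 1^T], [1, K + I/gamma]] [b; alpha] = [0; y]."""
    n = len(y)
    K = rbf_kernel(X, X, sigma)
    A = np.zeros((n + 1, n + 1))
    A[0, 1:] = 1.0
    A[1:, 0] = 1.0
    A[1:, 1:] = K + np.eye(n) / gamma
    rhs = np.concatenate(([0.0], y))
    sol = np.linalg.solve(A, rhs)
    return sol[0], sol[1:]          # bias b, support values alpha

def lssvm_predict(Xq, X, alpha, b, sigma=1.0):
    return rbf_kernel(Xq, X, sigma) @ alpha + b

def prune_lssvm(X, y, frac=0.05, rounds=5, gamma=10.0, sigma=1.0):
    """Baseline pruning: repeatedly retrain and drop the frac of
       points with the smallest |alpha_i| (least important)."""
    idx = np.arange(len(y))
    for _ in range(rounds):
        b, alpha = lssvm_train(X[idx], y[idx], gamma, sigma)
        k = max(1, int(frac * len(idx)))
        keep = np.argsort(np.abs(alpha))[k:]   # discard k smallest
        idx = idx[np.sort(keep)]
    b, alpha = lssvm_train(X[idx], y[idx], gamma, sigma)
    return idx, alpha, b

# Usage: sparsify an LS-SVM fit of a smooth function.
X = np.linspace(0.0, 2.0 * np.pi, 60).reshape(-1, 1)
y = np.sin(X).ravel()
idx, alpha, b = prune_lssvm(X, y)
pred = lssvm_predict(X, X[idx], alpha, b)
```

Each pruning round re-solves the full linear system on the surviving points; this repeated retraining is exactly the computational burden the paper's adaptive pruning amount is meant to reduce.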
