International Conference on Intelligent Computation Technology and Automation

Improved Sparse Least Square Support Vector Machines for the Function Estimation



Abstract

Least squares support vector machines (LS-SVMs) are regarded as good methods for classification and function estimation. Compared with standard support vector machines (SVMs), a drawback is that sparseness is lost in LS-SVMs. Sparseness is imposed by omitting the less important data points during training and retraining on the remaining data; this iterative retraining requires more intensive computation than training a non-sparse LS-SVM. In this paper, we describe a new pruning algorithm for LS-SVMs: the width of an ε-insensitive zone is introduced into the training process; in addition, the number of pruned points is adjusted according to the training performance rather than being fixed at a set percentage of the training data; furthermore, cross training is applied during training. The performance of the improved LS-SVM pruning algorithm, in terms of computational cost and regression accuracy, is demonstrated by several experiments on the same chaotic time series data sets.
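The omit-and-retrain pruning scheme the abstract describes can be sketched minimally in NumPy. This is a hedged illustration, not the paper's method: the kernel, the hyper-parameters, and the rule "drop points whose |α| falls inside an ε-insensitive zone" are illustrative assumptions, and the paper's adaptive pruning amount and cross training are not reproduced here.

```python
import numpy as np

def rbf_kernel(X1, X2, sigma=1.0):
    # Gaussian RBF kernel matrix between two point sets
    d2 = ((X1[:, None, :] - X2[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2 * sigma ** 2))

def train_lssvm(X, y, gamma=10.0, sigma=1.0):
    # LS-SVM regression reduces to one linear system:
    #   [0  1^T          ] [b]     [0]
    #   [1  K + I/gamma  ] [alpha] = [y]
    n = len(y)
    A = np.zeros((n + 1, n + 1))
    A[0, 1:] = 1.0
    A[1:, 0] = 1.0
    A[1:, 1:] = rbf_kernel(X, X, sigma) + np.eye(n) / gamma
    sol = np.linalg.solve(A, np.concatenate(([0.0], y)))
    return sol[0], sol[1:]  # bias b, support values alpha

def prune_lssvm(X, y, eps=0.05, max_rounds=10, gamma=10.0, sigma=1.0):
    # Iteratively omit points whose |alpha_i| lies inside an
    # epsilon-insensitive zone (relative to the largest |alpha|),
    # then retrain on the remaining data.
    idx = np.arange(len(y))
    for _ in range(max_rounds):
        b, alpha = train_lssvm(X[idx], y[idx], gamma, sigma)
        keep = np.abs(alpha) > eps * np.abs(alpha).max()
        if keep.all() or keep.sum() < 5:  # stop if nothing to prune or too few left
            break
        idx = idx[keep]
    b, alpha = train_lssvm(X[idx], y[idx], gamma, sigma)
    return idx, alpha, b

# Toy usage: fit a noisy sine curve, then prune to a sparse model.
rng = np.random.default_rng(0)
X = np.linspace(0, 2 * np.pi, 60)[:, None]
y = np.sin(X[:, 0]) + 0.05 * rng.standard_normal(60)
idx, alpha, b = prune_lssvm(X, y)
print(len(idx), "support vectors kept out of", len(y))
```

Unlike standard SVMs, every training point receives a nonzero α in the plain LS-SVM solution, which is why sparseness must be imposed afterwards by this kind of pruning loop.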


