Neural Computing & Applications

A method to sparsify the solution of support vector regression


Abstract

Although the solution of the support vector machine is relatively sparse, it makes unnecessarily liberal use of basis functions, since the number of support vectors required typically grows linearly with the size of the training set. In this paper, we present a simple post-processing method to sparsify the solution of support vector regression (SVR). The main idea is as follows: first, we train an SVR machine on the full training set; then we train another SVR machine only on a subset of the full training set, with modified target values. This process is repeated iteratively several times. Experiments indicate that the proposed method can greatly reduce the number of support vectors while maintaining the good generalization capacity of SVR.

Keywords: Support vector regression (SVR), Sparseness, RVM, Adaptive sparse supervised learning (ASSL), KLASSO
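The abstract only outlines the procedure, so the snippet below is a minimal Python sketch of the idea using scikit-learn's SVR: train on the full training set, then retrain on a reduced subset with modified target values, repeated a few times. The specific choices here (keeping the previous model's support vectors as the subset and using its predictions as the modified targets) are illustrative assumptions, not necessarily the paper's exact rules.

```python
# Minimal sketch of the iterative post-processing idea, assuming the subset
# is the previous model's support vectors and the modified targets are the
# previous model's predictions (illustrative assumptions only).
import numpy as np
from sklearn.svm import SVR

def sparsify_svr(X, y, n_iter=3, **svr_params):
    """Iteratively retrain an SVR on a shrinking subset to reduce support vectors."""
    model = SVR(**svr_params).fit(X, y)      # step 1: train on the full training set
    X_sub = X
    for _ in range(n_iter):                  # step 2: repeat a few times
        sv = model.support_                  # indices of the current support vectors
        X_sub = X_sub[sv]                    # subset of the training inputs
        y_sub = model.predict(X_sub)         # modified targets: previous model's outputs
        model = SVR(**svr_params).fit(X_sub, y_sub)
    return model

# Example usage on synthetic 1-D data
rng = np.random.default_rng(0)
X = np.sort(rng.uniform(-3, 3, size=(200, 1)), axis=0)
y = np.sinc(X).ravel() + 0.05 * rng.standard_normal(200)

full = SVR(C=10.0, epsilon=0.01).fit(X, y)
sparse = sparsify_svr(X, y, n_iter=3, C=10.0, epsilon=0.01)
print("support vectors: full =", len(full.support_),
      "sparsified =", len(sparse.support_))
```

In this sketch, the number of support vectors can only shrink or stay the same at each iteration, since each retraining sees only the previous model's support vectors; how aggressively sparsity is traded against accuracy depends on the SVR hyperparameters (notably epsilon and C).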
