International Journal of Wavelets, Multiresolution and Information Processing

Convergence rate of SVM for kernel-based robust regression


Abstract

It is known that, to alleviate the performance deterioration caused by outliers, robust support vector (SV) regression has been proposed; it is essentially a convex optimization problem associated with a non-convex loss function. The theoretical analysis of its performance cannot be completed by the usual convex analysis approach. For a robust SV regression algorithm containing two homotopy parameters, a non-convex method is developed with quasiconvex analysis theory, and the error estimate is given. An explicit convergence rate is provided, and the degree to which outliers affect the performance is shown quantitatively.
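For orientation only (a generic sketch; the specific non-convex loss with two homotopy parameters analyzed in the paper is not reproduced in this abstract), kernel-based robust SV regression can be stated as regularized empirical risk minimization over a reproducing kernel Hilbert space \mathcal{H}_K with a bounded robust loss \ell:

    f_{\mathbf{z}} = \arg\min_{f \in \mathcal{H}_K} \frac{1}{m} \sum_{i=1}^{m} \ell\bigl(y_i - f(x_i)\bigr) + \lambda \|f\|_{\mathcal{H}_K}^{2},

where \mathbf{z} = \{(x_i, y_i)\}_{i=1}^{m} is the training sample and \lambda > 0 is the regularization parameter. A bounded loss \ell is non-convex, so the error estimate and convergence rate for f_{\mathbf{z}} cannot be derived by the standard convex analysis; this is the gap that the quasiconvex analysis addresses.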

