
Fast Rates for Support Vector Machines



Abstract

We establish learning rates to the Bayes risk for support vector machines with hinge loss (L1-SVMs). Since a theorem of Devroye states that no learning algorithm can learn with a uniform rate to the Bayes risk for all probability distributions, we have to restrict the class of distributions considered: in order to obtain fast rates we assume a noise condition recently proposed by Tsybakov and an approximation condition formulated in terms of the distribution and the reproducing kernel Hilbert space used by the L1-SVM. For Gaussian RBF kernels with varying widths we propose a geometric noise assumption on the distribution which ensures the approximation condition. This geometric assumption is not a smoothness condition; rather, it describes the concentration of the marginal distribution near the decision boundary. In particular, we are able to describe nontrivial classes of distributions for which L1-SVMs using a Gaussian kernel can learn at an almost linear rate. We use various new and recently introduced techniques to establish our results: the analysis of the estimation error is based on Talagrand's concentration inequality and local Rademacher averages. We furthermore develop a shrinking technique which allows us to control the typical size of the norm of the L1-SVM solution. It turns out that the above-mentioned approximation assumption has a crucial impact on both the application of Talagrand's inequality and the shrinking technique. Moreover, for Gaussian kernels we develop a smoothing technique which allows us to treat the approximation error in a way directly linked to the classification problem. Finally, we prove some new bounds on covering numbers related to Gaussian RBF kernels.
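As a reader's aid (these formulas are standard textbook definitions, not quoted from the paper), the main objects the abstract invokes can be stated as follows: the hinge loss used by L1-SVMs, the Gaussian RBF kernel in one common parameterization, and Tsybakov's noise condition with exponent q, where eta(x) = P(Y = 1 | X = x) is the regression function.

```latex
% Hinge loss used by the L1-SVM (labels y \in \{-1, +1\}):
L(y, f(x)) = \max\{0,\ 1 - y f(x)\}

% Gaussian RBF kernel with width \sigma > 0 (one common convention):
k_\sigma(x, x') = \exp\left(-\|x - x'\|^2 / \sigma^2\right)

% Tsybakov noise condition with exponent q \ge 0, for all t > 0:
P_X\left(\{x : |2\eta(x) - 1| \le t\}\right) \le C\, t^q
```

As a purely illustrative sketch (not the paper's experiments; the dataset, width grid, and regularization value below are hypothetical choices), scikit-learn's SVC solves the hinge-loss L1-SVM and accepts a Gaussian RBF kernel whose width is controlled by gamma; the loop over gamma mimics, at a fixed sample size, the varying kernel widths the abstract studies.

```python
# Minimal sketch: hinge-loss SVMs (L1-SVMs) with Gaussian RBF kernels
# of several widths. Dataset and parameter grid are hypothetical.
from sklearn.datasets import make_moons
from sklearn.model_selection import train_test_split
from sklearn.svm import SVC

X, y = make_moons(n_samples=500, noise=0.2, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

# Gaussian RBF kernel k(x, x') = exp(-gamma * ||x - x'||^2);
# smaller gamma corresponds to a wider kernel.
for gamma in [0.1, 1.0, 10.0]:
    clf = SVC(kernel="rbf", gamma=gamma, C=1.0)  # SVC minimizes the hinge loss
    clf.fit(X_tr, y_tr)
    print(f"gamma={gamma:5.1f}  test accuracy={clf.score(X_te, y_te):.3f}")
```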
