Journal of Multivariate Analysis: An International Journal

Asymptotic normality of support vector machine variants and other regularized kernel methods


Abstract

In nonparametric classification and regression problems, regularized kernel methods, and in particular support vector machines, attract much attention in theoretical and applied statistics. In an abstract sense, regularized kernel methods (simply called SVMs here) can be seen as regularized M-estimators for a parameter in a (typically infinite-dimensional) reproducing kernel Hilbert space. For smooth loss functions L, it is shown that the difference between the estimator, i.e. the empirical SVM f_{L,D_n,λ_{D_n}}, and the theoretical SVM f_{L,P,λ_0} is asymptotically normal with rate √n. That is, √n(f_{L,D_n,λ_{D_n}} − f_{L,P,λ_0}) converges weakly to a Gaussian process in the reproducing kernel Hilbert space. As is common in real applications, the choice of the regularization parameter λ_{D_n} in f_{L,D_n,λ_{D_n}} may depend on the data. The proof proceeds by an application of the functional delta-method and by showing that the SVM functional P ↦ f_{L,P,λ_0} is suitably Hadamard-differentiable.
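The delta-method step mentioned in the abstract can be sketched as follows. This is a hedged outline only: the notation S for the SVM functional, the derivative S'_P, and the empirical-process limit 𝔾 are illustrative names not given in the abstract, and the covariance structure of the Gaussian limit is left abstract.

```latex
% Sketch of the functional delta-method argument (notation assumed for
% illustration; only the conclusion is stated in the abstract).
% Let S denote the SVM functional mapping a distribution to the RKHS element
\[
  S(P) := f_{L,P,\lambda_0} \in H ,
\]
% where H is the reproducing kernel Hilbert space. Let D_n denote the
% empirical distribution of the data. If S is Hadamard-differentiable at P
% with derivative S'_P, the functional delta-method yields
\[
  \sqrt{n}\,\bigl(f_{L,D_n,\lambda_{D_n}} - f_{L,P,\lambda_0}\bigr)
  \;=\; \sqrt{n}\,\bigl(S(D_n) - S(P)\bigr) + o_P(1)
  \;\rightsquigarrow\; S'_P(\mathbb{G}),
\]
% where G is the Gaussian (P-Brownian-bridge) weak limit of
% \sqrt{n}(D_n - P). Since S'_P is linear and continuous, the limit
% S'_P(\mathbb{G}) is a zero-mean Gaussian process in H.
```

The data-dependent choice λ_{D_n} of the regularization parameter is handled by requiring it to converge suitably to the deterministic value λ_0, so that it contributes only the o_P(1) term above.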
