Foundations of Computational Mathematics

Optimal rates for the regularized least-squares algorithm



Abstract

We develop a theoretical analysis of the performance of the regularized least-squares algorithm on a reproducing kernel Hilbert space in the supervised learning setting. The presented results hold in the general framework of vector-valued functions; therefore they can be applied to multitask problems. In particular, we observe that the concept of effective dimension plays a central role in the definition of a criterion for the choice of the regularization parameter as a function of the number of samples. Moreover, a complete minimax analysis of the problem is described, showing that the convergence rates obtained by regularized least-squares estimators are indeed optimal over a suitable class of priors defined by the considered kernel. Finally, we give an improved lower-rate result describing the worst asymptotic behavior on individual probability measures rather than over classes of priors.
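For orientation, the effective dimension invoked above is, in this line of work, typically defined through the covariance (integral) operator $T$ of the kernel with respect to the marginal distribution of the inputs; the sketch below states that standard definition, not a formula taken from this abstract:

\[
\mathcal{N}(\lambda) \;=\; \operatorname{Tr}\!\bigl[(T + \lambda I)^{-1}\, T\bigr], \qquad \lambda > 0 .
\]

Informally, $\mathcal{N}(\lambda)$ counts the kernel eigendirections whose eigenvalues are not negligible compared with $\lambda$, and it controls the variance term when tuning the regularization parameter as a function of the sample size.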
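To make the object of study concrete, here is a minimal NumPy sketch of the regularized least-squares (kernel ridge) estimator the abstract analyzes. The Gaussian kernel, its width, and the schedule `lam = 1/n` are illustrative assumptions of this sketch only; the paper's optimal choice of the regularization parameter depends on the prior class and the effective dimension.

```python
import numpy as np

def gaussian_kernel(X, Y, sigma=0.3):
    # Pairwise squared distances -> Gaussian (RBF) kernel matrix.
    d2 = (np.sum(X**2, axis=1)[:, None]
          + np.sum(Y**2, axis=1)[None, :]
          - 2.0 * X @ Y.T)
    return np.exp(-d2 / (2.0 * sigma**2))

def rls_fit(X, y, lam):
    # Regularized least squares in the RKHS:
    #   minimize (1/n) * sum_i (f(x_i) - y_i)^2 + lam * ||f||_H^2,
    # whose representer-theorem solution solves (K + n*lam*I) c = y.
    n = X.shape[0]
    K = gaussian_kernel(X, X)
    return np.linalg.solve(K + n * lam * np.eye(n), y)

def rls_predict(X_train, c, X_test):
    # f(x) = sum_i c_i * k(x, x_i)
    return gaussian_kernel(X_test, X_train) @ c

# Toy regression problem: noisy samples of a smooth target function.
rng = np.random.default_rng(0)
n = 200
X = rng.uniform(-1.0, 1.0, size=(n, 1))
y = np.sin(3.0 * X[:, 0]) + 0.1 * rng.standard_normal(n)

# Hypothetical sample-size-dependent choice, for illustration only.
lam = 1.0 / n
c = rls_fit(X, y, lam)

X_test = np.linspace(-1.0, 1.0, 50)[:, None]
y_hat = rls_predict(X, c, X_test)
rmse = np.sqrt(np.mean((y_hat - np.sin(3.0 * X_test[:, 0])) ** 2))
print(rmse)
```

The point of the sketch is the structure of the estimator: the regularization parameter multiplies the sample size in the linear system, so its decay rate in `n` is exactly the tuning knob the abstract's criterion addresses.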


