Journal of Machine Learning Research

Sobolev Norm Learning Rates for Regularized Least-Squares Algorithms


Abstract

Learning rates for least-squares regression are typically expressed in terms of $L_2$-norms. In this paper we extend these rates to norms stronger than the $L_2$-norm, without requiring the regression function to be contained in the hypothesis space. In the special case of Sobolev reproducing kernel Hilbert spaces used as hypothesis spaces, these stronger norms coincide with fractional Sobolev norms between the used Sobolev space and $L_2$. As a consequence, not only the target function itself but also some of its derivatives can be estimated without changing the algorithm. From a technical point of view, we combine the well-known integral operator techniques with an embedding property, which so far has been used only in combination with empirical process arguments. This combination results in new finite-sample bounds with respect to the stronger norms, from which our rates follow easily. Finally, we prove the asymptotic optimality of our results in many cases.
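The "fractional Sobolev norms between the used Sobolev space and $L_2$" can be read as interpolation norms. A sketch, under the standard real-interpolation identification (the exact interpolation scale and measure used in the paper are an assumption here):

\[
\bigl[L_2(\nu),\, H^m\bigr]_{\theta,2} \;\cong\; H^{\theta m},
\qquad 0 < \theta < 1,
\]

so a learning rate in the $H^{\theta m}$-norm for every $\theta$ in a suitable range interpolates between the usual $L_2$-rate ($\theta = 0$) and a bound in the full RKHS norm ($\theta = 1$); for $\theta m > k + 1/2$ in one dimension, convergence in $H^{\theta m}$ controls the first $k$ derivatives uniformly via the Sobolev embedding.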
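The claim that derivatives "can be estimated without changing the algorithm" can be illustrated concretely: the regularized least-squares estimate is a kernel expansion $\hat f(x) = \sum_i \alpha_i k(X_i, x)$, so differentiating the (fixed) kernel in its second argument yields a derivative estimate from the same coefficients $\alpha$. A minimal sketch with a Gaussian kernel; the bandwidth, regularization parameter, and test function are illustrative choices, not values from the paper:

```python
import numpy as np

def gauss_kernel(x, y, s=0.1):
    # Gaussian kernel matrix k(x_i, y_j) = exp(-(x_i - y_j)^2 / (2 s^2)).
    return np.exp(-(x[:, None] - y[None, :]) ** 2 / (2 * s ** 2))

def gauss_kernel_dx(x, y, s=0.1):
    # Entrywise derivative of the kernel with respect to its first argument.
    return -(x[:, None] - y[None, :]) / s ** 2 * gauss_kernel(x, y, s)

rng = np.random.default_rng(0)
n, lam = 200, 1e-3
X = rng.uniform(0.0, 1.0, n)
Y = np.sin(2 * np.pi * X) + 0.1 * rng.standard_normal(n)

# Regularized least squares in the RKHS: alpha = (K + n*lam*I)^{-1} Y.
K = gauss_kernel(X, X)
alpha = np.linalg.solve(K + n * lam * np.eye(n), Y)

# Evaluate the SAME coefficient vector with k and with its derivative.
xs = np.linspace(0.1, 0.9, 5)              # away from the boundary
f_hat = gauss_kernel(xs, X) @ alpha        # estimate of f
df_hat = gauss_kernel_dx(xs, X) @ alpha    # estimate of f'
```

Nothing about the training step changes: only the feature map used at evaluation time is swapped from $k(\cdot, x)$ to $\partial_x k(\cdot, x)$, which is exactly why convergence in a norm stronger than $L_2$ transfers to the derivative estimates for free.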
