Journal of Approximation Theory

Approximation by multivariate Bernstein-Durrmeyer operators and learning rates of least-squares regularized regression with multivariate polynomial kernels


Abstract

In this paper, we establish error bounds for approximation by multivariate Bernstein-Durrmeyer operators in $L^p_{\rho_X}$ ($1 \le p < \infty$) with respect to a general Borel probability measure $\rho_X$ on a simplex $X \subset \mathbb{R}^n$. By the error bounds, we provide convergence rates of type $O(m^{-\gamma})$ with some $\gamma > 0$ for the least-squares regularized regression algorithm associated with a multivariate polynomial kernel (where $m$ is the sample size). The learning rates depend on the space dimension $n$ and the capacity of the reproducing kernel Hilbert space generated by the polynomial kernel.
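For orientation, a sketch of the standard least-squares regularized regression scheme the abstract refers to, written in its usual form; the specific polynomial kernel convention shown here (degree $d$, bias term $1$) is an assumption not stated in this abstract:
$$
f_{z,\lambda} = \arg\min_{f \in \mathcal{H}_K} \left\{ \frac{1}{m} \sum_{i=1}^{m} \big(f(x_i) - y_i\big)^2 + \lambda \|f\|_K^2 \right\}, \qquad K(x,t) = (1 + x \cdot t)^d,
$$
where $z = \{(x_i, y_i)\}_{i=1}^m$ is the sample drawn from $\rho$ on $X \times Y$, $\lambda > 0$ is the regularization parameter, and $\mathcal{H}_K$ is the reproducing kernel Hilbert space generated by the polynomial kernel $K$. In this setting the learning rates $O(m^{-\gamma})$ typically bound the distance between $f_{z,\lambda}$ and the regression function in the $L^p_{\rho_X}$ norm.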
