Neurocomputing
Error analysis of regularized least-square regression with Fredholm kernel

Abstract

Learning with the Fredholm kernel has attracted increasing attention recently, since it can effectively utilize the information in the data to improve prediction performance. Despite rapid progress on theoretical and experimental evaluations, its generalization analysis has not been explored in the learning theory literature. In this paper, we establish the generalization bound of least-square regularized regression with the Fredholm kernel, which implies that the fast learning rate O(l^{-1}) can be reached under mild conditions (l is the number of labeled samples). Simulated examples show that this Fredholm regression algorithm can achieve satisfactory prediction performance. (C) 2017 Elsevier B.V. All rights reserved.
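The abstract does not spell out the estimator, but the following Python sketch illustrates the general recipe behind regularized least-square regression with a Fredholm kernel, under assumed choices that are not stated in the abstract: Gaussian base kernels, the commonly used data-dependent construction K_F(x, x') = (1/u^2) Σ_{i,j} k(x, z_i) k̂(z_i, z_j) k(z_j, x') over points z_1, …, z_u (labeled plus unlabeled), and a ridge-type penalty λ. The function names (fredholm_gram, fit_rls, predict) and parameter values are illustrative only, not the paper's.

```python
import numpy as np

def rbf_kernel(A, B, gamma=1.0):
    """Gaussian (RBF) kernel matrix between row-sample matrices A and B."""
    sq = np.sum(A**2, 1)[:, None] + np.sum(B**2, 1)[None, :] - 2 * A @ B.T
    return np.exp(-gamma * sq)

def fredholm_gram(X1, X2, Z, gamma_out=1.0, gamma_in=1.0):
    """Assumed Fredholm kernel Gram matrix:
    K_F(x, x') = (1/u^2) * sum_{i,j} k(x, z_i) k_hat(z_i, z_j) k(z_j, x'),
    built from (possibly unlabeled) points Z = {z_1, ..., z_u}."""
    u = Z.shape[0]
    K1 = rbf_kernel(X1, Z, gamma_out)      # k(x, z_i)
    K_hat = rbf_kernel(Z, Z, gamma_in)     # k_hat(z_i, z_j)
    K2 = rbf_kernel(Z, X2, gamma_out)      # k(z_j, x')
    return (K1 @ K_hat @ K2) / u**2

def fit_rls(X, y, Z, lam=1e-2, **kw):
    """Regularized least squares in the RKHS of the Fredholm kernel:
    alpha = (K + lam * l * I)^{-1} y, with K the l x l Gram matrix
    over the l labeled samples."""
    l = X.shape[0]
    K = fredholm_gram(X, X, Z, **kw)
    return np.linalg.solve(K + lam * l * np.eye(l), y)

def predict(Xtest, X, Z, alpha, **kw):
    """f(x) = sum_j alpha_j K_F(x, x_j)."""
    return fredholm_gram(Xtest, X, Z, **kw) @ alpha

# Toy usage: l labeled points, plus extra unlabeled points that refine the kernel.
rng = np.random.default_rng(0)
X = rng.uniform(-3, 3, (50, 1))
y = np.sin(X[:, 0]) + 0.1 * rng.standard_normal(50)
Z = np.vstack([X, rng.uniform(-3, 3, (200, 1))])   # labeled + unlabeled samples
alpha = fit_rls(X, y, Z, lam=1e-3)
Xtest = np.linspace(-3, 3, 5)[:, None]
print(predict(Xtest, X, Z, alpha))
```

The point of the construction is that the Gram matrix, and hence the regression function, depends on the unlabeled points Z as well as the labeled pairs (X, y); the paper's fast-rate result concerns how the excess risk of such a regularized estimator decays as the number of labeled samples l grows.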