Deriving the Kernel from Training Data


Abstract

In this paper we propose a strategy for constructing data-driven kernels, automatically determined by the training examples. Basically, their associated Reproducing Kernel Hilbert Spaces arise from finite sets of linearly independent functions, that can be interpreted as weak classifiers or regressors, learned from training material. When working in the Tikhonov regularization framework, the unique free parameter to be optimized is the regularizer, representing a trade-off between empirical error and smoothness of the solution. A generalization error bound based on Rademacher complexity is provided, yielding the potential for controlling overfitting.
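The construction sketched in the abstract can be illustrated concretely. Below is a minimal, hypothetical sketch (not the authors' implementation): a handful of weak regressors (decision stumps with thresholds taken from the training inputs) are learned from the data, the kernel is defined as K(a, b) = Σᵢ fᵢ(a) fᵢ(b) so that the weak learners span the associated RKHS, and the only parameter left to tune is the Tikhonov regularizer λ.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy 1-D regression data (hypothetical example, not from the paper).
X = rng.uniform(-1, 1, size=(40, 1))
y = np.sin(3 * X[:, 0]) + 0.1 * rng.normal(size=40)

# Weak regressors derived from the training set: decision stumps whose
# thresholds sit at a subset of the training points.
thresholds = X[::5, 0]

def features(Z):
    # Each column is one weak regressor (a step function) evaluated on Z.
    return (Z[:, :1] > thresholds).astype(float)

def kernel(A, B):
    # Data-driven kernel: K(a, b) = sum_i f_i(a) * f_i(b).
    return features(A) @ features(B).T

# Kernel ridge regression in the Tikhonov framework: the regularizer
# lam is the single free parameter trading empirical error for smoothness.
lam = 0.1
K = kernel(X, X)
alpha = np.linalg.solve(K + lam * np.eye(len(X)), y)

def predict(Z):
    return kernel(Z, X) @ alpha

print(predict(X[:3]))
```

Because the feature map is finite-dimensional, the resulting RKHS is exactly the span of the weak regressors, and increasing λ shrinks the solution toward zero, which is the lever the Rademacher-complexity bound in the paper is meant to control.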
