
Localized Multiple Kernel Regression



Abstract

Multiple kernel learning (MKL) uses a weighted combination of kernels where the weight of each kernel is optimized during training. However, MKL assigns the same weight to a kernel over the whole input space. Our main objective is the formulation of the localized multiple kernel learning (LMKL) framework, which allows kernels to be combined with different weights in different regions of the input space by using a gating model. In this paper, we apply the LMKL framework to regression estimation and derive a learning algorithm for this extension. Canonical support vector regression may overfit unless the kernel parameters are selected appropriately; we see that even if we provide more kernels than necessary, LMKL uses only as many as needed and does not overfit, owing to its inherent regularization.
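The localized combination described in the abstract can be sketched as follows: each base kernel k_m is weighted by a gating function eta_m(x) that varies over the input space, giving a combined kernel K(x_i, x_j) = sum_m eta_m(x_i) k_m(x_i, x_j) eta_m(x_j). This is a minimal illustration, assuming a softmax gating model over linear gating parameters; the parameter values here are hypothetical placeholders, not values fitted by the paper's learning algorithm.

```python
import numpy as np

def linear_kernel(X, Y):
    # Base kernel 1: plain inner products.
    return X @ Y.T

def rbf_kernel(X, Y, gamma=1.0):
    # Base kernel 2: Gaussian (RBF) kernel on squared distances.
    d2 = ((X[:, None, :] - Y[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2)

def gating(X, V, b):
    # Softmax gating model: eta[i, m] is the weight of kernel m at point X[i].
    scores = X @ V + b
    scores -= scores.max(axis=1, keepdims=True)  # numerical stability
    e = np.exp(scores)
    return e / e.sum(axis=1, keepdims=True)

def localized_kernel(X, kernels, V, b):
    # K[i, j] = sum_m eta_m(x_i) * k_m(x_i, x_j) * eta_m(x_j),
    # so each kernel's influence depends on where both points lie.
    eta = gating(X, V, b)
    K = np.zeros((len(X), len(X)))
    for m, k in enumerate(kernels):
        K += np.outer(eta[:, m], eta[:, m]) * k(X, X)
    return K

rng = np.random.default_rng(0)
X = rng.normal(size=(5, 2))
V = rng.normal(size=(2, 2))  # hypothetical gating parameters
b = np.zeros(2)
K = localized_kernel(X, [linear_kernel, rbf_kernel], V, b)
print(K.shape)  # (5, 5)
```

Because each term is D_m K_m D_m with D_m = diag(eta_m), the combined matrix stays symmetric positive semidefinite whenever the base kernels are, so it can be plugged into a standard support vector regression solver.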

