
Localized algorithms for multiple kernel learning


Abstract

Instead of selecting a single kernel, multiple kernel learning (MKL) uses a weighted sum of kernels in which the weight of each kernel is optimized during training. Such methods assign the same weight to a kernel over the whole input space; here we discuss localized multiple kernel learning (LMKL), which couples a kernel-based learning algorithm with a parametric gating model that assigns local weights to the kernel functions. The two components are trained jointly using a two-step alternating optimization algorithm. Empirical results on benchmark classification and regression data sets validate the applicability of the approach. On classification problems with different feature representations, LMKL achieves higher accuracy than canonical MKL. In image recognition problems, LMKL can also identify the relevant parts of images by using the gating model as a saliency detector. In regression tasks, LMKL either improves performance significantly or reduces model complexity by storing significantly fewer support vectors.
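To make the idea of local kernel weights concrete, the following is a minimal NumPy sketch, not the authors' implementation: it assumes a linear softmax gating model eta_m(x) with fixed (untrained) parameters and only shows how a locally combined kernel matrix of the form K_eta[i, j] = sum_m eta_m(x_i) * K_m(x_i, x_j) * eta_m(x_j) is assembled. Function and variable names such as gating and locally_combined_kernel are illustrative; in the actual method the gating parameters and the kernel machine are trained jointly by the two-step alternating optimization mentioned above.

import numpy as np

def rbf_kernel(X1, X2, gamma=1.0):
    # Gaussian (RBF) kernel matrix between two sets of samples.
    sq = (np.sum(X1**2, axis=1)[:, None]
          + np.sum(X2**2, axis=1)[None, :]
          - 2.0 * X1 @ X2.T)
    return np.exp(-gamma * sq)

def linear_kernel(X1, X2):
    # Linear kernel matrix.
    return X1 @ X2.T

def gating(X, V, v0):
    # Softmax gating model: eta_m(x) proportional to exp(V[m] @ x + v0[m]).
    # Returns an (n_samples, n_kernels) matrix of local kernel weights.
    scores = X @ V.T + v0
    scores -= scores.max(axis=1, keepdims=True)  # numerical stability
    e = np.exp(scores)
    return e / e.sum(axis=1, keepdims=True)

def locally_combined_kernel(X1, X2, kernels, V, v0):
    # K_eta[i, j] = sum_m eta_m(x1_i) * K_m(x1_i, x2_j) * eta_m(x2_j)
    eta1 = gating(X1, V, v0)
    eta2 = gating(X2, V, v0)
    K = np.zeros((X1.shape[0], X2.shape[0]))
    for m, k in enumerate(kernels):
        K += eta1[:, m:m + 1] * k(X1, X2) * eta2[:, m:m + 1].T
    return K

# Toy usage: two kernels on random data with arbitrary (untrained) gating parameters.
rng = np.random.default_rng(0)
X = rng.standard_normal((20, 5))
kernels = [linear_kernel, lambda a, b: rbf_kernel(a, b, gamma=0.5)]
V = rng.standard_normal((len(kernels), X.shape[1]))  # gating weights (hypothetical values)
v0 = np.zeros(len(kernels))                          # gating biases
K_eta = locally_combined_kernel(X, X, kernels, V, v0)
print(K_eta.shape)  # (20, 20): usable wherever a single kernel matrix would be

Because the weights eta_m(x) depend on the input, different regions of the input space can emphasize different kernels, which is what allows the gating model to act as a saliency detector in image recognition and to trade off accuracy against the number of stored support vectors in regression.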

