Neurocomputing

A pre-selecting base kernel method in multiple kernel learning


Abstract

The pre-defined base kernels greatly affect the performance of multiple kernel learning (MKL), yet there is still no theoretical guidance for selecting them. In practice, it is very difficult to choose a set of appropriate base kernels without prior knowledge. In this paper, we propose a general strategy for pre-selecting a reasonable set of base kernels before the optimization process of an MKL solver. The strategy combines the minimal-redundancy-maximal-relevance criterion with kernel target alignment (MRMRKA). First, we determine a set of candidate kernels while maintaining diversity of information; second, a set of base kernels with high discriminative ability and large diversity is selected using the MRMRKA method. These pre-selected base kernels are then used in the optimization process of existing MKL solvers to produce better results. Experiments conducted on the UCI and 15-scene datasets show that the performance of MKL is improved with the proposed pre-selection strategy. (C) 2015 Elsevier B.V. All rights reserved.
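The abstract only outlines the criterion, so the following is a minimal sketch of one plausible reading of it: kernel target alignment serves as the relevance term and kernel-kernel alignment as the redundancy term in a greedy mRMR-style selection. The function names (`centered_alignment`, `preselect_kernels`), the use of centered alignment, and the relevance-minus-redundancy score are assumptions for illustration, not the paper's exact formulation.

```python
import numpy as np

def centered_alignment(K1, K2):
    """Cosine similarity between two centered kernel matrices (kernel alignment)."""
    n = K1.shape[0]
    H = np.eye(n) - np.ones((n, n)) / n          # centering matrix
    K1c, K2c = H @ K1 @ H, H @ K2 @ H
    num = np.sum(K1c * K2c)                       # Frobenius inner product
    den = np.linalg.norm(K1c) * np.linalg.norm(K2c)
    return num / den if den > 0 else 0.0

def preselect_kernels(kernels, y, m):
    """Greedy mRMR-style pre-selection (hypothetical reading of MRMRKA):
    maximize alignment with the target kernel y y^T (relevance) minus the
    mean alignment with already selected kernels (redundancy)."""
    m = min(m, len(kernels))
    Ky = np.outer(y, y)                           # ideal kernel from labels in {-1, +1}
    relevance = [centered_alignment(K, Ky) for K in kernels]
    selected = [int(np.argmax(relevance))]        # start from the most relevant kernel
    while len(selected) < m:
        best, best_score = None, -np.inf
        for j in range(len(kernels)):
            if j in selected:
                continue
            redundancy = np.mean([centered_alignment(kernels[j], kernels[s])
                                  for s in selected])
            score = relevance[j] - redundancy     # high relevance, low redundancy
            if score > best_score:
                best, best_score = j, score
        selected.append(best)
    return selected

if __name__ == "__main__":
    # Toy usage: a candidate pool of RBF kernels over a grid of bandwidths
    # (a common, but here assumed, way to build the candidate set).
    rng = np.random.default_rng(0)
    X = rng.normal(size=(40, 5))
    y = np.sign(X[:, 0])                          # toy binary labels

    def rbf(X, gamma):
        sq = np.sum(X**2, axis=1)
        D = sq[:, None] + sq[None, :] - 2 * X @ X.T
        return np.exp(-gamma * D)

    pool = [rbf(X, g) for g in (0.01, 0.1, 1.0, 10.0)]
    print(preselect_kernels(pool, y, m=2))        # indices of pre-selected base kernels
```

The pre-selected kernels would then be handed to any existing MKL solver in place of the full candidate pool.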