Journal: 计算机应用研究 (Application Research of Computers)

Generalized Sparse Multiple Kernel Learning

Abstract

The L1-norm multiple kernel learning (MKL) method produces a sparse solution for the kernel weights, which may discard useful information and degrade generalization performance; the Lp-norm (p > 1) MKL method produces a non-sparse solution, which may retain much redundant information and is sensitive to noise. This paper proposes a generalized sparse MKL (GSMKL) method that imposes an elastic-net-type constraint on the kernel weights, i.e., a mixed constraint combining the L1-norm and the Lp-norm (p > 1). The method can not only adjust the sparsity of the solution flexibly but also encourages a grouping effect on the kernel weights, and both L1-norm MKL and Lp-norm MKL can be regarded as special cases of it. Because the mixed constraint is nonlinear, the method approximates it by a second-order Taylor expansion and solves the resulting optimization problem by semi-infinite programming (SIP). Experimental results show that, while allowing the sparsity to be adjusted dynamically, the improved method achieves good classification performance and supports the grouping effect, which verifies that it is effective and feasible.
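To make the elastic-net-type constraint concrete, the following is a minimal sketch of how a weighted kernel combination and the mixed L1/Lp penalty on the kernel weights can be expressed. This is an illustration only, not the paper's GSMKL algorithm: the function names, the λ1/λ2 trade-off parameters, and the toy RBF kernels are all assumptions for demonstration, and the sketch omits the second-order Taylor approximation and SIP solver entirely.

```python
import numpy as np

def elastic_net_penalty(d, lam1=1.0, lam2=1.0, p=2):
    """Mixed L1/Lp penalty on the kernel weights d (lam1, lam2 are
    illustrative trade-off parameters). The L1 term encourages sparsity;
    the Lp term with p > 1 encourages a grouping effect on the weights."""
    d = np.asarray(d, dtype=float)
    return lam1 * np.sum(np.abs(d)) + lam2 * np.sum(np.abs(d) ** p)

def combine_kernels(kernels, d):
    """Weighted combination K = sum_m d_m * K_m of base Gram matrices."""
    return sum(w * K_m for w, K_m in zip(d, kernels))

def rbf_gram(X, gamma):
    """Toy RBF Gram matrix on 1-D inputs: K_ij = exp(-gamma * (x_i - x_j)^2)."""
    sq = (X - X.T) ** 2
    return np.exp(-gamma * sq)

# Two base kernels with different bandwidths on three sample points.
X = np.array([[0.0], [1.0], [2.0]])
kernels = [rbf_gram(X, 0.5), rbf_gram(X, 2.0)]

d = np.array([0.7, 0.3])              # kernel weights
K = combine_kernels(kernels, d)       # combined 3x3 Gram matrix
penalty = elastic_net_penalty(d, lam1=1.0, lam2=1.0, p=2)
```

Setting lam2 = 0 recovers a pure L1 penalty (sparse weights), and lam1 = 0 recovers a pure Lp penalty (non-sparse weights), mirroring how the paper treats L1-norm MKL and Lp-norm MKL as special cases of the mixed constraint.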
