Journal: Neural Networks: The Official Journal of the International Neural Network Society

Fast Gaussian kernel learning for classification tasks based on specially structured global optimization



Abstract

For a practical pattern classification task solved by kernel methods, the computing time is spent mainly on kernel learning (or training). However, current kernel learning approaches rely on local optimization techniques and struggle to achieve good runtime performance, especially on large datasets, so existing algorithms do not extend easily to large-scale tasks. In this paper, we present a fast Gaussian kernel learning method that solves a specially structured global optimization (SSGO) problem. We optimize the Gaussian kernel function using a kernel target alignment criterion formulated as a difference of increasing (d.i.) functions. Through a power-transformation-based convexification method with a fixed power-transformation parameter, this criterion can be represented as a difference of convex (d.c.) functions, and the resulting programming problem can then be converted into an SSGO problem: globally minimizing a concave function over a convex set. The SSGO problem is classical and well solvable. To find the global optimal solution efficiently, we adopt the improved Hoffman's outer approximation method, which does not need to repeat the search from different starting points to locate the best local minimum. Moreover, the proposed method provably converges to the global solution for any classification task. We evaluate the proposed method on twenty benchmark datasets and compare it with four other Gaussian kernel learning methods. Experimental results show that the proposed method consistently achieves both good time efficiency and good classification performance.


