
Evolutionary learning with kernels

Abstract

In this paper we embed evolutionary computation into statistical learning theory. First, we outline the connection between large-margin optimization and statistical learning, and explain why this paradigm succeeds on many pattern recognition problems. We then embed evolutionary computation into the most prominent representative of this class of learning methods, the Support Vector Machine (SVM). In contrast to earlier applications of evolutionary algorithms to SVMs, we do not merely optimize the method or kernel parameters. Instead, we use both evolution strategies and particle swarm optimization to solve the posed constrained optimization problem directly. Transforming the problem into its Wolfe dual reduces the total runtime and allows the use of kernel functions. Exploiting knowledge about this optimization problem leads to a hybrid mutation that further decreases convergence time while preserving classification accuracy. We show that evolutionary SVMs are at least as accurate as their quadratic programming counterparts on six real-world benchmark data sets, and the evolutionary SVM variants frequently outperform their quadratic programming competitors. Additionally, the proposed algorithm is more generic than existing traditional solutions, since it also works for non-positive-semidefinite kernel functions and for several, possibly competing, performance criteria.
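The abstract describes directly optimizing the Wolfe dual of the SVM — maximize W(α) = Σᵢαᵢ − ½ ΣᵢΣⱼ αᵢαⱼyᵢyⱼK(xᵢ,xⱼ) subject to 0 ≤ αᵢ ≤ C and Σᵢαᵢyᵢ = 0 — with an evolutionary algorithm instead of a quadratic programming solver. The abstract gives no implementation details, so the following is only an illustrative sketch under stated assumptions: an RBF kernel, a plain (1+1) evolution strategy with Gaussian mutation rather than the paper's hybrid mutation, box constraints enforced by clipping, and the equality constraint handled as a quadratic penalty. All function names and parameters here are hypothetical.

```python
import numpy as np

def rbf_kernel(X, gamma=1.0):
    # K[i, j] = exp(-gamma * ||x_i - x_j||^2)
    sq = np.sum(X ** 2, axis=1)
    d2 = sq[:, None] + sq[None, :] - 2.0 * (X @ X.T)
    return np.exp(-gamma * d2)

def dual_objective(alpha, K, y, penalty=10.0):
    # Wolfe dual: sum(alpha) - 0.5 * alpha^T Q alpha with Q = (y y^T) * K.
    # The equality constraint sum(alpha_i * y_i) = 0 enters as a penalty,
    # so no QP solver and no positive semidefiniteness of K are required.
    Q = (y[:, None] * y[None, :]) * K
    return alpha.sum() - 0.5 * alpha @ Q @ alpha - penalty * (alpha @ y) ** 2

def es_svm(X, y, C=1.0, gamma=1.0, generations=500, sigma=0.1, seed=0):
    # (1+1)-ES on the dual variables: mutate, clip to the box [0, C],
    # keep the child if it does not worsen the penalized dual objective.
    rng = np.random.default_rng(seed)
    K = rbf_kernel(X, gamma)
    n = len(y)
    alpha = rng.uniform(0.0, C, n)
    best = dual_objective(alpha, K, y)
    for _ in range(generations):
        child = np.clip(alpha + sigma * rng.standard_normal(n), 0.0, C)
        f = dual_objective(child, K, y)
        if f >= best:
            alpha, best = child, f
        else:
            sigma *= 0.99  # crude step-size decay after a failed mutation
    return alpha, best
```

A usage example on a toy two-cluster problem: `es_svm(X, y)` returns the evolved dual coefficients and the achieved objective value. The paper's hybrid mutation exploits structure of this specific problem to converge faster; the generic Gaussian mutation above is only a stand-in for it.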

