Coefficient Structure of Kernel Perceptrons and Support Vector Reduction


Abstract

Support Vector Machines (SVMs) with few support vectors are highly desirable, as they can be applied quickly to new, unseen patterns. In this work we study the coefficient structure of the dual representation of SVMs constructed for nonlinearly separable problems through kernel perceptron training. We relate these coefficients to the margins of the corresponding support vectors (SVs) and to the number of training iterations in which each SV takes part. These observations lead to a remove-and-retrain procedure for building SVMs with a small number of SVs, in which SVs with both suitably small and suitably large coefficients are removed from the training sample. Besides providing a significant SV reduction, our method has a computational cost comparable to that of a single SVM training.
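The abstract only outlines the remove-and-retrain procedure. The following is a minimal Python sketch of the general idea, assuming scikit-learn's SVC as a stand-in kernel machine and quantile cutoffs (low_q, high_q) as the "suitably small and large" coefficient thresholds; the paper's actual method works with dual coefficients obtained from kernel perceptron training, which this sketch does not reproduce.

# Illustrative sketch of a remove-and-retrain loop for SV reduction.
# The pruning heuristic (quantile thresholds on |dual coefficients|)
# is an assumption, not the authors' exact criterion.
import numpy as np
from sklearn.svm import SVC

def remove_and_retrain(X, y, low_q=0.05, high_q=0.95, C=1.0):
    # Initial training: obtain the dual coefficients alpha_i * y_i.
    clf = SVC(C=C, kernel="rbf").fit(X, y)
    coefs = np.abs(clf.dual_coef_.ravel())

    # Flag SVs whose coefficients are very small (barely contribute)
    # or very large (often pinned at the box constraint).
    lo, hi = np.quantile(coefs, [low_q, high_q])
    drop = clf.support_[(coefs <= lo) | (coefs >= hi)]

    # Retrain on the reduced training sample.
    keep = np.setdiff1d(np.arange(len(y)), drop)
    reduced = SVC(C=C, kernel="rbf").fit(X[keep], y[keep])
    return clf, reduced

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    X = rng.normal(size=(400, 2))
    y = (X[:, 0] ** 2 + X[:, 1] ** 2 > 1).astype(int)  # nonlinearly separable
    full, small = remove_and_retrain(X, y)
    print("SVs before/after:", full.n_support_.sum(), small.n_support_.sum())

Since pruning and retraining happen only once here, the total cost stays on the order of a single SVM training, matching the cost claim in the abstract.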
