Neural Networks: The Official Journal of the International Neural Network Society

The Maximum Vector-Angular Margin Classifier and its fast training on large datasets using a core vector machine


Abstract

Although pattern classification has been extensively studied in the past decades, how to effectively solve the corresponding training on large datasets is a problem that still requires particular attention. Many kernelized classification methods, such as SVM and SVDD, can be formulated as corresponding quadratic programming (QP) problems, but computing the associated kernel matrices requires O(n²) (or even up to O(n³)) computational complexity, where n is the number of training patterns, which heavily limits the applicability of these methods to large datasets. In this paper, a new classification method called the Maximum Vector-Angular Margin Classifier (MAMC) is first proposed based on the vector-angular margin to find an optimal vector c in the pattern feature space, and all the testing patterns can be classified in terms of the maximum vector-angular margin ρ between the vector c and all the training data points. Accordingly, it is proved that the kernelized MAMC can be equivalently formulated as the kernelized Minimum Enclosing Ball (MEB), which leads to a distinctive merit of MAMC: it has the flexibility of controlling the sum of support vectors, like ν-SVC, and it may be extended to a Maximum Vector-Angular Margin Core Vector Machine (MAMCVM) by connecting the Core Vector Machine (CVM) method with MAMC, such that the corresponding fast training on large datasets can be effectively achieved. Experimental results on artificial and real datasets are provided to validate the power of the proposed methods.
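
As a rough illustration of the MEB connection the abstract relies on: CVM-style training scales to large datasets because a (1+ε)-approximate Minimum Enclosing Ball can be computed from a small core set of points instead of a full QP over all n patterns. The sketch below shows the classic Bădoiu–Clarkson core-set iteration for the plain Euclidean MEB in NumPy; the function name approx_meb and its parameters are illustrative assumptions, not from the paper, and an actual MAMCVM solver would instead run this kind of iteration in the kernel-induced feature space and re-solve a small QP over the current core set.

import numpy as np

def approx_meb(points, eps=0.1):
    # Minimal sketch (not the paper's algorithm): (1+eps)-approximate
    # Minimum Enclosing Ball via the Badoiu-Clarkson iteration, which
    # repeatedly pulls the center toward the currently farthest point.
    # points: (n, d) array of Euclidean points; returns (center, radius).
    c = points[0].astype(float)
    n_iter = int(np.ceil(1.0 / eps ** 2))   # iteration count suggested by the core-set analysis
    for i in range(1, n_iter + 1):
        dists = np.linalg.norm(points - c, axis=1)
        farthest = points[int(np.argmax(dists))]   # next core-set member
        c += (farthest - c) / (i + 1.0)            # shrinking step toward the violator
    radius = np.linalg.norm(points - c, axis=1).max()
    return c, radius

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    pts = rng.normal(size=(100000, 3))
    center, radius = approx_meb(pts, eps=0.05)
    print("approximate center:", center)
    print("approximate radius:", radius)
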
