Optimization Methods & Software

Least squares twin support vector machine classification via maximum one-class within class variance



Abstract

A twin support vector machine (TWSVM) is an effective classification tool that seeks two non-parallel planes, obtained by solving two quadratic programming problems (QPPs); solving these QPPs incurs a relatively high computational cost. The least squares twin SVM (LSTSVM), a variant of TWSVM, avoids this deficiency by obtaining the two non-parallel planes directly through solving two systems of linear equations. Both TWSVM and LSTSVM operate directly on the training patterns through two constrained optimization problems, and use the constraints to keep each plane close to its own class while pushing it away from the patterns of the other class. However, such approaches weaken the geometric interpretation of the generalized proximal SVM (GEPSVM), so that on many Exclusive-Or (XOR) examples with different distributions they may yield worse classification performance. Moreover, besides failing to capture the local geometry of the samples, they are sensitive to outliers. In this paper, inspired by several geometrically motivated learning algorithms and by the advantages of LSTSVM, we first propose a new classifier, LSTSVM classification via maximum one-class within-class variance (MWSVM), which is designed to avoid the aforementioned deficiencies while keeping the advantages of LSTSVM. The new method directly incorporates the one-class within-class variance into the classifier, so that the genuine geometric interpretation of GEPSVM is expected to be preserved in LSTSVM. However, like LSTSVM, MWSVM may still suffer degraded classification performance in many cases, especially when outliers are present. Therefore, a localized version of MWSVM (LMWSVM) is further proposed to remove outliers effectively. Another advantage of LMWSVM is that it takes only the points that are closest to each other as the training set, so that the classifier is determined by fewer training samples than LSTSVM requires. This naturally reduces the storage cost relative to LSTSVM, especially in the nonlinear case. Experiments on both toy and real-world problems demonstrate the effectiveness of both MWSVM and LMWSVM.
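
For concreteness, below is a minimal NumPy sketch of how a standard linear LSTSVM obtains its two non-parallel planes by solving two systems of linear equations (rather than QPPs), as summarized in the abstract. It illustrates plain LSTSVM only, not the MWSVM/LMWSVM variants proposed in the paper; the function names, the trade-off parameters c1 and c2, and the small ridge term eps are illustrative assumptions.

```python
import numpy as np

def lstsvm_planes(A, B, c1=1.0, c2=1.0, eps=1e-6):
    """Sketch of linear LSTSVM: each non-parallel plane comes from one
    linear system instead of a QPP.
    A, B: samples of the two classes stored as rows.
    c1, c2: trade-off parameters; eps: small ridge term for stability."""
    e1 = np.ones((A.shape[0], 1))
    e2 = np.ones((B.shape[0], 1))
    E = np.hstack([A, e1])          # augmented matrix [A e] of class +1
    F = np.hstack([B, e2])          # augmented matrix [B e] of class -1
    I = np.eye(E.shape[1])
    # Plane 1: close to class +1, at unit distance from class -1.
    z1 = -np.linalg.solve(F.T @ F + (1.0 / c1) * E.T @ E + eps * I, F.T @ e2)
    # Plane 2: close to class -1, at unit distance from class +1.
    z2 = np.linalg.solve(E.T @ E + (1.0 / c2) * F.T @ F + eps * I, E.T @ e1)
    (w1, b1), (w2, b2) = (z1[:-1], z1[-1]), (z2[:-1], z2[-1])
    return (w1, b1), (w2, b2)

def lstsvm_predict(X, planes):
    """Assign each row of X to the class whose plane is nearer."""
    (w1, b1), (w2, b2) = planes
    d1 = np.abs(X @ w1 + b1).ravel() / np.linalg.norm(w1)
    d2 = np.abs(X @ w2 + b2).ravel() / np.linalg.norm(w2)
    return np.where(d1 <= d2, 1, -1)
```

In this formulation each plane is fitted to its own class while the other class is pushed to roughly unit distance, which is the distance-estimation role of the constraints mentioned above; a test point is then labelled by the nearer plane.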


