Neural Networks, IEEE Transactions on

Structural Regularized Support Vector Machine: A Framework for Structural Large Margin Classifier



Abstract

Support vector machine (SVM), one of the most popular classifiers, aims to find a hyperplane that separates two classes of data with maximal margin. SVM classifiers focus on achieving the greatest possible separation between classes rather than exploiting the structures of the training data within each class. However, such structural information, as an implicit form of prior knowledge, has recently been found to be vital for designing a good classifier in a variety of real-world problems. Accordingly, exploiting as much prior structural information in the data as possible to improve the generalization ability of a classifier has yielded a class of effective structural large margin classifiers, such as the structured large margin machine (SLMM) and the Laplacian support vector machine (LapSVM). In this paper, we unify these classifiers into a common framework based on the concept of “structural granularity” and the formulation of their optimization problems. Exploiting quadratic programming (QP) and second-order cone programming (SOCP) techniques, we derive a novel large margin classifier, which we call the structural regularized support vector machine (SRSVM). Unlike SLMM, which combines cluster granularity with SOCP, and LapSVM, which combines point granularity with QP, SRSVM combines cluster granularity with QP; it therefore follows the same optimization formulation as LapSVM, overcoming the large computational complexity and non-sparse solutions of SLMM while simultaneously integrating the compactness within classes and the separability between classes. Furthermore, generalization bounds for these algorithms can be derived via eigenvalue analysis of the kernel matrices. Experimental results demonstrate that SRSVM is often superior in classification and generalization performance to the state-of-the-art algorithms in the framework, at both the same and different structural granularities.
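The core idea of this family of structural large margin classifiers can be illustrated with a small sketch: augment the standard SVM regularizer (1/2)‖w‖² to (1/2)wᵀ(I + λΣ)w, where Σ aggregates within-class cluster covariances, so that compactness within classes and separability between classes are optimized together. The code below is a minimal illustrative sketch on synthetic data, not the paper's implementation: it assumes one cluster per class and replaces a QP solver with plain subgradient descent on the hinge-loss objective to stay self-contained.

```python
import numpy as np

rng = np.random.default_rng(0)

# Two classes, each drawn from an elongated Gaussian "cluster".
cov = np.array([[2.0, 1.5], [1.5, 2.0]])
X_pos = rng.multivariate_normal([2.0, 2.0], cov, size=60)
X_neg = rng.multivariate_normal([-2.0, -2.0], cov, size=60)
X = np.vstack([X_pos, X_neg])
y = np.hstack([np.ones(60), -np.ones(60)])

# Structural term: sum of within-class covariance matrices
# (one cluster per class in this toy example).
Sigma = np.cov(X_pos.T) + np.cov(X_neg.T)

def train(X, y, Sigma, lam=1.0, C=1.0, lr=0.01, epochs=500):
    """Subgradient descent on
       (1/2) w^T (I + lam*Sigma) w + C * sum_i hinge(y_i (w.x_i + b))."""
    n, d = X.shape
    M = np.eye(d) + lam * Sigma          # structural regularizer
    w, b = np.zeros(d), 0.0
    for _ in range(epochs):
        margins = y * (X @ w + b)
        viol = margins < 1               # points inside the margin
        grad_w = M @ w - C * (y[viol, None] * X[viol]).sum(axis=0)
        grad_b = -C * y[viol].sum()
        w -= lr * grad_w / n
        b -= lr * grad_b / n
    return w, b

w, b = train(X, y, Sigma)
acc = np.mean(np.sign(X @ w + b) == y)
print(f"training accuracy: {acc:.2f}")
```

Setting `lam=0` recovers a plain linear SVM trained by subgradient descent; a positive `lam` additionally penalizes weight components along high-variance within-class directions, which is the "compactness within classes" term that SRSVM integrates into the margin objective.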
