IEEE Transactions on Neural Networks

Maxi–Min Margin Machine: Learning Large Margin Classifiers Locally and Globally



Abstract

In this paper, we propose a novel large margin classifier, called the maxi–min margin machine ($M^4$). This model learns the decision boundary both locally and globally. In comparison, other large margin classifiers construct separating hyperplanes only either locally or globally. For example, a state-of-the-art large margin classifier, the support vector machine (SVM), considers data only locally, while another significant model, the minimax probability machine (MPM), focuses on building the decision hyperplane exclusively based on the global information. As a major contribution, we show that SVM yields the same solution as $M^4$ when data satisfy certain conditions, and MPM can be regarded as a relaxation model of $M^4$. Moreover, based on our proposed local and global view of data, another popular model, the linear discriminant analysis, can easily be interpreted and extended as well. We describe the $M^4$ model definition, provide a geometrical interpretation, present theoretical justifications, and propose a practical sequential conic programming method to solve the optimization problem. We also show how to exploit Mercer kernels to extend $M^4$ for nonlinear classifications. Furthermore, we perform a series of evaluations on both synthetic data sets and real-world benchmark data sets. Comparison with SVM and MPM demonstrates the advantages of our new model.
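The abstract states the $M^4$ idea only at a high level. As a rough reconstruction of the kind of optimization it describes (a sketch inferred from the wording above, not quoted from the paper), the margin of each class is measured relative to that class's own covariance, so every training point contributes a constraint (the local part) while the two covariance matrices carry the global part:

\[
\max_{\rho,\;\mathbf{w}\neq\mathbf{0},\;b}\ \rho
\quad \text{s.t.} \quad
\frac{\mathbf{w}^{\top}\mathbf{x}_i + b}{\sqrt{\mathbf{w}^{\top}\Sigma_{\mathbf{x}}\mathbf{w}}} \ge \rho,
\qquad
\frac{-(\mathbf{w}^{\top}\mathbf{y}_j + b)}{\sqrt{\mathbf{w}^{\top}\Sigma_{\mathbf{y}}\mathbf{w}}} \ge \rho,
\]

for all training points $\mathbf{x}_i$ of one class and $\mathbf{y}_j$ of the other, with $\Sigma_{\mathbf{x}}$ and $\Sigma_{\mathbf{y}}$ the class covariance estimates.

Under that reading, the "sequential conic programming method" mentioned in the abstract can be sketched as a bisection over $\rho$: with $\rho$ fixed, the constraints above become second-order cone constraints, so an off-the-shelf conic solver can test whether that margin is attainable. The snippet below is a minimal illustration of this idea, not the paper's implementation; the use of cvxpy, the function name m4_fit, the covariance regularizer eps, and the bisection bounds are assumptions made for the sketch.

# Minimal sketch (not the paper's code) of bisection on the margin rho,
# checking attainability of each candidate rho with a conic solver.
import numpy as np
import cvxpy as cp

def m4_fit(X_pos, X_neg, rho_hi=10.0, tol=1e-3, eps=1e-6):
    d = X_pos.shape[1]
    # Square roots of the (regularized) per-class covariance estimates.
    Sx = np.linalg.cholesky(np.cov(X_pos.T) + eps * np.eye(d))
    Sy = np.linalg.cholesky(np.cov(X_neg.T) + eps * np.eye(d))

    def slack(rho):
        # Fix rho and maximize a common slack t; rho is attainable with a
        # nonzero w whenever the optimal t comes out strictly positive.
        w, b, t = cp.Variable(d), cp.Variable(), cp.Variable()
        cons = [X_pos @ w + b - rho * cp.norm(Sx.T @ w, 2) >= t,
                -(X_neg @ w + b) - rho * cp.norm(Sy.T @ w, 2) >= t,
                cp.norm(w, 2) <= 1]   # bound w so the slack cannot be scaled up
        cp.Problem(cp.Maximize(t), cons).solve()
        return t.value, w.value, b.value

    lo, hi, best = 0.0, rho_hi, (None, None)
    while hi - lo > tol:
        rho = 0.5 * (lo + hi)
        t, w, b = slack(rho)
        if t is not None and t > 0:
            lo, best = rho, (w, b)    # margin rho achievable; try a larger one
        else:
            hi = rho                  # not achievable; shrink the margin
    return best                       # (w, b) for the largest feasible rho found

As a usage note, m4_fit(X_pos, X_neg) would return a hyperplane (w, b) for two NumPy arrays of shape (N, d); the nonlinear extension via Mercer kernels that the abstract mentions is outside the scope of this sketch.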
