International Journal of Machine Learning and Cybernetics

Combining partially global and local characteristics for improved classification



Abstract

The Support Vector Machine (SVM) has achieved promising classification performance. However, since it is based only on local information (the support vectors), it is sensitive to directions with large data spread. Nonparametric Discriminant Analysis (NDA), on the other hand, is an improvement over the more general Linear Discriminant Analysis (LDA) in which the normality assumption of LDA is relaxed. Furthermore, NDA incorporates partially global information to detect the dominant normal directions to the decision surface, which represent the true data spread. However, NDA relies on the choice of the k-nearest neighbors (k-NNs) on the decision boundary. This paper introduces a novel Combined SVM and NDA (CSVMNDA) model that controls the spread of the data while maximizing a relative margin separating the data classes. The model can be viewed as an improvement over SVM that incorporates the data-spread information represented by the dominant normal directions to the decision boundary, and equally as an extension of NDA in which the support vectors improve the choice of the k-nearest neighbors on the decision boundary by incorporating local information. Since the model extends both SVM and NDA, it can deal with heteroscedastic and non-normal data, and it also avoids the small sample size problem. Interestingly, the proposed improvements only require a rigorous yet simple combination of the NDA and SVM objective functions, and they preserve the computational efficiency of SVM. Through the optimization of the CSVMNDA objective function, considerable performance gains were achieved on real-world problems. In particular, experiments on face recognition clearly show the superiority of CSVMNDA over other state-of-the-art classification methods, especially SVM and NDA.
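The abstract does not spell out the combined objective. Purely as an illustrative sketch, a soft-margin SVM term can be regularized with an NDA-style nonparametric scatter matrix S (built from k-NN differences across classes); the interpolation weight \lambda and the penalty C below are assumed trade-off parameters for illustration, not quantities taken from the paper:

\min_{\mathbf{w},\, b,\, \boldsymbol{\xi}} \;\; \tfrac{1}{2}\, \mathbf{w}^{\top} \bigl( \lambda I + (1-\lambda)\, S \bigr) \mathbf{w} \; + \; C \sum_{i=1}^{n} \xi_i
\quad \text{s.t.} \quad y_i \bigl( \mathbf{w}^{\top} \mathbf{x}_i + b \bigr) \ge 1 - \xi_i, \;\; \xi_i \ge 0, \;\; i = 1, \dots, n.

In this sketch, setting \lambda = 1 recovers the standard soft-margin SVM, while smaller \lambda places more weight on the data-spread directions encoded in S.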
