Neurocomputing

Study and evaluation of a multi-class SVM classifier using diminishing learning technique



Abstract

Support vector machine (SVM) is one of the state-of-the-art tools for linear and non-linear pattern classification. One of the design objectives of an SVM classifier is to reduce the number of support vectors without compromising classification accuracy. For this purpose, a novel technique referred to as the diminishing learning (DL) technique is proposed in this paper for a multiclass SVM classifier. In this technique, a sequential classifier is proposed in which the classes that require stringent boundaries are tested one by one; as the tests for these classes fail, the stringency of the classifier is progressively relaxed. An automated procedure is also proposed to obtain the optimum classification order for the SVM-DL classifier in order to improve recognition accuracy. The proposed technique is applied to an SVM-based isolated digit recognition system and is studied using the speaker-dependent and multispeaker-dependent TI46 database of isolated digits. Both LPC and MFCC are used for feature extraction. The extracted features are mapped using self-organizing feature maps (SOFM) for dimensionality reduction, and the mapped features are used by the SVM classifier to evaluate recognition accuracy with various kernels. The performance of the system using the proposed SVM-DL classifier is compared with that of other techniques: one-against-all (OAA), half-against-half (HAH), and directed acyclic graph (DAG). The SVM-DL classifier yields a 1-2% increase in recognition accuracy over the HAH classifier for some of the kernels with both LPC and MFCC feature inputs. For MFCC feature inputs, both the HAH and SVM-DL classifiers achieve 100% recognition accuracy for some of the kernels. The total number of support vectors required is the smallest for the HAH classifier, followed by the SVM-DL classifier. The proposed diminishing learning technique is applicable to a number of pattern recognition applications.
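The abstract's description of the DL technique — binary tests applied class by class in a fixed order, with the acceptance criterion relaxed after each failed test — can be illustrated with a short sketch. The sketch below is an assumption-based illustration rather than the paper's implementation: the names SVMDLClassifier, class_order, start_margin, and relax_step are hypothetical, scikit-learn's SVC stands in for the paper's SVM training, and the automated class-ordering procedure and SOFM feature mapping mentioned in the abstract are omitted.

```python
import numpy as np
from sklearn.svm import SVC

class SVMDLClassifier:
    """Sketch of a sequential 'diminishing learning' multi-class SVM.

    One binary (one-vs-rest) SVM is trained per class. At prediction time
    the classes are tested one by one in a fixed order; each time a test
    fails the acceptance margin is relaxed, so later classes are judged
    with diminishing stringency. The ordering and the relaxation step are
    illustrative assumptions, not the paper's exact procedure.
    """

    def __init__(self, class_order, start_margin=1.0, relax_step=0.2, kernel="rbf"):
        self.class_order = class_order    # order in which classes are tested
        self.start_margin = start_margin  # initial (strict) decision margin
        self.relax_step = relax_step      # margin relaxation after each failed test
        self.kernel = kernel
        self.models = {}

    def fit(self, X, y):
        # Train one binary (class vs. rest) SVM per class.
        for c in self.class_order:
            clf = SVC(kernel=self.kernel)
            clf.fit(X, (y == c).astype(int))
            self.models[c] = clf
        return self

    def predict_one(self, x):
        x = x.reshape(1, -1)
        margin = self.start_margin
        for c in self.class_order:
            # Accept the class if its decision value clears the current margin.
            if self.models[c].decision_function(x)[0] >= margin:
                return c
            # Diminish the stringency for the next class to be tested.
            margin = max(0.0, margin - self.relax_step)
        # Fall back to the class with the highest decision value.
        return max(self.class_order,
                   key=lambda c: self.models[c].decision_function(x)[0])

    def predict(self, X):
        return np.array([self.predict_one(x) for x in X])
```

As a usage sketch, the ten digit classes could be ordered so that those needing the most stringent boundaries are tested first, e.g. SVMDLClassifier(class_order=list(range(10))).fit(X_train, y_train).predict(X_test); in the paper this ordering is obtained by an automated procedure that maximizes recognition accuracy.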


