
Investigation of feature dimension reduction schemes for classification applications



Abstract

Extracting relevant features that allow for class discrimination is the first critical step in classification applications. However, this step often leads to high-dimensional feature spaces, which require large datasets to create viable classification schemes. As a result, there is a strong incentive to reduce the feature space dimension. Two classical types of approaches to reduce feature dimension exist: Principal Component Analysis (PCA)-based and discriminant-based approaches. The main difference between the two types lies in the criterion selected; PCA-based schemes seek a projection direction that best represents the data in a norm sense, while discriminant-based schemes seek a projection that best separates the class data. This study presents a comparison of three discriminant-based feature dimension reduction schemes: the Mean Separator Neural Network (MSNN), the Mahalanobis-based Dimension Reduction scheme (MBDR), and the kernel-based Generalized Discriminant Analysis (GDA) approach. PCA is included for comparison purposes, as it is also widely used in classification applications. All four feature dimension reduction schemes are implemented and evaluated by applying the transformed features to a basic minimum distance classifier. Three classification datasets commonly used in statistics for benchmarking purposes are selected to compare the schemes, and results are discussed. Results show that the kernel-based generalized discriminant analysis approach leads to consistently higher classification performance than the other schemes considered in the study, for the data investigated.
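
The evaluation pipeline described in the abstract (feature dimension reduction followed by a basic minimum distance classifier) can be sketched as follows. This is a minimal illustration, not the paper's code: it assumes scikit-learn, uses Linear Discriminant Analysis as a stand-in for the discriminant-based family (the MSNN, MBDR, and GDA schemes studied in the paper are not standard library components), and uses the Iris data as a placeholder benchmark rather than necessarily one of the three datasets used in the study.

# Illustrative sketch only (not the paper's implementation): compare a
# PCA-based and a discriminant-based reduction step, each feeding a
# minimum distance (nearest class mean) classifier.
from sklearn.datasets import load_iris
from sklearn.decomposition import PCA
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import train_test_split
from sklearn.neighbors import NearestCentroid
from sklearn.preprocessing import StandardScaler

# Placeholder benchmark dataset; the paper uses three statistics benchmarks.
X, y = load_iris(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.3, stratify=y, random_state=0)

# Standardize so neither projection criterion is dominated by feature scale.
scaler = StandardScaler().fit(X_train)
X_train, X_test = scaler.transform(X_train), scaler.transform(X_test)

reducers = {
    "PCA (best data representation in a norm sense)": PCA(n_components=2),
    "LDA (best class separation)": LinearDiscriminantAnalysis(n_components=2),
}

for name, reducer in reducers.items():
    # PCA ignores the labels; the discriminant-based reducer uses them.
    Z_train = reducer.fit_transform(X_train, y_train)
    Z_test = reducer.transform(X_test)

    # Minimum distance classifier: assign each sample to the nearest class mean.
    clf = NearestCentroid().fit(Z_train, y_train)
    print(f"{name}: test accuracy = {clf.score(Z_test, y_test):.3f}")

As in the study, the only component that changes across schemes is the feature reduction step; the downstream classifier is held fixed so that differences in accuracy reflect the reduction criterion rather than the classifier.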

Bibliographic Record

  • Author

    Fargues, Monique P.

  • Author affiliation
  • Year: 2001
  • Total pages
  • Original format: PDF
  • Language of text
  • Classification
