Investigation of Feature Dimension Reduction Schemes for Classification Applications

Abstract

Extracting relevant features that allow for class discrimination is the first critical step in classification applications. However, this step often leads to high-dimensional feature spaces, which require large datasets to create viable classification schemes. As a result, there is a strong incentive to reduce the feature space dimension. Two classical types of approaches to reducing feature dimension exist: Principal Component Analysis (PCA)-based and discriminant-based approaches. The main difference between the two types lies in the criterion selected; PCA-based schemes seek a projection direction that best represents the data in a norm sense, while discriminant-based schemes seek a projection that best separates the class data. This study presents a comparison of three discriminant-based feature dimension reduction schemes: the Mean Separator Neural Network (MSNN), the Mahalanobis-based Dimension Reduction scheme (MBDR), and the kernel-based Generalized Discriminant Analysis (GDA) approach. PCA is included for comparison purposes, as it is also widely used in classification applications. All four feature dimension reduction schemes are implemented and evaluated by applying the transformed features to a basic minimum distance classifier. Three classification datasets commonly used in statistics for benchmarking purposes are selected to compare the schemes, and results are discussed. Results show that the kernel-based GDA approach leads to consistently higher classification performance than the other schemes considered in the study for the data investigated.
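
To make the evaluation pipeline concrete, the sketch below contrasts a PCA projection with a discriminant (LDA) projection, each followed by a minimum-distance (nearest-centroid) classifier. This is a minimal illustration, not the report's implementation: it assumes scikit-learn and uses the Iris data only as a stand-in benchmark, and the MSNN, MBDR, and kernel GDA schemes compared in the report are not standard library components, so a linear discriminant projection is used here as a representative of the discriminant-based family.

    # Minimal sketch (not the report's code): PCA vs. a discriminant (LDA)
    # projection, each feeding a minimum-distance (nearest-centroid) classifier.
    from sklearn.datasets import load_iris
    from sklearn.decomposition import PCA
    from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
    from sklearn.neighbors import NearestCentroid
    from sklearn.model_selection import train_test_split
    from sklearn.pipeline import make_pipeline

    X, y = load_iris(return_X_y=True)
    X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

    for name, reducer in [("PCA", PCA(n_components=2)),
                          ("LDA", LinearDiscriminantAnalysis(n_components=2))]:
        # Project the features to 2 dimensions, then classify each test sample
        # by its distance to the class means in the reduced space.
        clf = make_pipeline(reducer, NearestCentroid())
        clf.fit(X_train, y_train)
        print(name, "accuracy:", clf.score(X_test, y_test))

In this setup, swapping the reducer is the only change needed to compare additional dimension reduction schemes under the same minimum-distance classifier, which mirrors the comparison protocol described in the abstract.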
