Conference: Artificial Neural Networks in Pattern Recognition

Evaluation of Feature Selection by Multiclass Kernel Discriminant Analysis

Abstract

In this paper, we propose and evaluate a feature selection criterion based on kernel discriminant analysis (KDA) for multiclass problems, which finds a number of eigenvectors equal to the number of classes minus one. The selection criterion is the value of the KDA objective function, namely the sum of the eigenvalues associated with these eigenvectors. In addition to the KDA criterion, we propose a new selection criterion that replaces the between-class scatter in KDA with the sum of squared distances between all pairs of classes. To speed up backward feature selection, we introduce block deletion, which deletes many features at a time, and to enhance the generalization ability of the selected features, we use cross-validation as a stopping condition. Computer experiments on benchmark datasets show that the KDA criterion has performance comparable with that of a selection criterion based on the cross-validated recognition rate of an SVM, while reducing computational cost. We also show that the KDA criterion can terminate feature selection stably when cross-validation is used as the stopping condition.
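
The criterion described in the abstract can be summarized in a short sketch. The code below is a minimal illustration, not the authors' implementation: the function names (`rbf_kernel`, `kda_criterion`, `backward_block_selection`), the RBF kernel choice, the regularization constant, and the simple "delete every feature whose removal does not lower the criterion" rule are all illustrative assumptions; the paper additionally evaluates the selected subsets with an SVM and uses cross-validation as the stopping condition.

```python
# Sketch of a KDA-based feature selection criterion: for a candidate feature
# subset, build a kernel on the selected features, form kernelized
# between-class and within-class scatter matrices, solve the generalized
# eigenproblem, and score the subset by the sum of the (c - 1) largest
# eigenvalues, where c is the number of classes.
import numpy as np
from scipy.linalg import eigh


def rbf_kernel(X, gamma=1.0):
    """Gram matrix K[i, j] = exp(-gamma * ||x_i - x_j||^2)."""
    sq = np.sum(X ** 2, axis=1)
    d2 = sq[:, None] + sq[None, :] - 2.0 * X @ X.T
    return np.exp(-gamma * np.maximum(d2, 0.0))


def kda_criterion(X, y, features, gamma=1.0, reg=1e-6):
    """Sum of the c-1 leading generalized eigenvalues of KDA
    computed on the selected feature subset."""
    K = rbf_kernel(X[:, features], gamma)
    n = K.shape[0]
    classes = np.unique(y)
    m_all = K.mean(axis=1)              # overall mean in the kernelized space
    M = np.zeros((n, n))                # between-class scatter (kernelized)
    N = np.zeros((n, n))                # within-class scatter (kernelized)
    for c in classes:
        idx = np.where(y == c)[0]
        n_c = len(idx)
        K_c = K[:, idx]
        m_c = K_c.mean(axis=1)
        diff = m_c - m_all
        M += n_c * np.outer(diff, diff)
        center = np.eye(n_c) - np.ones((n_c, n_c)) / n_c
        N += K_c @ center @ K_c.T
    N += reg * np.eye(n)                # regularize for numerical stability
    evals = eigh(M, N, eigvals_only=True)
    return float(np.sum(np.sort(evals)[::-1][:len(classes) - 1]))


def backward_block_selection(X, y, gamma=1.0):
    """Backward selection with block deletion (simplified reading): rank each
    feature by the criterion obtained when it alone is removed, then delete as
    one block all features whose individual removal does not decrease the
    current criterion value."""
    selected = list(range(X.shape[1]))
    while len(selected) > 1:
        base = kda_criterion(X, y, selected, gamma)
        scores = {f: kda_criterion(X, y, [g for g in selected if g != f], gamma)
                  for f in selected}
        block = [f for f, s in scores.items() if s >= base]
        if not block:                   # every single deletion hurts: stop
            break
        selected = [f for f in selected if f not in block]
    return selected
```

On a data matrix X (n samples by d features) with integer class labels y, `backward_block_selection(X, y)` returns the indices of the retained features; in the paper, the stopping point is instead determined by cross-validation rather than the simple monotonicity rule sketched here.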
