A Divergence Criterion for Classifier-Independent Feature Selection

Abstract

Feature selection aims to find the most important feature subset of a given feature set without degrading its discriminative information. In general, we wish to select a feature subset that is effective for any kind of classifier. Such studies are called Classifier-Independent Feature Selection, and the method of Novovicova et al. is one of them. Their method estimates the class-conditional densities with Gaussian mixture models and selects a feature subset using the Kullback-Leibler divergence between the estimated densities, but it gives no indication of how many features should be selected. Kudo and Sklansky (1997) suggested selecting a minimal feature subset such that the degree of performance degradation is bounded. In this study, based on their suggestion, we try to find a feature subset that is minimal while maintaining a given Kullback-Leibler divergence.
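As a rough illustration of the selection criterion, the sketch below performs greedy backward elimination on binary-labeled data, dropping features as long as a symmetric Kullback-Leibler divergence between the two class densities stays above a chosen fraction of its full-feature value. It uses single Gaussians in closed form rather than the Gaussian mixture models of the paper, and it assumes labels coded as 0/1; the function names (kl_gaussian, class_divergence, min_subset_keeping_divergence) and the keep_ratio parameter are illustrative, not taken from the paper.

```python
# Minimal sketch of divergence-guided feature selection (single Gaussians, not GMMs).
import numpy as np

def kl_gaussian(mu0, cov0, mu1, cov1):
    """Closed-form KL( N(mu0, cov0) || N(mu1, cov1) )."""
    d = len(mu0)
    inv1 = np.linalg.inv(cov1)
    diff = mu1 - mu0
    return 0.5 * (np.trace(inv1 @ cov0) + diff @ inv1 @ diff - d
                  + np.log(np.linalg.det(cov1) / np.linalg.det(cov0)))

def class_divergence(X, y, features):
    """Symmetric KL divergence between the two class densities on the given features."""
    X0, X1 = X[y == 0][:, features], X[y == 1][:, features]
    mu0, mu1 = X0.mean(axis=0), X1.mean(axis=0)
    # Small ridge keeps the covariance estimates invertible.
    cov0 = np.cov(X0, rowvar=False) + 1e-6 * np.eye(len(features))
    cov1 = np.cov(X1, rowvar=False) + 1e-6 * np.eye(len(features))
    return kl_gaussian(mu0, cov0, mu1, cov1) + kl_gaussian(mu1, cov1, mu0, cov0)

def min_subset_keeping_divergence(X, y, keep_ratio=0.9):
    """Greedy backward elimination: remove features while the divergence of the
    remaining subset stays above keep_ratio times the full-feature divergence."""
    features = list(range(X.shape[1]))
    target = keep_ratio * class_divergence(X, y, features)
    removed_one = True
    while removed_one and len(features) > 1:
        removed_one = False
        # Candidate removal that hurts the divergence the least.
        f_best, div_best = max(
            ((f, class_divergence(X, y, [g for g in features if g != f]))
             for f in features),
            key=lambda t: t[1])
        if div_best >= target:
            features.remove(f_best)
            removed_one = True
    return features
```

For example, min_subset_keeping_divergence(X, y, keep_ratio=0.9) on an (n, d) data matrix X with binary labels y returns the indices of a reduced feature subset whose class divergence is still at least 90% of the full-feature value.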