Journal: Entropy

Quadratic Mutual Information Feature Selection



Abstract

We propose a novel feature selection method based on quadratic mutual information, which has its roots in the Cauchy–Schwarz divergence and Rényi entropy. The method estimates quadratic mutual information directly from data samples using Gaussian kernel functions, and can detect second-order non-linear relations. Its main advantages are: (i) unified treatment of discrete and continuous data, avoiding any discretization; and (ii) its parameter-free design. The effectiveness of the proposed method is demonstrated through an extensive comparison with mutual information feature selection (MIFS), minimum redundancy maximum relevance (MRMR), and joint mutual information (JMI) on classification and regression problem domains. The experiments show that the proposed method performs comparably to the other methods on classification problems while being considerably faster. On regression problems, it compares favourably to the others, but is slower.
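To illustrate the kind of estimator the abstract describes, the sketch below computes Cauchy–Schwarz quadratic mutual information from paired 1-D samples with Gaussian kernels. This is a generic plug-in estimator, not the paper's exact algorithm: the bandwidth choice, the function names, and the restriction to one-dimensional variables are all assumptions made for illustration.

```python
import numpy as np

def gram(v, sigma):
    # Gaussian (RBF) Gram matrix for a 1-D sample vector v
    d = v[:, None] - v[None, :]
    return np.exp(-d ** 2 / (2.0 * sigma ** 2))

def qmi_cs(x, y, sigma=1.0):
    """Plug-in estimate of Cauchy-Schwarz quadratic mutual information
    between samples x and y (hypothetical sketch, fixed bandwidth sigma).

    QMI_CS = log(V_J * V_M / V_C^2), where V_J, V_M, V_C are the joint,
    marginal, and cross information potentials; it is >= 0 and equals 0
    only when the (kernel-smoothed) joint factorizes.
    """
    kx, ky = gram(x, sigma), gram(y, sigma)
    v_joint = np.mean(kx * ky)                              # V_J
    v_marg = np.mean(kx) * np.mean(ky)                      # V_M
    v_cross = np.mean(kx.mean(axis=1) * ky.mean(axis=1))    # V_C
    return float(np.log(v_joint * v_marg / v_cross ** 2))
```

All three potentials are Gram-matrix averages, so the estimate costs O(N^2) per variable pair; a feature-selection loop would rank candidate features by their QMI with the target.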
