Proceedings: International Conference on Intelligent Computing and Intelligent Systems

Mutual Information Based on Renyi's Entropy Feature Selection


Abstract

Feature selection has become a focus of pattern classification research, and mutual information plays an increasingly important role in feature selection algorithms. We propose a normalized mutual information feature selection (NMIFS) method based on Renyi's quadratic entropy, which reduces computational complexity by relying on an efficient estimate of the mutual information. We then combine NMIFS with a wrapper into a two-stage feature selection algorithm, which helps us find a more discriminative feature subset. Experiments comparing efficiency and classification accuracy against other MI-based feature selection algorithms show that our method yields a promising reduction in computational complexity.
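The abstract does not spell out the estimator, but the standard efficient route to Renyi's quadratic entropy is a Parzen-window (Gaussian kernel) estimate, which reduces the entropy to a double sum over sample pairs. The sketch below illustrates that idea only; the function name, kernel width, and 1-D restriction are illustrative assumptions, not the paper's actual implementation.

```python
import numpy as np

def renyi_quadratic_entropy(x, sigma=1.0):
    """Parzen-window estimate of Renyi's quadratic entropy for 1-D samples.

    H2(X) = -log( integral p(x)^2 dx ), estimated as
    -log( (1/N^2) * sum_ij G(x_i - x_j; 2*sigma^2) ),
    where the convolution of two Gaussian kernels of variance sigma^2
    yields a Gaussian of variance 2*sigma^2.
    """
    x = np.asarray(x, dtype=float).ravel()
    n = x.size
    diffs = x[:, None] - x[None, :]          # all pairwise differences
    # Gaussian kernel with variance 2*sigma^2
    g = np.exp(-diffs**2 / (4.0 * sigma**2)) / np.sqrt(4.0 * np.pi * sigma**2)
    info_potential = g.sum() / n**2          # the "information potential" V(X)
    return -np.log(info_potential)
```

Because the estimate is a plain pairwise sum, it avoids any explicit density integration, which is the kind of saving the abstract's complexity claim rests on; widely spread samples give a lower information potential and hence a higher entropy.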
