International Conference on Biomedical Engineering and Informatics

A leave-one-feature-out wrapper method for feature selection in data classification



Abstract

Feature selection has been an active research area over the past decades. Its objectives include improving prediction accuracy, accelerating classification, and gaining a better understanding of the features. Feature selection methods are commonly divided into three categories: filter methods, wrapper methods, and embedded methods. In this paper, we propose a simple leave-one-feature-out wrapper method for feature selection whose main goal is to improve prediction accuracy. A distinctive property of our method is that the number of cross-validation trainings is a user-controlled constant multiple of the number of features. The strategy can be applied to any classifier, and the idea is intuitive. Given the wide availability of off-the-shelf machine learning software packages and computing power, the proposed simple method may be particularly attractive to practitioners. Numerical experiments illustrate the simple usage and the effectiveness of the method.
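The core idea described above can be sketched as follows. This is a minimal illustration, not the authors' implementation: it assumes scikit-learn, uses a decision tree and the breast-cancer dataset purely as placeholders, and drops each feature j whose removal does not reduce cross-validated accuracy below the all-features baseline. Note that the number of cross-validation trainings is cv per feature, matching the "user-controlled constant multiple of the number of features" property.

```python
import numpy as np
from sklearn.datasets import load_breast_cancer
from sklearn.model_selection import cross_val_score
from sklearn.tree import DecisionTreeClassifier

def leave_one_feature_out(X, y, make_clf, cv=5):
    """Keep the features whose removal hurts cross-validated accuracy."""
    # Baseline: mean CV accuracy using all features.
    base = cross_val_score(make_clf(), X, y, cv=cv).mean()
    keep = []
    for j in range(X.shape[1]):
        # Remove feature j and re-run cross-validation;
        # this costs exactly cv trainings per feature.
        X_minus_j = np.delete(X, j, axis=1)
        score = cross_val_score(make_clf(), X_minus_j, y, cv=cv).mean()
        # Accuracy dropped without feature j, so it is kept.
        if score < base:
            keep.append(j)
    return keep, base

X, y = load_breast_cancer(return_X_y=True)
keep, base = leave_one_feature_out(
    X, y, lambda: DecisionTreeClassifier(random_state=0)
)
```

Because the wrapper only calls `make_clf()` through the generic cross-validation interface, any classifier factory can be substituted, which reflects the classifier-agnostic claim in the abstract.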
