Neurocomputing

Incremental filter and wrapper approaches for feature discretization


Abstract

Discrete data representations are necessary, or at least convenient, in many machine learning problems. While feature selection (FS) techniques aim at finding relevant subsets of features, the goal of feature discretization (FD) is to find concise (quantized) data representations adequate for the learning task at hand. In this paper, we propose two incremental methods for FD. The first belongs to the filter family, in which the quality of the discretization is assessed by a (supervised or unsupervised) relevance criterion. The second is a wrapper, in which the discretized features are assessed using a classifier. Both methods can be coupled with any static (supervised or unsupervised) discretization procedure and can be used to perform FS as a pre-processing or post-processing stage. The proposed methods attain efficient representations suitable for binary and multi-class problems with different types of data, and are competitive with existing methods. Moreover, applying well-known FS methods to features discretized by our techniques leads to better accuracy than applying them to features discretized by other methods or to the original features.
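The incremental filter idea described in the abstract can be illustrated with a minimal sketch: increase a feature's quantization resolution one bit at a time and keep the smallest bit depth at which a supervised relevance criterion stops improving. This is not the authors' implementation; the uniform quantizer, the mutual-information criterion, the stopping tolerance, and all function names below are assumptions chosen for the illustration.

```python
import math
from collections import Counter

def quantize(values, n_bins):
    # Uniformly quantize a list of floats into integer bin codes 0..n_bins-1.
    lo, hi = min(values), max(values)
    if hi == lo:
        return [0] * len(values)
    width = (hi - lo) / n_bins
    return [min(int((v - lo) / width), n_bins - 1) for v in values]

def mutual_information(codes, labels):
    # Empirical mutual information I(code; label) in nats, used here as the
    # (supervised) relevance criterion of a filter-style discretizer.
    n = len(codes)
    joint = Counter(zip(codes, labels))
    px, py = Counter(codes), Counter(labels)
    mi = 0.0
    for (x, y), c in joint.items():
        # p(x,y) / (p(x) p(y)) = c * n / (count_x * count_y)
        mi += (c / n) * math.log(c * n / (px[x] * py[y]))
    return mi

def incremental_filter_discretize(values, labels, max_bits=8, tol=1e-3):
    # Grow the bit depth incrementally; stop when adding a bit no longer
    # improves the relevance criterion by more than tol.
    best_bits = 1
    best_mi = mutual_information(quantize(values, 2), labels)
    for bits in range(2, max_bits + 1):
        mi = mutual_information(quantize(values, 2 ** bits), labels)
        if mi - best_mi <= tol:
            break
        best_bits, best_mi = bits, mi
    return best_bits, quantize(values, 2 ** best_bits)
```

A wrapper variant would replace `mutual_information` with the cross-validated accuracy of a classifier trained on the discretized feature, at the cost of extra computation per candidate bit depth.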
