AI Communications

A heuristic for learning decision trees and pruning them into classification rules

Abstract

Let us consider a set of training examples described by continuous or symbolic attributes, with categorical classes. In this paper we present a measure of the potential quality of a region of the attribute space when it is represented as a rule condition for classifying unseen cases. The aim is to take into account the distribution of the classes of the examples. The resulting measure, called the impurity level, is inspired by a similar measure used in the instance-based algorithm IB3 for selecting suitable paradigmatic exemplars that will classify future cases in a nearest-neighbor context. The features of the impurity level are illustrated using a version of Quinlan's well-known C4.5 in which the information-based heuristics are replaced by our measure. The experiments carried out to test the proposal indicate that very high accuracy is reached with sets of classification rules as small as those found by RIPPER.
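The impurity level itself is defined formally in the paper; as a loose illustration of the confidence-interval comparison it borrows from IB3, the Python sketch below compares the Wilson interval of the majority-class proportion inside a candidate region against the interval of that class's overall frequency. The function names (wilson_interval, region_vs_prior), the default critical value z, and the way the two bounds are combined are assumptions made here for illustration, not the formula proposed by the authors.

import math

def wilson_interval(successes, n, z=1.645):
    # Wilson score interval for a binomial proportion; z is the normal
    # critical value for the chosen confidence level (assumed here).
    if n == 0:
        return 0.0, 1.0
    p = successes / n
    denom = 1.0 + z * z / n
    center = (p + z * z / (2 * n)) / denom
    half = (z / denom) * math.sqrt(p * (1 - p) / n + z * z / (4 * n * n))
    return max(0.0, center - half), min(1.0, center + half)

def region_vs_prior(majority_in_region, region_size, class_total, total):
    # Hypothetical score in the spirit of IB3's acceptance test: how far the
    # lower bound of the majority-class proportion inside the region sits
    # above the upper bound of that class's overall (prior) frequency.
    # Positive values suggest the region concentrates the class beyond
    # chance and is therefore a promising rule condition.
    region_low, _ = wilson_interval(majority_in_region, region_size)
    _, prior_high = wilson_interval(class_total, total)
    return region_low - prior_high

# Example: 18 of the 20 examples in the region belong to a class that
# covers 50 of the 200 training examples overall.
print(region_vs_prior(18, 20, 50, 200))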
