In: Computer Analysis of Images and Patterns

Decision Trees Using the Minimum Entropy-of-Error Principle


Abstract

Binary decision trees based on univariate splits have traditionally employed so-called impurity functions as a means of searching for the best node splits. Such functions use estimates of the class distributions. In the present paper we introduce a new concept to binary tree design: instead of working with the class distributions of the data, we work directly with the distribution of the errors originated by the node splits. Concretely, we search for the best splits using a minimum entropy-of-error (MEE) strategy. This strategy has recently been applied with success in other areas (e.g. regression, clustering, blind source separation, neural network training). We show that MEE trees are capable of producing good results with often simpler trees, have interesting generalization properties, and in the many experiments we performed they could be used without pruning.
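The MEE idea described in the abstract can be illustrated with a minimal sketch. Here the error variable is taken as E = y − ŷ over 0/1 class labels, so E takes values in {−1, 0, 1}, and candidate univariate thresholds are scored by the Shannon entropy of the empirical error distribution rather than by an impurity function. The function names (`error_entropy`, `best_mee_split`) and the exhaustive threshold search are illustrative assumptions, not the paper's actual implementation:

```python
import numpy as np

def error_entropy(y_true, y_pred):
    # Error variable E = y_true - y_pred over 0/1 labels takes
    # values in {-1, 0, 1}; return the Shannon entropy of E.
    e = y_true - y_pred
    probs = np.array([(e == v).mean() for v in (-1, 0, 1)])
    probs = probs[probs > 0]          # drop zero-probability outcomes
    return float(-(probs * np.log2(probs)).sum())

def best_mee_split(x, y):
    """Search thresholds on one feature, choosing the split whose
    induced error distribution has minimum entropy (MEE criterion)."""
    best_t, best_h = None, np.inf
    for t in np.unique(x)[:-1]:       # all candidate cut points
        left = x <= t
        # predict the majority class on each side of the split
        y_pred = np.where(left,
                          np.round(y[left].mean()),
                          np.round(y[~left].mean()))
        h = error_entropy(y, y_pred)
        if h < best_h:
            best_t, best_h = t, h
    return best_t, best_h
```

On perfectly separable data the optimal split drives all errors to zero, so the error entropy reaches its minimum of 0; on noisier data the criterion trades off the two error types against the mass at E = 0, which is what distinguishes it from class-distribution impurity measures.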
