IEEE Transactions on Knowledge and Data Engineering

Pruning Incremental Linear Model Trees with Approximate Lookahead


Abstract

Incremental linear model trees with approximate lookahead are fast, but they produce overly large trees. This is due to non-optimal splitting decisions, compounded by the possibly unlimited number of examples arriving from a data source. To keep processing speed high and tree complexity low, appropriate incremental pruning techniques are needed. In this paper, we introduce a pruning technique for the class of incremental linear model trees with approximate lookahead on stationary data sources. Experimental results show that the speed advantage of approximate lookahead can be further improved by producing much smaller, and consequently more explanatory and less memory-consuming, trees on high-dimensional data. This comes at the expense of only a small increase in prediction error. Additionally, the pruning algorithm can be tuned either to produce less accurate model trees at a much higher processing speed or, alternatively, more accurate trees at the expense of higher processing times.
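To make the kind of method described above more concrete, here is a minimal, hypothetical sketch (in Python) of an incremental linear model tree with error-based subtree pruning. It is not the paper's algorithm: the split rule is a crude stand-in for approximate lookahead, and all names and parameters (Node, min_split, prune_slack) are illustrative assumptions. The prune_slack parameter mirrors the tunable trade-off mentioned in the abstract: larger values prune more aggressively, yielding smaller and faster but somewhat less accurate trees.

```python
import numpy as np

class Node:
    """Illustrative incremental linear model tree node (not the paper's method)."""

    def __init__(self, dim, ridge=1e-3):
        self.left = self.right = None
        self.split_dim = None
        self.split_val = None
        # Sufficient statistics of a ridge-regularized linear model at this node.
        self.XtX = ridge * np.eye(dim + 1)
        self.Xty = np.zeros(dim + 1)
        self.n = 0
        self.err_leaf = 0.0     # running squared error of this node's own linear model
        self.err_subtree = 0.0  # running squared error of the subtree below this node

    def _phi(self, x):
        # Feature vector with an appended bias term.
        return np.append(np.asarray(x, dtype=float), 1.0)

    def _leaf_predict(self, x):
        w = np.linalg.solve(self.XtX, self.Xty)
        return float(self._phi(x) @ w)

    def predict(self, x):
        if self.split_dim is None:
            return self._leaf_predict(x)
        child = self.left if x[self.split_dim] <= self.split_val else self.right
        return child.predict(x)

    def update(self, x, y, min_split=200, prune_slack=1.05):
        # Prequential error tracking: predict first, then learn from (x, y).
        self.err_leaf += (y - self._leaf_predict(x)) ** 2
        if self.split_dim is not None:
            self.err_subtree += (y - self.predict(x)) ** 2
        p = self._phi(x)
        self.XtX += np.outer(p, p)
        self.Xty += y * p
        self.n += 1
        if self.split_dim is not None:
            child = self.left if x[self.split_dim] <= self.split_val else self.right
            child.update(x, y, min_split, prune_slack)
            # Prune: collapse the subtree unless it clearly outperforms the leaf model.
            if self.n % min_split == 0 and self.err_subtree * prune_slack >= self.err_leaf:
                self.left = self.right = None
                self.split_dim = self.split_val = None
        elif self.n >= min_split and self.n % min_split == 0:
            # Split on the first feature at the current value: a crude placeholder
            # for the approximate-lookahead split selection described in the paper.
            self.split_dim, self.split_val = 0, float(x[0])
            d = self.XtX.shape[0] - 1
            self.left, self.right = Node(d), Node(d)
            self.err_leaf = self.err_subtree = 0.0  # compare both over the same window

# Example: learn from a stream of noisy piecewise-linear data.
rng = np.random.default_rng(0)
tree = Node(dim=2)
for _ in range(5000):
    x = rng.uniform(-1.0, 1.0, size=2)
    y = (2.0 * x[0] if x[1] > 0 else -x[0]) + 0.1 * rng.normal()
    tree.update(x, y)
print(tree.predict(np.array([0.5, 0.5])))
```

The periodic leaf-versus-subtree error comparison is only one simple way to decide when to collapse a subtree; the paper's technique targets the specific setting of approximate lookahead on stationary data sources and should be consulted for the actual pruning criterion.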
