IEEE International Conference on Data Mining

Improving Deep Forest by Confidence Screening

Abstract

Most studies about deep learning are based on neural network models, where many layers of parameterized nonlinear differentiable modules are trained by backpropagation. Recently, it has been shown that deep learning can also be realized by non-differentiable modules trained without backpropagation, an approach called deep forest. Its representation learning process is based on a cascade of cascades of decision tree forests, where the high memory requirement and the high time cost inhibit the training of large models. In this paper, we propose a simple yet effective approach to improve the efficiency of deep forest. The key idea is to pass instances with high confidence directly to the final stage rather than through all the levels. We also provide a theoretical analysis suggesting a means to vary the model complexity from low to high as the level increases in the cascade, which further reduces the memory requirement and time cost. Our experiments show that the proposed approach achieves highly competitive predictive performance while reducing time cost and memory requirement by up to one order of magnitude.
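
The screening idea in the abstract can be sketched in a few lines. Below is a minimal, hypothetical Python illustration (not the authors' code), assuming scikit-learn and NumPy: each cascade level is a single random forest (the paper's cascade uses several forests per level), out-of-bag probability estimates stand in for the paper's cross-validated ones, and a fixed `threshold` replaces the automatically determined cut-off. Instances whose top-class probability clears the threshold receive their final prediction at that level; only the rest are augmented with the class vector and passed on.

```python
# A minimal sketch of confidence screening in a forest cascade.
# Simplifications vs. the paper: one forest per level, OOB confidence
# estimates, and a fixed (hypothetical) threshold.
import numpy as np
from sklearn.ensemble import RandomForestClassifier

class CascadeWithScreening:
    def __init__(self, n_levels=3, threshold=0.9, n_estimators=100):
        self.n_levels = n_levels
        self.threshold = threshold   # hypothetical fixed confidence cut-off
        self.n_estimators = n_estimators
        self.levels_ = []

    def fit(self, X, y):
        self.levels_ = []
        Xi, yi = np.asarray(X), np.asarray(y)
        for _ in range(self.n_levels):
            forest = RandomForestClassifier(
                n_estimators=self.n_estimators, oob_score=True, random_state=0)
            forest.fit(Xi, yi)
            self.levels_.append(forest)
            # Out-of-bag class probabilities approximate held-out confidence.
            proba = np.nan_to_num(forest.oob_decision_function_)
            # Low-confidence instances continue; the rest stop at this level.
            keep = proba.max(axis=1) < self.threshold
            if not keep.any():
                break
            # Augment survivors with the class vector, as in the deep forest
            # cascade, and pass them to the next level.
            Xi = np.hstack([Xi[keep], proba[keep]])
            yi = yi[keep]
        return self

    def predict(self, X):
        Xi = np.asarray(X)
        out = np.empty(Xi.shape[0], dtype=self.levels_[0].classes_.dtype)
        idx = np.arange(Xi.shape[0])
        for li, forest in enumerate(self.levels_):
            proba = forest.predict_proba(Xi)
            # Screen: confident instances take their final prediction here
            # instead of traversing the remaining levels.
            stop = (proba.max(axis=1) >= self.threshold) \
                   | (li == len(self.levels_) - 1)
            out[idx[stop]] = forest.classes_[proba[stop].argmax(axis=1)]
            if stop.all():
                break
            Xi = np.hstack([Xi[~stop], proba[~stop]])
            idx = idx[~stop]
        return out
```

On data where most instances are easy, the bulk of them exit at the first level, which is where the time and memory savings described in the abstract come from; only the hard minority pays the cost of the deeper levels.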