
Deep Cascade of Extra Trees



Abstract

Deep neural networks have recently become popular because of their success in domains such as image and speech recognition, which has led many to wonder whether other learners could benefit from deep, layered architectures. In this paper, we propose the Deep Cascade of Extra Trees (DCET) model. Representation learning in deep neural networks mostly relies on the layer-by-layer processing of raw features. Inspired by this, DCET uses a deep cascade of decision forests, where the cascade at each level receives the best feature information produced by the cascade of forests at the preceding level. Experiments show that its performance is quite robust to hyper-parameter settings; in most cases, even across datasets from different domains, it achieves excellent performance using the same default settings.
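The layer-by-layer structure described in the abstract can be illustrated with a small sketch. This is a hypothetical, gcForest-style cascade built from scikit-learn's `ExtraTreesClassifier`, not the authors' DCET implementation: here each level's forests emit class-probability vectors that are concatenated with the raw features and fed to the next level, which stands in for the paper's "best feature information" passed between levels. All function names and parameter choices (`n_levels`, `n_forests`, estimator counts) are illustrative assumptions.

```python
# Hypothetical sketch of a cascade-of-extra-trees architecture.
# Not the authors' DCET code: each level augments the raw features with
# the class probabilities produced by the previous level's forests.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.ensemble import ExtraTreesClassifier

def fit_cascade(X, y, n_levels=3, n_forests=2, seed=0):
    """Train a simple deep cascade; return the forests of each level."""
    levels, features = [], X
    for level in range(n_levels):
        forests = [
            ExtraTreesClassifier(n_estimators=50,
                                 random_state=seed + 10 * level + i)
            .fit(features, y)
            for i in range(n_forests)
        ]
        levels.append(forests)
        # Concatenate raw features with each forest's class probabilities
        # to form the representation consumed by the next level.
        probs = np.hstack([f.predict_proba(features) for f in forests])
        features = np.hstack([X, probs])
    return levels

def predict_cascade(levels, X):
    """Push samples through every level; average the last level's votes."""
    features = X
    for forests in levels:
        probs = np.hstack([f.predict_proba(features) for f in forests])
        features = np.hstack([X, probs])
    avg = sum(np.hsplit(probs, len(levels[-1]))) / len(levels[-1])
    return avg.argmax(axis=1)

X, y = make_classification(n_samples=200, n_features=10, random_state=0)
levels = fit_cascade(X, y)
preds = predict_cascade(levels, X)
```

One design point worth noting: because each level re-appends the *raw* features alongside the forest outputs, the representation dimension stays bounded, in contrast to deep neural networks where each layer replaces the previous representation entirely.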
