...
Pattern Recognition Letters

Deterministic dropout for deep neural networks using composite random forest



Abstract

Dropout prevents overfitting in deep neural networks. The typical dropout strategy terminates connections at random, irrespective of their importance. Such termination blocks the propagation of class-discriminative information across the network, so dropout may lead to inferior performance. We propose a deterministic dropout in which only unimportant connections are dropped, ensuring that class-discriminative information continues to propagate. We identify the unimportant connections using a novel composite random forest integrated into the network. We prove that better generalization is achieved by terminating these unimportant connections. The proposed algorithm is useful for preventing overfitting on noisy datasets, and it is equally effective on datasets with a small number of training examples. Experiments on several benchmark datasets show up to 8% improvement in classification accuracy. (c) 2020 Elsevier B.V. All rights reserved.
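The abstract does not specify the internals of the composite random forest, so the following is only a minimal sketch of the core idea: score hidden units by random-forest feature importance computed on their activations, then deterministically drop the least important units instead of a random subset. The synthetic activations, the importance criterion (scikit-learn's impurity-based `feature_importances_`), and the drop rate are all illustrative assumptions, not the paper's method.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(0)

# Synthetic "hidden activations": 200 samples x 16 hidden units.
# Units 0-3 carry the class signal; the remaining units are noise.
y = rng.integers(0, 2, size=200)
H = rng.normal(size=(200, 16))
H[:, :4] += y[:, None] * 2.0  # make units 0-3 class-discriminative

# Stand-in for the paper's composite random forest: rank hidden units
# by forest feature importance on their activations w.r.t. the labels.
forest = RandomForestClassifier(n_estimators=50, random_state=0).fit(H, y)
importance = forest.feature_importances_

# Deterministic dropout: zero out the least important half of the units
# (instead of a random subset), so discriminative paths are preserved.
drop_rate = 0.5
k = int(drop_rate * H.shape[1])
drop_idx = np.argsort(importance)[:k]  # indices of k least important units
mask = np.ones(H.shape[1])
mask[drop_idx] = 0.0
H_dropped = H * mask  # masked activations for the forward pass
```

In this sketch the discriminative units survive the mask because their importance scores dominate the noise units; standard random dropout would zero them with probability equal to the drop rate.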
