
Adaptive dropout for training deep neural networks.


Abstract

Recently, it was shown that deep neural networks perform very well if the activities of their hidden units are regularized during learning, e.g., by randomly dropping out 50% of their activities. We describe a method called "standout" in which a binary belief network is overlaid on a neural network and used to regularize its hidden units by selectively setting their activities to zero. This "adaptive dropout network" can be trained jointly with the neural network by approximately computing local expectations of the binary dropout variables and computing derivatives using back-propagation. Interestingly, experiments suggest that a good dropout network regularizes activities according to their magnitude. When evaluated on the MNIST and NORB datasets, we found that our method achieves lower classification error rates than other feature learning methods, including standard dropout and RBMs. We also present discriminative learning results using our method on the MNIST, NORB, and CIFAR-10 datasets.
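The adaptive dropout rule described in the abstract can be sketched as follows. This is an illustrative assumption, not the thesis's exact implementation: here the overlaid dropout network shares the layer's weights, scaled by a hyperparameter `alpha` and shifted by `beta`, so that a unit's keep probability grows with the magnitude of its pre-activation; the ReLU nonlinearity and the test-time scaling by the expected mask are likewise assumptions for the sketch.

```python
import numpy as np

rng = np.random.default_rng(0)

def standout_layer(x, W, b, alpha=1.0, beta=0.0, train=True):
    """One hidden layer with standout-style adaptive dropout.

    The keep probability of unit j is sigmoid(alpha * a_j + beta),
    where a_j is that unit's pre-activation, so strongly activated
    units are dropped less often than in standard 50% dropout.
    """
    a = W @ x + b                    # pre-activations
    h = np.maximum(a, 0.0)           # hidden activities (ReLU, an assumption)
    p_keep = 1.0 / (1.0 + np.exp(-(alpha * a + beta)))  # adaptive keep probs
    if train:
        m = rng.random(p_keep.shape) < p_keep  # sample binary dropout mask
        return h * m
    # At test time, scale by the expected mask instead of sampling.
    return h * p_keep

x = rng.standard_normal(8)
W = rng.standard_normal((4, 8))
b = np.zeros(4)
out = standout_layer(x, W, b)          # stochastic training-mode output
avg = standout_layer(x, W, b, train=False)  # deterministic test-mode output
print(out.shape, avg.shape)
```

With `alpha=0`, every unit is kept with the fixed probability `sigmoid(beta)`, which recovers ordinary dropout (50% when `beta=0`); making the keep probability depend on the pre-activation is what makes the dropout adaptive.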

Record details

  • Author

    Ba, Jimmy Lei

  • Affiliation

    University of Toronto (Canada)

  • Degree grantor: University of Toronto (Canada)
  • Subjects: Artificial intelligence; Statistics; Computer science
  • Degree: M.A.S.
  • Year: 2014
  • Pages: 32 p.
  • Total pages: 32
  • Format: PDF
  • Language: eng
