International Conference on Dependability of Computer Systems

Standard Dropout as Remedy for Training Deep Neural Networks with Label Noise



Abstract

Deep neural networks, trained on large annotated datasets, are often considered universal and easy-to-use tools for obtaining top performance on many computer vision, speech understanding, or language processing tasks. Unfortunately, these data-driven classifiers depend strongly on the quality of the training patterns. Since large datasets often suffer from label noise, the results of training deep neural structures can be unreliable. In this paper, we present an experimental study showing that a simple regularization technique, namely dropout, improves robustness to mislabeled training data and, even in its standard version, can be considered a remedy for label noise. We demonstrate this on the popular MNIST and CIFAR-10 datasets, presenting results obtained for several label-noise probabilities and dropout levels.
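The two ingredients of the study can be illustrated in a few lines: standard (inverted) dropout, which zeroes each activation with probability p at training time and rescales the survivors, and symmetric label noise, which flips a label to a uniformly random other class with some probability. The sketch below, in NumPy, is a minimal illustration of these mechanisms; the function names and the use of a seeded generator are illustrative assumptions, not the paper's implementation.

```python
import numpy as np

rng = np.random.default_rng(0)

def dropout(x, p, train=True):
    """Standard (inverted) dropout: zero each unit with probability p
    and rescale the rest by 1/(1-p), so the expected activation is
    unchanged and no rescaling is needed at test time."""
    if not train or p == 0.0:
        return x
    mask = rng.random(x.shape) >= p
    return x * mask / (1.0 - p)

def corrupt_labels(y, num_classes, noise_prob):
    """Symmetric label noise: with probability noise_prob, replace a
    label with a uniformly random *different* class."""
    y = y.copy()
    flip = rng.random(len(y)) < noise_prob
    # Adding a random offset in [1, num_classes) modulo num_classes
    # guarantees the corrupted label differs from the original.
    y[flip] = (y[flip] + rng.integers(1, num_classes, flip.sum())) % num_classes
    return y
```

In an experiment like the one described, `corrupt_labels` would be applied once to the training labels at a chosen noise probability, while `dropout` acts on hidden activations at every training step; the reported results vary both knobs independently.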


