iDropout: Leveraging Deep Taylor Decomposition for the Robustness of Deep Neural Networks

Abstract

In this work, we present iDropout, a new method that adjusts dropout from purely random dropping of inputs to dropping inputs based on a mix of node relevance and randomness. We use Deep Taylor Decomposition to calculate the relevance of each input and, based on this, give input nodes with higher relevance a higher probability of being kept than input nodes that appear to have less impact. The proposed method not only appears to increase the performance of a neural network, but also seems to make the network more robust to missing data. We evaluated the approach on artificial data under various settings, e.g. noise in the data and the number of informative features, as well as on real-world datasets from the UCI Machine Learning Repository.
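The core idea of the abstract — sampling a dropout mask whose keep probabilities blend per-input relevance with uniform randomness — can be sketched as follows. This is a minimal illustration, not the paper's exact formulation: the function name `idropout_mask`, the `mix` parameter, and the specific blending scheme are assumptions for the sake of the example; the relevance scores would in practice come from Deep Taylor Decomposition.

```python
import numpy as np

def idropout_mask(relevance, base_keep=0.8, mix=0.5, rng=None):
    """Sample a dropout mask whose per-input keep probability mixes
    relevance with uniform randomness (hypothetical sketch).

    relevance : non-negative relevance score per input node,
                e.g. obtained via Deep Taylor Decomposition.
    base_keep : average keep probability, as in standard dropout.
    mix       : 0 -> plain random dropout, 1 -> fully relevance-driven.
    """
    rng = np.random.default_rng() if rng is None else rng
    r = np.asarray(relevance, dtype=float)
    r = r / r.sum()                       # normalise relevance to a distribution
    uniform = np.full_like(r, 1.0 / r.size)
    # Blend the relevance distribution with the uniform one, then scale so
    # the expected fraction of kept nodes is roughly base_keep.
    p_keep = np.clip(r.size * base_keep * ((1 - mix) * uniform + mix * r),
                     0.0, 1.0)
    return (rng.random(r.size) < p_keep).astype(float)

# Inputs with higher relevance are kept more often than with plain dropout:
mask = idropout_mask([0.1, 0.1, 0.9, 0.9], base_keep=0.5, mix=0.8)
```

With `mix=0` this reduces to ordinary dropout with keep probability `base_keep`; increasing `mix` shifts the keep probability toward the more relevant inputs.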
