International Conference on Machine Learning, Optimization, and Data Science

Stochastic Weight Matrix-Based Regularization Methods for Deep Neural Networks



Abstract

The aim of this paper is to introduce two widely applicable regularization methods based on the direct modification of weight matrices. The first method, Weight Reinitialization, utilizes a simplified Bayesian assumption by partially resetting a sparse subset of the parameters. The second one, Weight Shuffling, introduces entropy- and weight-distribution-invariant non-white noise into the parameters. The latter can also be interpreted as an ensemble approach. The proposed methods are evaluated on benchmark datasets such as MNIST, CIFAR-10, and the JSB Chorales database, as well as on time series modeling tasks. We report gains both in performance and in the entropy of the analyzed networks. We have also made our code available as a GitHub repository (https://github.com/rpatrik96/lod-wmm-2019).
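To make the two operations concrete, the following is a minimal sketch of what they could look like in PyTorch: Weight Reinitialization re-samples a sparse random subset of a layer's weights from an initialization distribution, while Weight Shuffling permutes a small subset of existing weights among themselves, leaving the value distribution (and hence its entropy) unchanged. The function names, the choice of Kaiming initialization, and the fraction hyperparameters below are illustrative assumptions, not the authors' implementation; see the linked repository for the actual code.

import torch

def weight_reinit(layer: torch.nn.Linear, reset_fraction: float = 0.01) -> None:
    # Sketch of Weight Reinitialization: re-draw a sparse subset of the
    # layer's weights from an initialization scheme (Kaiming here, as an
    # assumed choice) and leave the rest untouched.
    with torch.no_grad():
        w = layer.weight
        mask = torch.rand_like(w) < reset_fraction      # sparse subset to reset
        fresh = torch.empty_like(w)
        torch.nn.init.kaiming_uniform_(fresh)           # freshly initialized values
        w[mask] = fresh[mask]

def weight_shuffle(layer: torch.nn.Linear, shuffle_fraction: float = 0.01) -> None:
    # Sketch of Weight Shuffling: permute a small subset of weight values
    # among themselves, which perturbs the matrix while preserving its
    # empirical value distribution.
    with torch.no_grad():
        flat = layer.weight.view(-1)                    # view sharing storage with the weights
        n = max(2, int(shuffle_fraction * flat.numel()))
        idx = torch.randperm(flat.numel())[:n]          # positions to shuffle
        flat[idx] = flat[idx[torch.randperm(n)]]        # permute their values in place

# Usage sketch: apply occasionally during training, e.g. once per epoch.
layer = torch.nn.Linear(128, 64)
weight_reinit(layer, reset_fraction=0.02)
weight_shuffle(layer, shuffle_fraction=0.02)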