Decorrelated Batch Normalization


Abstract

Batch Normalization (BN) is capable of accelerating the training of deep models by centering and scaling activations within mini-batches. In this work, we propose Decorrelated Batch Normalization (DBN), which not just centers and scales activations but whitens them. We explore multiple whitening techniques, and find that PCA whitening causes a problem we call stochastic axis swapping, which is detrimental to learning. We show that ZCA whitening does not suffer from this problem, permitting successful learning. DBN retains the desirable qualities of BN and further improves BN's optimization efficiency and generalization ability. We design comprehensive experiments to show that DBN can improve the performance of BN on multilayer perceptrons and convolutional neural networks. Furthermore, we consistently improve the accuracy of residual networks on CIFAR-10, CIFAR-100, and ImageNet.
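The abstract's key idea is that DBN not only centers and scales activations (as BN does) but fully decorrelates them via ZCA whitening, i.e. multiplying the centered activations by the inverse square root of their covariance matrix. A minimal NumPy sketch of ZCA whitening on a mini-batch (the function name, shapes, and `eps` stabilizer are illustrative assumptions, not the paper's implementation):

```python
import numpy as np

def zca_whiten(x, eps=1e-5):
    """ZCA-whiten a mini-batch x of shape (batch, features):
    center it, then multiply by Sigma^{-1/2}, the inverse square
    root of the batch covariance. Unlike PCA whitening, ZCA rotates
    back into the original basis (V diag(l^-1/2) V^T), avoiding the
    stochastic axis swapping the paper describes."""
    xc = x - x.mean(axis=0, keepdims=True)                 # center
    sigma = xc.T @ xc / x.shape[0]                         # batch covariance
    # eigendecomposition; eps keeps small eigenvalues stable
    vals, vecs = np.linalg.eigh(sigma + eps * np.eye(x.shape[1]))
    w = vecs @ np.diag(vals ** -0.5) @ vecs.T              # Sigma^{-1/2} (ZCA)
    return xc @ w

rng = np.random.default_rng(0)
x = rng.normal(size=(256, 4)) @ rng.normal(size=(4, 4))   # correlated features
y = zca_whiten(x)
cov = y.T @ y / y.shape[0]                                 # ~ identity after whitening
```

PCA whitening would instead return `xc @ vecs @ np.diag(vals ** -0.5)`, projecting onto eigenvector axes whose ordering can flip between mini-batches; ZCA's extra rotation back (`@ vecs.T`) is what the abstract credits with permitting successful learning.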

