
Batch Normalization in Convolutional Neural Networks — A comparative study with CIFAR-10 data



Abstract

Deep learning is an emerging field of computational science that involves large quantities of data for training a model. In this paper, we perform a comparative study of several state-of-the-art convolutional networks, viz. DenseNet, VGG, Inception (v3) Network, and Residual Network, with different activation functions, and demonstrate the importance of Batch Normalization. We show that Batch Normalization is not only important for improving the performance of neural networks, but is essential for being able to train deep convolutional networks at all. In this work the state-of-the-art convolutional neural networks, viz. DenseNet, VGG, Residual Network, and Inception (v3) Network, are compared on a standard dataset, CIFAR-10, with batch normalization for 200 epochs. The conventional ReLU activation results in accuracies of 82.68%, 88.79%, 81.01%, and 84.92%, respectively. With ELU activation, the performance of the Residual and VGG networks increases to 84.59% and 89.91%, respectively, which is highly significant.
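
The following is a minimal PyTorch sketch (not the authors' code) of the Conv -> BatchNorm -> activation pattern whose effect the paper compares; the toy network, channel sizes, and the choice of PyTorch itself are illustrative assumptions. The only change between the ReLU and ELU variants is the activation passed to each block.

```python
# Minimal sketch of a Conv -> BatchNorm -> activation block, assuming PyTorch.
# Network depth and channel sizes are illustrative, not the paper's models.
import torch
import torch.nn as nn


def conv_bn_act(in_ch, out_ch, activation="relu"):
    """3x3 convolution followed by batch normalization and ReLU or ELU."""
    act = nn.ReLU(inplace=True) if activation == "relu" else nn.ELU(inplace=True)
    return nn.Sequential(
        nn.Conv2d(in_ch, out_ch, kernel_size=3, padding=1, bias=False),
        # BatchNorm2d normalizes each channel over the mini-batch:
        # y = gamma * (x - mean) / sqrt(var + eps) + beta
        nn.BatchNorm2d(out_ch),
        act,
    )


class SmallCIFARNet(nn.Module):
    """Toy CNN for 32x32 CIFAR-10 images; a stand-in for DenseNet/VGG/ResNet/Inception."""

    def __init__(self, activation="relu"):
        super().__init__()
        self.features = nn.Sequential(
            conv_bn_act(3, 32, activation),
            conv_bn_act(32, 64, activation),
            nn.MaxPool2d(2),              # 32x32 -> 16x16
            conv_bn_act(64, 128, activation),
            nn.AdaptiveAvgPool2d(1),      # global average pooling
        )
        self.classifier = nn.Linear(128, 10)

    def forward(self, x):
        x = self.features(x)
        return self.classifier(torch.flatten(x, 1))


# Swapping the activation is the only difference between the two variants:
model_relu = SmallCIFARNet(activation="relu")
model_elu = SmallCIFARNet(activation="elu")
```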

