International Conference on Emerging Applications of Information Technology

Batch Normalization in Convolutional Neural Networks — A comparative study with CIFAR-10 data


Abstract

Deep learning is an emerging field of computational science that involves large quantities of data for training a model. In this paper, we perform a comparative study of various state-of-the-art convolutional networks, viz. DenseNet, VGG, Inception (v3) Network, and Residual Network, with different activation functions, and demonstrate the importance of Batch Normalization. It is shown that Batch Normalization is not only important in improving the performance of neural networks, but is essential for being able to train deep convolutional networks at all. In this work, the state-of-the-art convolutional neural networks, viz. DenseNet, VGG, Residual Network, and Inception (v3) Network, are compared on a standard dataset, CIFAR-10, with batch normalization for 200 epochs. The conventional ReLU activation yields accuracies of 82.68%, 88.79%, 81.01%, and 84.92%, respectively. With ELU activation, the performance of the Residual and VGG networks increases to 84.59% and 89.91%, respectively, which is highly significant.
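The convolution, batch normalization, activation pattern that the abstract compares can be illustrated with a minimal PyTorch sketch. PyTorch is an assumption here; the paper does not state its framework, and this is not the authors' training code, only the generic building block that all four architectures share:

```python
import torch
import torch.nn as nn

class ConvBNBlock(nn.Module):
    """Hypothetical Conv -> BatchNorm -> activation block, illustrating
    the pattern compared in the paper; the actual DenseNet, VGG, ResNet,
    and Inception (v3) architectures are not reproduced here."""

    def __init__(self, in_ch: int, out_ch: int, activation: str = "relu"):
        super().__init__()
        # bias=False because the following BatchNorm has its own shift term
        self.conv = nn.Conv2d(in_ch, out_ch, kernel_size=3, padding=1, bias=False)
        # BatchNorm2d normalizes each channel over the mini-batch
        self.bn = nn.BatchNorm2d(out_ch)
        # The paper compares ReLU against ELU activations
        self.act = nn.ReLU(inplace=True) if activation == "relu" else nn.ELU(inplace=True)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.act(self.bn(self.conv(x)))

# CIFAR-10 images are 3x32x32; a toy forward pass:
x = torch.randn(8, 3, 32, 32)
block = ConvBNBlock(3, 64, activation="elu")
print(block(x).shape)  # torch.Size([8, 64, 32, 32])
```

Swapping the `activation` argument between "relu" and "elu" mirrors the comparison the paper performs across the four networks, with batch normalization kept in place in both cases.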
