International Journal of Computational Science and Engineering

Performance analysis of nonlinear activation function in convolution neural network for image classification

Abstract

Very deep learning architectures have proven to be incredibly powerful models for image processing. As architectures become deeper, the training process faces challenges such as overfitting, computational cost, exploding/vanishing gradients, and degradation. A state-of-the-art densely connected architecture, called DenseNets, has achieved exceptionally strong results for image classification. However, DenseNets are still computationally costly to train. The choice of activation function is also an important aspect of training deep learning networks, because it has a considerable impact on the training and performance of a network model. Therefore, an empirical analysis of several nonlinear activation functions used in deep learning is carried out for image classification. The activation functions evaluated include ReLU, Leaky ReLU, ELU, SELU, and an ensemble of SELU and ELU. The publicly available datasets CIFAR-10, SVHN, and PlantVillage are used for evaluation.
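For reference, the activation functions named in the abstract have the following standard textbook definitions (a minimal NumPy sketch, not code from the paper itself; the SELU constants follow the commonly cited self-normalizing-network values):

```python
import numpy as np

def relu(x):
    # Rectified Linear Unit: zero for negative inputs, identity otherwise.
    return np.maximum(0.0, x)

def leaky_relu(x, alpha=0.01):
    # Allows a small non-zero slope (alpha) for negative inputs,
    # mitigating "dead" units.
    return np.where(x > 0, x, alpha * x)

def elu(x, alpha=1.0):
    # Exponential Linear Unit: smooth saturation toward -alpha
    # for large negative inputs.
    return np.where(x > 0, x, alpha * (np.exp(x) - 1.0))

def selu(x, lam=1.0507, alpha=1.6733):
    # Scaled ELU; lam and alpha are chosen so that activations
    # self-normalize toward zero mean and unit variance.
    return lam * np.where(x > 0, x, alpha * (np.exp(x) - 1.0))
```

All four agree on positive inputs up to the SELU scale factor; they differ only in how they treat negative inputs, which is what drives the gradient-flow differences the study compares.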
