Neurocomputing

Deep neural networks with Elastic Rectified Linear Units for object recognition



Abstract

Rectified Linear Unit (ReLU) is crucial to the recent success of deep neural networks (DNNs). In this paper, we propose a novel Elastic Rectified Linear Unit (EReLU) that focuses on processing the positive part of the input. Unlike previous variants of ReLU, which typically adopt linear or piecewise linear functions to represent the positive part, EReLU is characterized by the fact that, during training, each positive value is scaled within a moderate range, like a spring. At test time, EReLU becomes the standard ReLU. EReLU improves model fitting with no extra parameters and little risk of overfitting. Furthermore, we propose the Elastic Parametric Rectified Linear Unit (EPReLU) by combining EReLU with the parametric ReLU (PReLU). EPReLU is able to further improve the performance of networks. In addition, we present a new training strategy for DNNs with EPReLU. Experiments on four benchmarks, CIFAR10, CIFAR100, SVHN and ImageNet 2012, demonstrate the effectiveness of both EReLU and EPReLU. (C) 2017 Elsevier B.V. All rights reserved.
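The abstract describes the mechanism only qualitatively, but the behaviour it specifies (random, spring-like scaling of the positive part during training; plain ReLU at test time) can be sketched in a few lines. Below is a minimal PyTorch sketch, assuming the elastic factor is drawn uniformly from [1 - alpha, 1 + alpha] per activation at each forward pass; the class names, the hyperparameter alpha, and the sampling scheme are illustrative assumptions, not the paper's exact formulation.

```python
import torch
import torch.nn as nn

class EReLU(nn.Module):
    """Elastic ReLU sketch: during training, each positive input is
    scaled by a random factor k ~ U[1 - alpha, 1 + alpha] (the
    "spring"); at test time k = 1, so the unit reduces to ReLU."""

    def __init__(self, alpha: float = 0.2):
        super().__init__()
        self.alpha = alpha  # half-width of the elastic range (assumed value)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        pos = torch.clamp(x, min=0.0)
        if self.training:
            # One random factor per activation, resampled every forward pass.
            k = 1.0 + self.alpha * (2.0 * torch.rand_like(x) - 1.0)
            return pos * k
        return pos  # test time: standard ReLU

class EPReLU(nn.Module):
    """EPReLU sketch: elastic scaling on the positive part (EReLU)
    plus a learnable slope on the negative part, as in PReLU."""

    def __init__(self, alpha: float = 0.2, init_a: float = 0.25):
        super().__init__()
        self.erelu = EReLU(alpha)
        self.a = nn.Parameter(torch.tensor(init_a))  # PReLU-style negative slope

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.erelu(x) + self.a * torch.clamp(x, max=0.0)
```

Calling .eval() on either module makes the elastic factor collapse to 1, so EReLU reduces exactly to ReLU and EPReLU to a PReLU, matching the train/test behaviour stated in the abstract.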
