FReLU: Flexible Rectified Linear Units for Improving Convolutional Neural Networks

Abstract

Rectified linear unit (ReLU) is a widely used activation function for deep convolutional neural networks. However, because of its hard rectification at zero, a ReLU network loses the benefits of negative activation values. In this paper, we propose a novel activation function called flexible rectified linear unit (FReLU) to further explore the effects of negative values. By redesigning the rectified point of ReLU as a learnable parameter, FReLU expands the states of the activation output. When a network is successfully trained, FReLU tends to converge to a negative value, which improves expressiveness and thus performance. Furthermore, FReLU is designed to be simple and effective, avoiding exponential functions to keep computation low-cost. Because it is self-adaptive, FReLU does not rely on strict assumptions and can be easily used in various network architectures. We evaluate FReLU on three standard image classification datasets, including CIFAR-10, CIFAR-100, and ImageNet. Experimental results show that FReLU achieves fast convergence and competitive performance on both plain and residual networks.
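To illustrate the idea described in the abstract, the following is a minimal PyTorch sketch that treats the rectified point as a learnable shift, i.e. it assumes FReLU(x) = ReLU(x) + b with a learnable bias b. The per-channel parameterization, the negative initialization, and the module name are illustrative assumptions, not details taken from the paper text above.

# Minimal sketch of FReLU as described in the abstract (assumed form: ReLU(x) + b).
import torch
import torch.nn as nn

class FReLU(nn.Module):
    def __init__(self, num_channels: int):
        super().__init__()
        # One learnable rectified point per channel (assumed granularity),
        # initialized slightly negative so the output can take negative values.
        self.bias = nn.Parameter(torch.full((num_channels,), -0.25))

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # Shift the ReLU output by the learnable bias; when the bias
        # converges to a negative value, negative activations are possible.
        b = self.bias.view(1, -1, 1, 1)  # broadcast over (N, C, H, W)
        return torch.relu(x) + b

# Usage: a drop-in replacement for nn.ReLU after a convolution.
layer = nn.Sequential(nn.Conv2d(3, 16, 3, padding=1), FReLU(16))
y = layer(torch.randn(2, 3, 32, 32))

Since the shift adds no exponential terms, this sketch keeps the same computational cost profile as plain ReLU, which is consistent with the low-cost design goal stated in the abstract.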