
PFLU and FPFLU: Two novel non-monotonic activation functions in convolutional neural networks



Abstract

The choice of activation function in Convolutional Neural Networks (CNNs) is very important. The Rectified Linear Unit (ReLU) has been widely used in most CNNs. Recently, a series of non-monotonic activation functions has gradually become the new standard for enhancing the performance of CNNs. Inspired by them, this paper first proposes a novel non-monotonic activation function called the Power Function Linear Unit (PFLU). The negative part of PFLU is non-monotonic and approaches zero as the negative input decreases, which maintains sparsity of the negative part while introducing negative activation values and non-zero derivative values for negative inputs. The positive part of PFLU is not the identity mapping but approaches it as the positive input increases, which introduces non-linearity into the positive part. Next, this paper proposes the faster PFLU (FPFLU). A wide range of classification experiments shows that PFLU tends to work better than current state-of-the-art non-monotonic activation functions, and that FPFLU runs faster than most non-monotonic activation functions. (C) 2020 Elsevier B.V. All rights reserved.
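For reference, below is a minimal NumPy sketch of an activation with the behaviour the abstract describes. The closed form used here, PFLU(x) = x * (1 + x / sqrt(1 + x^2)) / 2, is the expression commonly reported for this paper, so treat it as an assumption rather than a verified transcription of the original; FPFLU is omitted because its exact formula is not given in this record.

    import numpy as np

    def pflu(x):
        """Sketch of the Power Function Linear Unit (PFLU).

        Assumed closed form: PFLU(x) = x * (1 + x / sqrt(1 + x**2)) / 2.
        Negative part: non-monotonic, tends to 0 as x -> -inf
        (near-sparse outputs with non-zero derivatives).
        Positive part: not the identity, but approaches x as x -> +inf.
        """
        return x * (1.0 + x / np.sqrt(1.0 + x * x)) / 2.0

    if __name__ == "__main__":
        xs = np.array([-10.0, -2.0, -0.5, 0.0, 0.5, 2.0, 10.0])
        print(np.round(pflu(xs), 4))
        # Large negative inputs map close to 0; large positive inputs map close to x.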

Bibliographic information

  • Source
    Neurocomputing | 2021, No. 14 | pp. 110-117 | 8 pages
  • Author affiliations

    Nanchang Univ, Sch Informat Engn, Nanchang 330031, Jiangxi, Peoples R China;

    Nanchang Univ, Sch Software, Nanchang 330047, Jiangxi, Peoples R China | Jiangxi Key Lab Smart City, Nanchang 330047, Jiangxi, Peoples R China;

    Nanchang Univ, Sch Informat Engn, Nanchang 330031, Jiangxi, Peoples R China;

    Nanchang Univ, Sch Informat Engn, Nanchang 330031, Jiangxi, Peoples R China;

    Nanchang Univ, Sch Informat Engn, Nanchang 330031, Jiangxi, Peoples R China;

  • Indexing: Science Citation Index (SCI); Engineering Index (EI);
  • Original format: PDF
  • Language: English
  • Chinese Library Classification (CLC)
  • Keywords

    Convolutional Neural Network (CNN); Activation function; ReLU; Power Function Linear Unit (PFLU); Faster PFLU (FPFLU);

