Proceedings: Intelligent Robots and Computer Vision X: Neural, Biological, and 3-D Methods

Efficient activation functions for the back-propagation neural network



Abstract

The back-propagation algorithm is the most common algorithm in use in artificial neural network research. The standard activation (transfer) function is the logistic function s(x) = 1/(1 + exp(-x)). The derivative of this function is used in correcting the error signals for updating the coefficients of the network. The maximum value of the derivative is only 0.25, which yields slow convergence. A new family of activation functions is proposed, whose derivatives belong to the sech^n(x) family for n = 1, 2, .... The maximum value of the derivative varies from 0.637 to 1.875 for n = 1 to 6, so a member of the activation-function family can be selected to suit the problem. Results of using this family of activation functions show orders-of-magnitude savings in computation. A discrete version of these functions is also proposed for efficient implementation. For the parity-8 problem with 16 hidden units, the new activation function f_3 needs 300 epochs for learning, compared with 500,000 epochs for the standard activation function.
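As a rough illustration of the quoted derivative maxima: the abstract does not give the family's closed form, but one reading consistent with its figures is that each f_n is the antiderivative of sech^n(x), rescaled to saturate at +/-1. Then f_n'(0) = 1/I_n with I_n the integral of sech^n(t) from 0 to infinity, which obeys the standard reduction formula I_n = (n-2)/(n-1) * I_{n-2}. The Python sketch below (function name peak_slope is illustrative, not from the paper) reproduces the 0.637 to 1.875 range under that assumption:

import math

# Peak slope f_n'(0) of the proposed activations, assuming each f_n is
# the antiderivative of sech^n(x) normalized to saturate at +/-1 (an
# assumed reading; the abstract only says the derivatives form the
# sech^n family). Then f_n'(0) = 1 / I_n, where I_n is the integral of
# sech^n(t) over [0, inf), with I_1 = pi/2, I_2 = 1, and the reduction
# formula I_n = (n-2)/(n-1) * I_{n-2}.

def peak_slope(n):
    i = math.pi / 2 if n % 2 else 1.0           # start from I_1 or I_2
    for k in range(3 if n % 2 else 4, n + 1, 2):
        i *= (k - 2) / (k - 1)                  # apply reduction up to I_n
    return 1.0 / i

for n in range(1, 7):
    print(f"n = {n}: peak slope = {peak_slope(n):.3f}")
# Prints 0.637, 1.000, 1.273, 1.500, 1.698, 1.875 -- the abstract's range.

Under this reading, n = 2 reduces to f_2(x) = tanh(x), whose derivative sech^2(x) peaks at 1.0, four times the logistic derivative's maximum of 0.25; larger n raises the peak slope further, up to 15/8 = 1.875 at n = 6.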
