Source: 《传感器与微系统》 (Transducer and Microsystem Technologies)

Research on Optimization of the ReLU Activation Function


Abstract

The gated recurrent unit (GRU) is an improved variant of the long short-term memory (LSTM) architecture that effectively alleviates the long training time of LSTM. On the basis of the GRU architecture, the performance of activation functions such as sigmoid, tanh, and the rectified linear unit (ReLU) is compared and studied, their respective advantages and disadvantages are analyzed in detail, and a new activation function, the hyperbolic tangent linear unit (TLU), is proposed. Experiments show that the new activation function not only significantly speeds up the training of deep neural networks but also effectively reduces the training error.
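To make the comparison concrete, the short Python/NumPy sketch below illustrates the activation functions named in the abstract. The TLU definition used here (identity for non-negative inputs, tanh for negative inputs) is only an assumption for illustration; the abstract does not give the exact formula, and the function names are chosen for this sketch rather than taken from the paper.

import numpy as np

def sigmoid(x):
    # Logistic activation; saturates for large |x|.
    return 1.0 / (1.0 + np.exp(-x))

def tanh(x):
    # Zero-centered, but also saturates for large |x|.
    return np.tanh(x)

def relu(x):
    # Rectified linear unit: non-saturating for x > 0,
    # but outputs exactly 0 for x <= 0 (risk of "dead" units).
    return np.maximum(0.0, x)

def tlu(x):
    # Assumed TLU sketch: identity for x >= 0, tanh for x < 0,
    # i.e. a bounded, smooth negative branch combined with the
    # non-saturating linear positive branch of ReLU.
    return np.where(x >= 0, x, np.tanh(x))

if __name__ == "__main__":
    xs = np.linspace(-3, 3, 7)
    for name, fn in [("sigmoid", sigmoid), ("tanh", tanh),
                     ("ReLU", relu), ("TLU", tlu)]:
        print(name, np.round(fn(xs), 3))

Under this assumed definition, TLU keeps ReLU's non-saturating behavior for positive inputs while replacing the hard zero on the negative side with a bounded, smooth tanh response.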
