International Congress on Human-Computer Interaction, Optimization and Robotic Applications

Comparative Analysis of Activation Functions Used in the Hidden Layers of Deep Neural Networks

Abstract

The development of neural networks opens up opportunities for the use of many activation functions, each of which has its own specific features. This raises questions about how compatible the different activation functions are and whether exchanging them affects the operation of a neural network. The article reviews the design, training and study of a Deep Neural Network applied to curve recognition. Three popular activation functions used in the hidden layers are analysed: the sigmoid function (Sigmoid), the hyperbolic tangent (tanh) and the rectified linear unit (ReLU). The results of this study will be useful in the design of Deep Neural Networks and in the selection of activation functions.
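The abstract does not specify the network used, but a minimal sketch of the kind of comparison it describes is given below: the three activation functions are defined, and otherwise identical models are built that differ only in the hidden-layer activation. The two-hidden-layer architecture, the layer width, the build_model helper and the use of Keras are illustrative assumptions, not details taken from the paper.

import numpy as np
from tensorflow import keras

# The three hidden-layer activations compared in the study.
def sigmoid(x):
    # Sigmoid: 1 / (1 + e^(-x)); maps input to (0, 1), saturates for large |x|.
    return 1.0 / (1.0 + np.exp(-x))

def tanh(x):
    # Hyperbolic tangent: maps input to (-1, 1); zero-centred.
    return np.tanh(x)

def relu(x):
    # Rectified linear unit: max(0, x); passes positive input, zeroes the rest.
    return np.maximum(0.0, x)

# Hypothetical experiment scaffold: identical networks that differ only in
# the hidden-layer activation, so any difference in behaviour is attributable
# to the activation choice alone.
def build_model(activation, input_dim=2, hidden_units=16):
    return keras.Sequential([
        keras.layers.Input(shape=(input_dim,)),
        keras.layers.Dense(hidden_units, activation=activation),
        keras.layers.Dense(hidden_units, activation=activation),
        keras.layers.Dense(1, activation="sigmoid"),  # output layer kept fixed
    ])

for act in ("sigmoid", "tanh", "relu"):
    model = build_model(act)
    model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])
    # model.fit(...) would then be run on the curve-recognition data.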