IT Professional

Sigmoid and Beyond: Algebraic Activation Functions for Artificial Neural Networks Based on Solutions of a Riccati Equation


Abstract

Activation functions play a key role in neural networks, as they significantly affect the training process and the network's performance. Based on the solution of an ordinary differential equation of the Riccati type, this work proposes a generalized, adaptive alternative to the fixed sigmoid, called the "generalized Riccati activation" (GRA). The proposed GRA function was employed in the output layer of an artificial neural network with a single hidden layer of eight neurons. The network's performance was evaluated on a binary and a multiclass classification problem using different combinations of activation functions in the hidden and output layers. The results demonstrated that the swish/GRA combination yields higher accuracy than any other combination tested. This accuracy benefit could be critical in domains such as healthcare and smart grids, where AI-assisted decisions are becoming essential.
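The abstract does not reproduce the exact GRA parameterization, but the connection to the Riccati equation can be illustrated: the standard sigmoid s(x) = 1/(1 + e^(-x)) solves the Riccati-type ODE y' = y(1 - y), and the generalized logistic Riccati equation y' = k*y*(a - y) has the solution y(x) = a/(1 + e^(-a*k*x)), which reduces to the sigmoid for a = k = 1. The sketch below assumes this illustrative parameterization (the parameters a and k, the four input features, and the random weights are placeholders, not the paper's actual values) and mirrors the described setup: one hidden layer of eight neurons with swish, and a GRA-style output unit for binary classification.

    import numpy as np

    def swish(x):
        """Swish activation: x * sigmoid(x)."""
        return x / (1.0 + np.exp(-x))

    def gra(x, a=1.0, k=1.0):
        """Illustrative generalized Riccati activation (assumed form).

        The Riccati-type ODE y' = k*y*(a - y) has the solution
        y(x) = a / (1 + exp(-a*k*x)), which reduces to the standard
        sigmoid for a = k = 1. The paper's exact GRA may differ;
        'a' and 'k' here stand in for adaptive shape parameters.
        """
        return a / (1.0 + np.exp(-a * k * x))

    # Forward pass of the architecture described in the abstract:
    # one hidden layer of eight swish neurons, GRA output unit.
    rng = np.random.default_rng(0)
    W1, b1 = rng.normal(size=(4, 8)), np.zeros(8)  # 4 input features (illustrative)
    W2, b2 = rng.normal(size=(8, 1)), np.zeros(1)

    def forward(x):
        h = swish(x @ W1 + b1)        # hidden layer: swish
        return gra(h @ W2 + b2)      # output layer: GRA

    print(forward(rng.normal(size=(2, 4))))  # two sample predictions in (0, a)

With a = k = 1 the GRA above collapses to the standard sigmoid, which is the sense in which a Riccati solution generalizes the fixed sigmoid into an adaptive family.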
