Journal: Neural Computation

Relating the Slope of the Activation Function and the Learning Rate Within a Recurrent Neural Network


Abstract

A relationship is provided between the learning rate η in the learning algorithm and the slope β in the nonlinear activation function, for a class of recurrent neural networks (RNNs) trained by the real-time recurrent learning algorithm. It is shown that an arbitrary RNN can be obtained via the referent RNN, with some deterministic rules imposed on its weights and the learning rate. Such relationships reduce the number of degrees of freedom when solving the nonlinear optimization task of finding the optimal RNN parameters.
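The kind of equivalence the abstract describes can be illustrated in the simplest (static, single-neuron) setting: a neuron with activation slope β, weights w, and learning rate η follows the same trajectory as a referent neuron with unit slope, weights βw, and learning rate β²η. The sketch below is only a numerical illustration of that static case under gradient descent with a tanh activation; the paper's actual contribution concerns full RNNs trained by real-time recurrent learning, and the data and parameter values here are made up for the demonstration.

```python
import math

def train_neuron(beta, eta, w, data, epochs=20):
    """Single tanh neuron y = tanh(beta * w * x), trained by plain
    gradient descent on squared error E = 0.5 * (t - y)^2.
    Returns the final weight."""
    for _ in range(epochs):
        for x, t in data:
            y = math.tanh(beta * w * x)
            # d tanh(u)/du = 1 - tanh(u)^2, and du/dw = beta * x
            grad = -(t - y) * (1.0 - y * y) * beta * x
            w -= eta * grad
    return w

# Illustrative data and parameters (not from the paper)
data = [(0.5, 0.3), (-1.0, -0.6), (0.8, 0.4)]
beta, eta, w0 = 2.0, 0.1, 0.25

# Original net: slope beta, learning rate eta, initial weight w0
w_beta = train_neuron(beta, eta, w0, data)
# Referent net: unit slope, rate beta^2 * eta, initial weight beta * w0
w_ref = train_neuron(1.0, beta ** 2 * eta, beta * w0, data)

# The referent weight stays equal to beta times the original weight
# after every update, so both nets compute the same function throughout.
print(abs(beta * w_beta - w_ref))  # ~0, up to floating-point error
```

The deterministic rule here (scale the weights by β, the learning rate by β²) is exactly the kind of weight/learning-rate constraint the abstract refers to: it removes β as an independent degree of freedom from the optimization.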
