Journal: Circuits, Systems, and Signal Processing

A Novel Fractional Gradient-Based Learning Algorithm for Recurrent Neural Networks



Abstract

In this research, we propose a novel algorithm for training recurrent neural networks, called fractional back-propagation through time (FBPTT). Exploiting the potential of the fractional calculus, we derive the FBPTT algorithm from a fractional calculus-based gradient descent method. The proposed FBPTT method is shown to outperform the conventional back-propagation through time algorithm on three major estimation problems: nonlinear system identification, pattern classification, and Mackey-Glass chaotic time series prediction.
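The core idea described above, replacing the integer-order gradient in the descent update with a fractional-order one, can be sketched on a scalar toy problem. This is a hedged illustration, not the paper's FBPTT derivation: the one-term Caputo-style approximation, the function names, and the quadratic toy loss are all assumptions made here for demonstration.

```python
import math

def caputo_frac_grad(grad, w, w_prev, alpha):
    """One-term Caputo-style approximation of the order-alpha fractional
    gradient, using the previous iterate as the lower terminal (assumed form):
        D^alpha L(w) ~= L'(w) * |w - w_prev|**(1 - alpha) / Gamma(2 - alpha)
    """
    return grad * abs(w - w_prev) ** (1.0 - alpha) / math.gamma(2.0 - alpha)

def train(alpha=0.9, lr=0.1, steps=200):
    """Minimise the toy loss L(w) = (w - 3)^2 by fractional gradient descent."""
    w, w_prev = 10.0, 9.5              # current and previous iterate
    for _ in range(steps):
        grad = 2.0 * (w - 3.0)         # ordinary (integer-order) gradient
        step = lr * caputo_frac_grad(grad, w, w_prev, alpha)
        w_prev, w = w, w - step
    return w

print(train())  # converges near the minimiser w = 3
```

For alpha = 1 the fractional factor reduces to 1 and the update falls back to ordinary gradient descent; values of alpha below 1 rescale each step by a fractional power of the most recent displacement.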
