Neurocomputing

An exponential-enhanced-type varying-parameter RNN for solving time-varying matrix inversion


Abstract

In order to compute time-varying matrix inversion faster, a novel exponential-enhanced-type varying-parameter recurrent neural network (EVP-RNN) is proposed and investigated in this paper. First, the detailed design process of the proposed EVP-RNN is presented. Then, mathematical analysis proves that the proposed EVP-RNN possesses a superior exponential convergence property compared with the conventional fixed-parameter recurrent neural network (FP-RNN) under four kinds of specific activation functions, and a guideline for choosing an activation function is provided to achieve better convergence. Third, theoretical analysis shows that the upper bounds of the calculation error of the EVP-RNN are always smaller than those of the FP-RNN, and that the actual calculation error of the EVP-RNN always converges faster than that of the FP-RNN. An idea for designing the time-varying parameter is also given. Finally, the results of comparative simulations verify the effectiveness, high accuracy, and superiority of the EVP-RNN over the traditional FP-RNN for solving time-varying matrix inversion. (c) 2019 Elsevier B.V. All rights reserved.
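The abstract does not give the network's exact formulation, but varying-parameter RNNs of this type typically follow the Zhang-neural-network-style design: define the error E(t) = A(t)X(t) - I and force it to decay through the implicit dynamics A(t)dX/dt = -dA/dt X(t) - gamma(t)Phi(E(t)), with a design parameter gamma(t) that grows over time. The sketch below is a minimal numerical illustration under these assumptions only; the exponential parameter gamma(t) = gamma0*exp(t), the linear activation Phi, and the example matrix A(t) are illustrative choices, not the paper's exact design.

```python
# Minimal sketch of a ZNN-style varying-parameter RNN for time-varying matrix
# inversion, under assumed forms of gamma(t), Phi, and A(t) (not the paper's own).
import numpy as np

def A(t):
    # Example time-varying matrix whose inverse is tracked (illustrative only).
    return np.array([[2.0 + np.sin(t), np.cos(t)],
                     [-np.cos(t),      2.0 + np.sin(t)]])

def dA(t, h=1e-6):
    # Numerical time derivative of A(t).
    return (A(t + h) - A(t - h)) / (2.0 * h)

def phi(E):
    # Monotonically increasing odd activation; the paper studies several
    # specific activation functions, a linear map is used here as a placeholder.
    return E

def evp_rnn_inverse(T=2.0, dt=1e-4, gamma0=1.0):
    """Integrate A(t) dX/dt = -dA(t) X - gamma(t) Phi(A(t) X - I) by Euler steps."""
    n = A(0.0).shape[0]
    X = np.eye(n)                        # arbitrary initial state
    t = 0.0
    while t < T:
        gamma = gamma0 * np.exp(t)       # exponential-enhanced varying parameter (assumed form)
        E = A(t) @ X - np.eye(n)         # inversion error E(t) = A(t)X(t) - I
        dX = np.linalg.solve(A(t), -dA(t) @ X - gamma * phi(E))
        X += dt * dX
        t += dt
    return X, np.linalg.norm(A(T) @ X - np.eye(n))

if __name__ == "__main__":
    X, residual = evp_rnn_inverse()
    print("residual ||A(T)X - I||_F =", residual)
```

With a fixed parameter (gamma(t) = gamma0) the residual decays only at a constant exponential rate, whereas the growing gamma(t) drives the error toward zero increasingly fast, which is the qualitative behavior the abstract attributes to the EVP-RNN over the FP-RNN.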


