Journal: Neural Processing Letters

Accelerating a Recurrent Neural Network to Finite-Time Convergence for Solving Time-Varying Sylvester Equation by Using a Sign-Bi-power Activation Function



Abstract

The Bartels-Stewart algorithm is an effective and widely used method, with O(n^3) time complexity, for solving a static Sylvester equation. When applied to the time-varying Sylvester equation, however, its computational burden increases sharply as the sampling period decreases, and it cannot satisfy continuous real-time calculation requirements. Gradient-based recurrent neural networks are able to solve the time-varying Sylvester equation in real time, but there always exists an estimation error. In contrast, the recently proposed Zhang neural network has been proven to converge to the solution of the Sylvester equation ideally as time goes to infinity. However, this neural network with the suggested activation functions never converges to the desired value in finite time, which may limit its applications in real-time processing. To tackle this problem, a sign-bi-power activation function is proposed in this paper to accelerate the Zhang neural network to finite-time convergence. The global convergence and finite-time convergence properties are proven in theory, and an upper bound on the convergence time is derived analytically. Simulations are performed to evaluate the performance of the neural network with the proposed activation function. In addition, the proposed strategy is applied to the online calculation of the pseudo-inverse of a matrix and to the nonlinear control of an inverted pendulum system. Both theoretical analysis and numerical simulations validate the effectiveness of the proposed activation function.
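The record contains no code, but the design described in the abstract can be illustrated with a small numerical sketch. The Python snippet below is not the paper's implementation; it assumes the common Zhang-neural-network formulation with error E(t) = A(t)X(t) - X(t)B(t) + C(t), the design formula dE/dt = -gamma*Phi(E), and a sign-bi-power activation of the (assumed) form phi(x) = 0.5|x|^r sgn(x) + 0.5|x|^(1/r) sgn(x) with 0 < r < 1. The coefficient matrices, gamma, r, and the Euler discretization are illustrative choices, not the paper's simulation settings.

import numpy as np

def sign_bi_power(E, r=0.5):
    # Elementwise sign-bi-power activation (assumed form):
    # phi(x) = 0.5*|x|**r*sign(x) + 0.5*|x|**(1/r)*sign(x), with 0 < r < 1.
    return 0.5 * np.sign(E) * (np.abs(E) ** r + np.abs(E) ** (1.0 / r))

def znn_sylvester_step(X, A, B, C, dA, dB, dC, gamma=10.0, r=0.5):
    # ZNN dynamics for the time-varying Sylvester equation
    # A(t)X - X B(t) + C(t) = 0 (sign convention assumed here).
    # With E = A X - X B + C and the design formula dE/dt = -gamma*phi(E):
    #   A Xdot - Xdot B = -dA X + X dB - dC - gamma*phi(E),
    # a static Sylvester equation in Xdot, solved below via Kronecker products.
    E = A @ X - X @ B + C
    rhs = -dA @ X + X @ dB - dC - gamma * sign_bi_power(E, r)
    n, m = X.shape
    M = np.kron(np.eye(m), A) - np.kron(B.T, np.eye(n))  # vec(A Y - Y B) = M vec(Y)
    xdot = np.linalg.solve(M, rhs.flatten(order="F"))
    return xdot.reshape((n, m), order="F")

def problem(t):
    # Hypothetical time-varying coefficients (not the paper's test case).
    A = np.array([[np.sin(t), np.cos(t)], [-np.cos(t), np.sin(t)]])
    dA = np.array([[np.cos(t), -np.sin(t)], [np.sin(t), np.cos(t)]])
    B, dB = 3.0 * np.eye(2), np.zeros((2, 2))  # eigenvalues of A and B never coincide
    C = np.array([[np.sin(t), 0.0], [0.0, np.cos(t)]])
    dC = np.array([[np.cos(t), 0.0], [0.0, -np.sin(t)]])
    return A, B, C, dA, dB, dC

dt, T = 1e-3, 2.0
X = np.zeros((2, 2))  # arbitrary initial state
for k in range(int(T / dt)):
    A, B, C, dA, dB, dC = problem(k * dt)
    X = X + dt * znn_sylvester_step(X, A, B, C, dA, dB, dC)

A, B, C, _, _, _ = problem(T)
print("residual ||A X - X B + C|| at t = T:", np.linalg.norm(A @ X - X @ B + C))

Running the loop drives the residual norm toward zero well before t = T, which is the qualitative behavior the finite-time convergence result predicts; the exact convergence-time bound is derived in the paper itself.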
