IEEE Transactions on Circuits and Systems. I, Regular Papers

Sufficient and Necessary Conditions for Global Exponential Stability of Discrete-Time Recurrent Neural Networks



Abstract

A set of sufficient and necessary conditions is presented for global exponential stability (GES) of a class of generic discrete-time recurrent neural networks. By means of these conditions, the GES and convergence properties of the neural networks are analyzed quantitatively. It is shown that exact equivalences hold among the GES property of the neural networks, the contractiveness of the deduced nonlinear operators, and the global asymptotic stability (GAS) of the neural networks together with a spectral radius of the Jacobian matrix of the neural networks at the unique equilibrium point that is less than one. When the neural networks have small state feedback coefficients, it is further shown that the infimum of the exponential bounds of the trajectories of the neural networks equals exactly the spectral radius of the Jacobian matrix of the neural networks at the unique equilibrium point. The obtained results are helpful in understanding the essence of GES and in clarifying the difference between GES and GAS for discrete-time recurrent neural networks.
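The spectral-radius criterion mentioned in the abstract can be illustrated numerically. The sketch below assumes a discrete-time recurrent network of the common form x_{k+1} = tanh(W x_k + u); the weight matrix `W`, input `u`, and the fixed-point iteration used to locate the equilibrium are illustrative choices, not taken from the paper. It finds the unique equilibrium x* and evaluates the spectral radius of the Jacobian diag(1 - tanh²(W x* + u)) · W there.

```python
import numpy as np

def spectral_radius_at_equilibrium(W, u, tol=1e-12, max_iter=10000):
    """Locate an equilibrium x* of x = tanh(W x + u) by fixed-point
    iteration, then return the spectral radius of the Jacobian
    diag(1 - tanh^2(W x* + u)) @ W evaluated at x*."""
    x = np.zeros(W.shape[0])
    for _ in range(max_iter):
        x_new = np.tanh(W @ x + u)
        if np.max(np.abs(x_new - x)) < tol:
            x = x_new
            break
        x = x_new
    # Jacobian of the map x -> tanh(W x + u) at the equilibrium.
    J = np.diag(1.0 - np.tanh(W @ x + u) ** 2) @ W
    return np.max(np.abs(np.linalg.eigvals(J)))

# Small state feedback coefficients (||W||_inf < 1): the map is a
# contraction, so the iteration converges to the unique equilibrium.
W = np.array([[0.2, -0.1], [0.05, 0.3]])
u = np.array([0.1, -0.2])
rho = spectral_radius_at_equilibrium(W, u)
print(rho < 1.0)  # spectral radius below one, consistent with GES
```

Because |tanh'| ≤ 1, the Jacobian's ∞-norm here is at most ||W||_∞ = 0.35, so the spectral radius is necessarily below one; this is the regime the paper associates with exact exponential bounds.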
