Neural Processing Letters

Global Exponential Stability and Global Convergence in Finite Time of Neural Networks with Discontinuous Activations



Abstract

In this paper, we consider a general class of neural networks that have arbitrary constant delays in the neuron interconnections and neuron activations belonging to the set of discontinuous, monotone increasing, and (possibly) unbounded functions. Based on topological degree theory and the Lyapunov functional method, we provide some new sufficient conditions for the global exponential stability and global convergence in finite time of these delayed neural networks. Under these conditions, the uniqueness of the solution to the initial value problem (IVP) is proved. The exponential convergence rate can be quantitatively estimated from the parameters defining the neural network. These conditions are easily testable and independent of the delay. Finally, some remarks and examples are discussed to compare the present results with existing ones.
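To make the class of systems concrete, the following is a minimal numerical sketch of a delayed network of the kind the abstract describes: a Hopfield-type model dx/dt = -Dx + Ag(x(t)) + Bg(x(t-τ)) + I with a discontinuous, monotone increasing activation (here g = sign), integrated by forward Euler with a history buffer for the delayed term. All matrices and parameters below are illustrative assumptions, not values from the paper; the strongly dominant diagonal matrix D is chosen so the trajectory settles toward an equilibrium, loosely mirroring the delay-independent flavour of the stability conditions discussed above.

```python
import numpy as np

def simulate(D, A, B, I, tau=1.0, dt=0.01, T=20.0, x0=None):
    """Forward-Euler simulation of dx/dt = -D x + A g(x) + B g(x(t - tau)) + I
    with the discontinuous activation g(s) = sign(s)."""
    n = D.shape[0]
    steps = int(T / dt)
    lag = int(tau / dt)
    g = np.sign                       # discontinuous, monotone increasing activation
    x = np.zeros(n) if x0 is None else np.array(x0, dtype=float)
    hist = [x.copy()] * (lag + 1)     # constant initial history on [-tau, 0]
    traj = [x.copy()]
    for _ in range(steps):
        x_del = hist[0]               # state tau time units in the past
        dx = -D @ x + A @ g(x) + B @ g(x_del) + I
        x = x + dt * dx
        hist.pop(0)
        hist.append(x.copy())
        traj.append(x.copy())
    return np.array(traj)

# Hypothetical parameters: strong self-inhibition (large diagonal D)
# relative to the interconnection gains keeps the dynamics stable.
D = np.diag([3.0, 3.0])
A = np.array([[0.5, -0.2], [0.1, 0.4]])
B = np.array([[0.2, 0.1], [-0.1, 0.2]])
I = np.array([0.5, -0.5])
traj = simulate(D, A, B, I, x0=[2.0, -1.5])
```

With this choice of parameters, no component of the state crosses zero, so the sign activation never chatters and the trajectory converges smoothly to the equilibrium of the frozen-sign linear system; discontinuous crossings (sliding modes, Filippov solutions) are exactly the delicate case the paper's framework is built to handle.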

