Journal of Mathematical Analysis and Applications

On the Global Convergence of a Class of Functional Differential Equations with Applications in Neural Network Theory



Abstract

We study a system of retarded functional differential equations which generalises both the Hopfield neural network model and hybrid network models of the cellular neural network type. Our main results give sufficient conditions for the global asymptotic stability of such systems and are milder than previously known conditions for the hybrid models. When specialised to neural networks, our models allow us to consider several different types of activation functions, including piecewise linear sigmoids and unbounded activations as well as the usual C¹-smooth sigmoids. These issues are vital in the applications. We also study neural network models with nonconstant delays r(t).
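For orientation, the Hopfield-type delay system that this class of equations generalises is commonly written in the literature as follows; this is a standard illustrative form, not a reproduction of the paper's more general system:

\dot{x}_i(t) = -b_i x_i(t) + \sum_{j=1}^{n} a_{ij}\, g_j\bigl(x_j(t - \tau_{ij})\bigr) + I_i, \qquad i = 1, \dots, n,

where b_i > 0 are decay rates, a_{ij} are connection weights, g_j are the activation functions (piecewise linear, unbounded, or C¹-smooth sigmoids in the cases mentioned above), I_i are constant external inputs, and \tau_{ij} \ge 0 are transmission delays; in the nonconstant-delay models the fixed delays are replaced by functions such as r(t).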
