...
Journal: Neural Networks: The Official Journal of the International Neural Network Society

On the global output convergence of a class of recurrent neural networks with time-varying inputs.


Abstract

This paper studies the global output convergence of a class of recurrent neural networks with globally Lipschitz continuous, monotone nondecreasing activation functions and locally Lipschitz continuous time-varying inputs. We establish two sufficient conditions for the global output convergence of this class of neural networks. Symmetry of the connection weight matrix is not required, and the present results extend existing ones.
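The abstract does not state the network model explicitly. As a hedged point of reference, results of this kind are usually formulated for the standard additive recurrent network with a time-varying input; the exact system and assumptions in the paper may differ:

\[
  \dot{x}(t) = -D\,x(t) + W\,g(x(t)) + u(t), \qquad y(t) = g(x(t)),
\]

where $x(t) \in \mathbb{R}^n$ is the state, $D$ is a positive diagonal matrix, $W$ is the (not necessarily symmetric) connection weight matrix, $g = (g_1,\dots,g_n)^{\top}$ collects globally Lipschitz continuous, monotone nondecreasing activation functions, and $u(t)$ is a locally Lipschitz continuous time-varying input. In this setting, global output convergence means that the output $y(t)$ approaches a constant vector as $t \to \infty$ for every initial state.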
