International Joint Conference on Neural Networks

Computational capabilities of recurrent neural networks based on their attractor dynamics



Abstract

We consider a model of so-called hybrid recurrent neural networks, composed of Boolean input and output cells together with sigmoid internal cells. When subjected to an infinite binary input stream, the Boolean output cells necessarily exhibit some attractor dynamics, assumed to be of one of two possible kinds, namely either meaningful or spurious, which underlies the emergence of spatiotemporal patterns of output discharges. In this context, we show that rational-weighted neural networks are computationally equivalent to deterministic Muller Turing machines, whereas all other models of real-weighted or evolving neural networks are equivalent to one another and strictly more powerful than deterministic Muller Turing machines. In this precise sense, the analog and evolving neural networks are super-Turing. We further provide a precise mathematical characterization of the expressive power of each of these neural models. These results generalize, to the present computational context, those previously obtained for classical as well as interactive computation. They support the idea that recurrent neural networks represent a natural model of computation beyond the Turing limits.
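The hybrid architecture described in the abstract can be sketched in code. The snippet below is a minimal illustration, not the authors' construction: all sizes and weights are hypothetical, floats stand in for the paper's rational weights, and the attractor is approximated empirically by recording which Boolean output patterns recur after the transient dies out under a periodic binary input stream.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical sizes: 1 Boolean input cell, 8 sigmoid internal cells,
# 2 Boolean output cells.
N_IN, N_HID, N_OUT = 1, 8, 2

# Weights chosen at random for illustration; floats stand in for the
# rational weights of the Turing-equivalent case discussed in the paper.
W_in = rng.uniform(-1, 1, (N_HID, N_IN))
W_rec = rng.uniform(-1, 1, (N_HID, N_HID)) * 0.5
W_out = rng.uniform(-1, 1, (N_OUT, N_HID))

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def step(x, u):
    """One synchronous update: sigmoid internal state, thresholded
    (Boolean) output cells."""
    x_next = sigmoid(W_rec @ x + W_in @ u)
    y = (W_out @ x_next > 0).astype(int)
    return x_next, y

# Drive the network with the periodic binary stream 0,1,0,1,... and
# collect the output patterns that recur after a transient: an
# empirical stand-in for the attractor of the Boolean output layer.
x = np.zeros(N_HID)
seen = []
for t in range(200):
    u = np.array([t % 2])
    x, y = step(x, u)
    if t >= 100:          # discard the transient phase
        seen.append(tuple(y))

attractor = set(seen)
print(attractor)
```

Under an eventually periodic input, a finite network of this kind visits finitely many output patterns infinitely often; classifying such attractors as meaningful or spurious is what drives the Muller-acceptance analogy in the paper.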

