
Learning the dynamics of embedded clauses


Abstract

Recent work by Siegelmann has shown that the computational power of recurrent neural networks matches that of Turing machines. One important implication is that complex language classes (infinite languages with embedded clauses) can be represented in neural networks. The proofs rest on a fractal encoding of states that simulates the memory and operations of stacks. In the present work, it is shown that similar stack-like dynamics can be learned by recurrent neural networks from simple sequence prediction tasks. Two main types of network solutions are found and described qualitatively as dynamical systems: damped oscillation and entangled spiraling around fixed points. The potential and limitations of each solution type are established in terms of generalization on two different context-free languages. Both solution types constitute novel stack implementations, broadly in line with Siegelmann's theoretical work, and offer insight into how the embedded structures of languages can be handled in analog hardware.
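
To make "simple sequence prediction tasks" concrete, below is a minimal sketch of the kind of setup the abstract describes: next-symbol prediction on the context-free language a^n b^n, a standard testbed for stack-like RNN dynamics. The PyTorch framing, the Elman-style architecture, and all hyperparameters are illustrative assumptions, not the authors' exact configuration.

```python
# Minimal sketch (illustrative, not the authors' setup): an Elman-style RNN
# trained on next-symbol prediction over the context-free language a^n b^n.
# Hidden size, learning rate, and the range of n are arbitrary choices.
import torch
import torch.nn as nn

VOCAB = {"a": 0, "b": 1, "#": 2}  # "#" marks the end of each string

def anbn_pairs(max_n=8):
    """(input, target) index sequences for strings of the form a^n b^n #."""
    pairs = []
    for n in range(1, max_n + 1):
        s = [VOCAB["a"]] * n + [VOCAB["b"]] * n + [VOCAB["#"]]
        pairs.append((s[:-1], s[1:]))  # target is the input shifted by one
    return pairs

class ElmanPredictor(nn.Module):
    def __init__(self, vocab=3, hidden=4):
        super().__init__()
        self.emb = nn.Embedding(vocab, hidden)
        self.rnn = nn.RNN(hidden, hidden, batch_first=True)  # Elman cell
        self.out = nn.Linear(hidden, vocab)

    def forward(self, x):
        h, _ = self.rnn(self.emb(x))
        return self.out(h)  # logits over the next symbol at every step

model = ElmanPredictor()
opt = torch.optim.Adam(model.parameters(), lr=0.01)
loss_fn = nn.CrossEntropyLoss()

for epoch in range(500):
    for inp, tgt in anbn_pairs():
        x = torch.tensor([inp])   # shape (1, T)
        y = torch.tensor([tgt])   # shape (1, T)
        logits = model(x)         # shape (1, T, 3)
        loss = loss_fn(logits.view(-1, 3), y.view(-1))
        opt.zero_grad()
        loss.backward()
        opt.step()
```

Plotting the hidden-state trajectory while such a network processes a long a^n b^n string is how the two solution regimes the abstract names (damped oscillation, spiraling around fixed points) would be observed, and testing on values of n larger than those seen in training probes the generalization it refers to.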