International Conference on Artificial Neural Nets and Genetic Algorithms, 2001, Prague, Czech Republic

Long Short-Term Memory Learns Context Free and Context Sensitive Languages

Abstract

Previous work on learning regular languages from exemplary training sequences showed that Long Short-Term Memory (LSTM) outperforms traditional recurrent neural networks (RNNs). Here we demonstrate LSTM's superior performance on context free language (CFL) benchmarks, and show that it works even better than previous hardwired or highly specialized architectures. To the best of our knowledge, LSTM variants are also the first RNNs to learn a context sensitive language (CSL), namely, a^n b^n c^n.
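
Tasks of this kind are typically framed as next-symbol prediction over strings of the language. The sketch below is an illustrative assumption, not the paper's setup: it uses PyTorch's stock nn.LSTM (the paper used a specialized LSTM variant), invented start/end markers S and E, an arbitrary hidden size, and a training range of n = 1..10. It also trains a plain softmax on the single observed next symbol, whereas prediction is only deterministic on each string's suffix (during the a-run the next symbol can legally be a or b).

```python
# Minimal sketch (assumed setup, not the paper's): next-symbol prediction
# on strings from the context sensitive language a^n b^n c^n.
import torch
import torch.nn as nn

SYMBOLS = ["S", "a", "b", "c", "E"]          # S/E: assumed start/end markers
IDX = {s: i for i, s in enumerate(SYMBOLS)}

def make_sequence(n):
    """Encode the string S a^n b^n c^n E as a list of symbol indices."""
    return ([IDX["S"]] + [IDX["a"]] * n
            + [IDX["b"]] * n + [IDX["c"]] * n + [IDX["E"]])

def one_hot(indices):
    return torch.eye(len(SYMBOLS))[indices]

class NextSymbolLSTM(nn.Module):
    """Stock LSTM + linear readout; stands in for the paper's LSTM variant."""
    def __init__(self, hidden=32):
        super().__init__()
        self.lstm = nn.LSTM(len(SYMBOLS), hidden, batch_first=True)
        self.readout = nn.Linear(hidden, len(SYMBOLS))

    def forward(self, x):
        h, _ = self.lstm(x)                  # h: (batch, seq_len, hidden)
        return self.readout(h)               # logits over next symbols

model = NextSymbolLSTM()
opt = torch.optim.Adam(model.parameters(), lr=1e-2)
loss_fn = nn.CrossEntropyLoss()

for step in range(2000):
    n = torch.randint(1, 11, (1,)).item()    # train on short strings only
    seq = make_sequence(n)
    x = one_hot(seq[:-1]).unsqueeze(0)       # inputs:  S a^n b^n c^n
    y = torch.tensor(seq[1:]).unsqueeze(0)   # targets: a^n b^n c^n E
    loss = loss_fn(model(x).flatten(0, 1), y.flatten())
    opt.zero_grad()
    loss.backward()
    opt.step()
```

The interesting check after training, per the paper's claim, is generalization: evaluating on strings with much larger n than any seen in training, which requires the network to have learned something counter-like rather than memorized the training lengths.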