
Evaluating the Ability of LSTMs to Learn Context-Free Grammars



Abstract

While long short-term memory (LSTM) neural net architectures are designed to capture sequence information, human language is generally composed of hierarchical structures. This raises the question as to whether LSTMs can learn hierarchical structures. We explore this question with a well-formed bracket prediction task using two types of brackets modeled by an LSTM. Demonstrating that such a system is learnable by an LSTM is the first step in demonstrating that the entire class of CFLs is also learnable. We observe that the model requires exponential memory in terms of the number of characters and embedded depth, where a sub-linear memory should suffice. Still, the model does more than memorize the training input: it learns how to distinguish between relevant and irrelevant information. On the other hand, we also observe that the model does not generalize well. We conclude that LSTMs do not learn the relevant underlying context-free rules, suggesting that the good overall performance is instead attained by an efficient way of evaluating nuisance variables. LSTMs are a way to quickly reach good results for many natural language tasks, but to understand and generate natural language one has to investigate other concepts that can make more direct use of natural language's structural nature.
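To make the task described in the abstract concrete, the sketch below generates well-formed strings over two bracket pairs and trains a small LSTM to predict the next character. This is not the paper's implementation; it is a minimal illustration assuming PyTorch, and the generation scheme, model sizes, and hyperparameters are all illustrative assumptions.

```python
# Minimal sketch (not the paper's setup): two bracket types, next-character prediction.
import random
import torch
import torch.nn as nn

VOCAB = ["(", ")", "[", "]"]          # two bracket pairs
CH2IX = {c: i for i, c in enumerate(VOCAB)}
CLOSE = {"(": ")", "[": "]"}

def gen_dyck2(max_len=40):
    """Sample a well-formed bracket string by randomly opening or closing."""
    s, stack = [], []
    while len(s) < max_len:
        if stack and (random.random() < 0.5 or len(s) + len(stack) >= max_len):
            s.append(CLOSE[stack.pop()])          # close the most recent open bracket
        else:
            b = random.choice("([")
            stack.append(b)
            s.append(b)
    s.extend(CLOSE[b] for b in reversed(stack))   # close anything still open
    return "".join(s)

class NextBracketLSTM(nn.Module):
    def __init__(self, hidden=32):
        super().__init__()
        self.emb = nn.Embedding(len(VOCAB), 16)
        self.lstm = nn.LSTM(16, hidden, batch_first=True)
        self.out = nn.Linear(hidden, len(VOCAB))

    def forward(self, x):                          # x: (batch, time) of char indices
        h, _ = self.lstm(self.emb(x))
        return self.out(h)                         # next-character logits at every step

model = NextBracketLSTM()
opt = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.CrossEntropyLoss()

for step in range(200):                            # tiny illustrative training loop
    s = gen_dyck2()
    ids = torch.tensor([[CH2IX[c] for c in s]])
    logits = model(ids[:, :-1])                    # predict char t+1 from prefix up to t
    loss = loss_fn(logits.reshape(-1, len(VOCAB)), ids[:, 1:].reshape(-1))
    opt.zero_grad(); loss.backward(); opt.step()
```

In this kind of setup, the generalization question raised in the abstract would be probed by evaluating the trained model on strings that are longer or more deeply nested than anything seen during training.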
