
Predicting Embedded Syntactic Structures from Natural Language Sentences with Neural Network Approaches


Abstract

Syntactic parsing is a key component of natural language understanding and, traditionally, has a symbolic output. Recently, a new approach for predicting syntactic structures from sentences has emerged: directly producing small and expressive vectors that embed syntactic structures. In this approach, parsing produces distributed representations. In this paper, we advance the frontier of these novel predictors by using the learning capabilities of neural networks. We propose two approaches for predicting the embedded syntactic structures. The first approach is based on a multi-layer perceptron that learns how to map vectors representing sentences into embedded syntactic structures. The second approach exploits recurrent neural networks with long short-term memory (LSTM-RNN-DRP) to map sentences directly to these embedded structures. We show that both approaches successfully exploit word information to learn syntactic predictors and achieve a significant performance advantage over previous methods. Results on the Penn Treebank corpus are promising. With the LSTM-RNN-DRP, we improve on the previous state-of-the-art method by 8.68%.
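The first approach described above maps a fixed-size sentence vector to a distributed representation of its parse tree with a multi-layer perceptron. Below is a minimal, hypothetical sketch of such a mapping in NumPy; the layer sizes, the `tanh` nonlinearity, and the function name `predict_tree_embedding` are illustrative assumptions, not the paper's actual architecture or parameters.

```python
import numpy as np

# Illustrative sketch only (not the paper's implementation): a one-hidden-layer
# perceptron mapping a sentence vector to an embedded syntactic structure,
# i.e. a distributed representation of the parse tree. Sizes are made up.
rng = np.random.default_rng(0)
d_sent, d_hidden, d_tree = 300, 512, 4096  # hypothetical dimensions

W1 = rng.standard_normal((d_hidden, d_sent)) * 0.01  # input -> hidden weights
b1 = np.zeros(d_hidden)
W2 = rng.standard_normal((d_tree, d_hidden)) * 0.01  # hidden -> output weights
b2 = np.zeros(d_tree)

def predict_tree_embedding(sent_vec):
    """Map a sentence vector to a vector embedding its syntactic structure."""
    h = np.tanh(W1 @ sent_vec + b1)  # hidden layer
    return W2 @ h + b2               # predicted distributed parse representation

x = rng.standard_normal(d_sent)      # a stand-in sentence vector
y = predict_tree_embedding(x)
print(y.shape)                       # (4096,)
```

In this scheme the network is trained so that its output is close (e.g. in cosine distance) to the reference embedding of the gold parse tree; the LSTM-RNN-DRP variant instead consumes the word sequence directly rather than a precomputed sentence vector.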
