
Data Recombination for Neural Semantic Parsing



Abstract

Modeling crisp logical regularities is crucial in semantic parsing, making it difficult for neural models with no task-specific prior knowledge to achieve good results. In this paper, we introduce data recombination, a novel framework for injecting such prior knowledge into a model. From the training data, we induce a high-precision synchronous context-free grammar, which captures important conditional independence properties commonly found in semantic parsing. We then train a sequence-to-sequence recurrent network (RNN) model with a novel attention-based copying mechanism on datapoints sampled from this grammar, thereby teaching the model about these structural properties. Data recombination improves the accuracy of our RNN model on three semantic parsing datasets, leading to new state-of-the-art performance on the standard GeoQuery dataset for models with comparable supervision.
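The core idea of data recombination can be illustrated with a toy sketch: abstract each training pair into a synchronous template by replacing an entity with a typed placeholder, then generate new synthetic pairs by filling templates with entities drawn from other examples. The grammar, example data, and function names below are hypothetical illustrations, not the paper's exact induction procedure.

```python
import random

# Toy training data: (utterance, logical form, entity) triples.
# In the paper, entities and their types are identified automatically;
# here they are given by hand for illustration.
train = [
    ("what is the capital of texas", "answer(capital(texas))", "texas"),
    ("what is the capital of ohio", "answer(capital(ohio))", "ohio"),
    ("what states border utah", "answer(border(utah))", "utah"),
]

# Induce simple synchronous rules:
#   ROOT  -> <utterance template, logical-form template> with a STATE slot
#   STATE -> <entity, entity>
templates = set()
entities = set()
for utt, lf, ent in train:
    templates.add((utt.replace(ent, "STATE"), lf.replace(ent, "STATE")))
    entities.add(ent)

def sample_recombinant(rng):
    """Sample a synthetic (utterance, logical form) pair from the grammar
    by pairing a random template with a random entity."""
    utt_t, lf_t = rng.choice(sorted(templates))
    ent = rng.choice(sorted(entities))
    return utt_t.replace("STATE", ent), lf_t.replace("STATE", ent)

rng = random.Random(0)
for _ in range(3):
    print(sample_recombinant(rng))
```

Sampling can now produce pairs never seen in training, such as asking for the capital of utah; training the sequence-to-sequence model on such samples is what injects the grammar's conditional-independence structure into the neural model.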
