
Broad-Coverage Semantic Parsing as Transduction


Abstract

We unify different broad-coverage semantic parsing tasks under a transduction paradigm, and propose an attention-based neural framework that incrementally builds a meaning representation via a sequence of semantic relations. By leveraging multiple attention mechanisms, the transducer can be effectively trained without relying on a pre-trained aligner. Experiments conducted on three separate broad-coverage semantic parsing tasks - AMR, SDP and UCCA - demonstrate that our attention-based neural transducer improves the state of the art on both AMR and UCCA, and is competitive with the state of the art on SDP.
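The incremental transduction idea in the abstract - emitting a graph one semantic relation at a time, with attention over the source selecting alignments - can be sketched as a toy decoding loop. This is an illustrative assumption only, not the authors' model; the functions `attention` and `transduce` and the one-hot "queries" are hypothetical stand-ins for learned components.

```python
import math

def attention(query, keys):
    # Dot-product attention: softmax over query-key scores,
    # returning a probability distribution over the keys.
    scores = [sum(q * k for q, k in zip(query, key)) for key in keys]
    m = max(scores)
    exps = [math.exp(s - m) for s in scores]
    z = sum(exps)
    return [e / z for e in exps]

def transduce(encoder_states, steps):
    # Greedy incremental decoding sketch: each step attends over the
    # source to pick an aligned token (no pre-trained aligner needed),
    # then attends over previously generated nodes to pick a head,
    # emitting one (node_id, relation, head_id) triple per step.
    dim = len(encoder_states[0])
    graph = []      # accumulated semantic relations
    node_vecs = []  # vectors of nodes generated so far
    for t in range(steps):
        query = [1.0 if i == t % dim else 0.0 for i in range(dim)]  # toy query
        # Source-side attention: soft alignment to an input token.
        src_probs = attention(query, encoder_states)
        aligned = max(range(len(src_probs)), key=src_probs.__getitem__)
        # Target-side attention: attach the new node to an earlier one.
        if node_vecs:
            tgt_probs = attention(query, node_vecs)
            head = max(range(len(tgt_probs)), key=tgt_probs.__getitem__)
        else:
            head = None  # the first node is the graph root
        graph.append((t, f"rel-{aligned}", head))
        node_vecs.append(encoder_states[aligned])
    return graph
```

In the real model the queries, relation labels, and node representations are produced by a trained encoder-decoder; the point of the sketch is only the control flow: the meaning representation grows one attention-guided relation at a time.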
