Annual Meeting of the Association for Computational Linguistics

Two Discourse Driven Language Models for Semantics



Abstract

Natural language understanding often requires deep semantic knowledge. Expanding on previous proposals, we suggest that some important aspects of semantic knowledge can be modeled as a language model if done at an appropriate level of abstraction. We develop two distinct models that capture semantic frame chains and discourse information while abstracting over the specific mentions of predicates and entities. For each model, we investigate four implementations: a "standard" N-gram language model and three discriminatively trained "neural" language models that generate embeddings for semantic frames. The quality of the semantic language models (SemLM) is evaluated both intrinsically, using perplexity and a narrative cloze test, and extrinsically: we show that our SemLM helps improve performance on semantic natural language processing tasks such as co-reference resolution and discourse parsing.
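As a rough illustration of the N-gram variant of such a semantic language model (a minimal sketch, not the authors' implementation), the Python snippet below trains a count-based bigram model with add-one smoothing over sequences of abstract frame tokens and scores a held-out chain by perplexity. The frame labels and training chains are invented for illustration; in the paper, chains would come from frame and discourse annotations abstracted away from specific predicate and entity mentions.

from collections import defaultdict
import math

# Hypothetical frame chains: each document is reduced to a sequence of
# abstract frame tokens (predicates stripped of their surface mentions).
# All labels below are illustrative placeholders.
train_chains = [
    ["arrest.01", "charge.05", "convict.01", "sentence.02"],
    ["arrest.01", "release.01"],
    ["charge.05", "plead.01", "sentence.02"],
]

def train_bigram(chains):
    """Count bigrams over frame chains, padding each chain with boundary tokens."""
    counts = defaultdict(lambda: defaultdict(int))
    vocab = set()
    for chain in chains:
        padded = ["<s>"] + chain + ["</s>"]
        vocab.update(padded)
        for prev, cur in zip(padded, padded[1:]):
            counts[prev][cur] += 1
    return counts, vocab

def bigram_prob(counts, vocab, prev, cur):
    """Laplace-smoothed conditional probability P(cur | prev)."""
    total = sum(counts[prev].values())
    return (counts[prev][cur] + 1) / (total + len(vocab))

def perplexity(counts, vocab, chain):
    """Perplexity of a frame chain under the bigram model."""
    padded = ["<s>"] + chain + ["</s>"]
    log_prob = sum(
        math.log(bigram_prob(counts, vocab, p, c))
        for p, c in zip(padded, padded[1:])
    )
    return math.exp(-log_prob / (len(padded) - 1))

counts, vocab = train_bigram(train_chains)
test_chain = ["arrest.01", "charge.05", "sentence.02"]
print("perplexity:", perplexity(counts, vocab, test_chain))

The same model can serve as a toy stand-in for the narrative cloze evaluation mentioned in the abstract: hold out one frame from a chain and rank candidate frames by the probability the model assigns to the completed chain.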

