
Mixing Context Granularities for Improved Entity Linking on Question Answering Data across Entity Categories



Abstract

The first stage of every knowledge base question answering approach is to link entities in the input question. We investigate entity linking in the context of a question answering task and present a jointly optimized neural architecture for entity mention detection and entity disambiguation that models the surrounding context on different levels of granularity. We use the Wikidata knowledge base and available question answering datasets to create benchmarks for entity linking on question answering data. Our approach outperforms the previous state-of-the-art system on this data, resulting in an average 8% improvement of the final score. We further demonstrate that our model delivers a strong performance across different entity categories.
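
The abstract describes a jointly optimized architecture that combines surrounding context at different levels of granularity for mention detection and disambiguation. As a rough, hypothetical illustration only (not the authors' published model), the PyTorch sketch below shows one way such a setup could look: a character-level CNN view and a token-level BiGRU view of the question are concatenated into a shared representation that feeds both a mention-detection head and a candidate-entity scoring head. All names (MixedGranularityLinker, mention_head, disamb_proj), layer choices, and vocabulary sizes are assumptions made for clarity.

```python
# Illustrative sketch of mixing context granularities for joint entity
# mention detection and disambiguation; all dimensions and layers are
# assumptions, not the architecture from the paper.
import torch
import torch.nn as nn


class MixedGranularityLinker(nn.Module):
    def __init__(self, char_vocab=100, token_vocab=10_000, entity_vocab=5_000, dim=64):
        super().__init__()
        # Character-level view of each token (captures surface form / spelling).
        self.char_emb = nn.Embedding(char_vocab, dim, padding_idx=0)
        self.char_cnn = nn.Conv1d(dim, dim, kernel_size=3, padding=1)
        # Token-level view of the whole question (captures sentence context).
        self.tok_emb = nn.Embedding(token_vocab, dim, padding_idx=0)
        self.tok_rnn = nn.GRU(dim, dim, batch_first=True, bidirectional=True)
        # Candidate entity embeddings (stand-in for KB-derived representations).
        self.ent_emb = nn.Embedding(entity_vocab, dim)
        # Two jointly trained heads sharing the mixed context representation.
        self.mention_head = nn.Linear(2 * dim + dim, 2)   # token in / not in a mention
        self.disamb_proj = nn.Linear(2 * dim + dim, dim)  # project context into entity space

    def forward(self, chars, tokens, candidates):
        # chars: (batch, seq, max_chars); tokens: (batch, seq); candidates: (batch, n_cand)
        b, s, c = chars.shape
        ch = self.char_emb(chars.view(b * s, c)).transpose(1, 2)        # (b*s, dim, chars)
        ch = torch.relu(self.char_cnn(ch)).max(dim=2).values.view(b, s, -1)
        tok, _ = self.tok_rnn(self.tok_emb(tokens))                     # (b, s, 2*dim)
        ctx = torch.cat([tok, ch], dim=-1)                              # mix the two granularities
        mention_logits = self.mention_head(ctx)                         # per-token mention tags
        pooled = self.disamb_proj(ctx.mean(dim=1))                      # question-level vector
        cand = self.ent_emb(candidates)                                 # (b, n_cand, dim)
        disamb_scores = torch.einsum("bd,bnd->bn", pooled, cand)        # rank candidate entities
        return mention_logits, disamb_scores


if __name__ == "__main__":
    model = MixedGranularityLinker()
    chars = torch.randint(1, 100, (2, 7, 12))      # toy batch: 2 questions, 7 tokens, 12 chars
    tokens = torch.randint(1, 10_000, (2, 7))
    candidates = torch.randint(0, 5_000, (2, 5))   # 5 candidate entities per question
    mention_logits, scores = model(chars, tokens, candidates)
    print(mention_logits.shape, scores.shape)      # torch.Size([2, 7, 2]) torch.Size([2, 5])
    # Joint training would sum a mention-detection loss and a disambiguation loss.
```

In such a setup the two objectives share the mixed-granularity encoder, which is one plausible reading of "jointly optimized" in the abstract; the actual model and its context levels are described in the paper itself.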

Bibliographic Record

  • Source: Seventh Joint Conference on Lexical and Computational Semantics
  • Conference location: New Orleans (US)
  • Authors: Daniil Sorokin; Iryna Gurevych
  • Author affiliations: Ubiquitous Knowledge Processing Lab (UKP) and Research Training Group AIPHES, Department of Computer Science, Technische Universität Darmstadt (both authors)
  • Format: PDF
  • Language: English
