Conference on Empirical Methods in Natural Language Processing

Entities as Experts: Sparse Memory Access with Entity Supervision


Abstract

We focus on the problem of capturing declarative knowledge about entities in the learned parameters of a language model. We introduce a new model, Entities as Experts (EaE), that can access distinct memories of the entities mentioned in a piece of text. Unlike previous efforts to integrate entity knowledge into sequence models, EaE's entity representations are learned directly from text. We show that EaE's learned representations capture sufficient knowledge to answer TriviaQA questions such as "Which Dr. Who villain has been played by Roger Delgado, Anthony Ainley, Eric Roberts?", outperforming an encoder-generator Transformer model with 10× the parameters. According to the LAMA knowledge probes, EaE contains more factual knowledge than a similarly sized BERT, as well as previous approaches that integrate external sources of entity knowledge. Because EaE associates parameters with specific entities, it only needs to access a fraction of its parameters at inference time, and we show that the correct identification and representation of entities is essential to EaE's performance.
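The abstract's central idea, associating parameters with specific entities and touching only a fraction of them at inference time, can be illustrated with a minimal sketch of sparse entity-memory access. This is not the paper's implementation: the function name, shapes, and the top-k softmax are illustrative assumptions; the real model learns mention detection and entity embeddings jointly within a Transformer.

```python
import numpy as np

def entity_memory_lookup(mention_repr, entity_embeddings, top_k=2):
    """Illustrative sparse memory access: score a mention vector against
    every entity embedding, keep only the top-k entities, and return their
    attention-weighted sum plus the sparse set of entity ids touched."""
    scores = entity_embeddings @ mention_repr            # (num_entities,)
    top = np.argsort(scores)[-top_k:]                    # ids of top-k entities
    weights = np.exp(scores[top] - scores[top].max())
    weights /= weights.sum()                             # softmax over top-k only
    memory_out = weights @ entity_embeddings[top]        # (dim,)
    return memory_out, top

# Toy memory of three entities; only 2 of 3 rows are accessed.
entities = np.array([[1.0, 0.0], [0.0, 1.0], [1.0, 1.0]])
mention = np.array([1.0, 0.1])
out, touched = entity_memory_lookup(mention, entities, top_k=2)
```

Because the lookup reads only `top_k` rows of the entity table, inference cost in the memory layer scales with the number of mentions rather than with the total number of entity parameters, which is the sparsity property the abstract highlights.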
