Workshop on Fact Extraction and VERification

Unsupervised Natural Question Answering with a Small Model



Abstract

The recent demonstration of the power of huge language models such as GPT-2 to memorise the answers to factoid questions raises questions about the extent to which knowledge is being embedded directly within these large models. This short paper describes an architecture through which much smaller models can also answer such questions - by making use of 'raw' external knowledge. The contribution of this work is that the methods presented here rely on unsupervised learning techniques, complementing the unsupervised training of the Language Model. The goal of this line of research is to be able to add knowledge explicitly, without extensive training.
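The abstract's core idea, answering factoid questions with a small model by consulting 'raw' external text rather than memorised parameters, can be illustrated with a minimal unsupervised sketch. This is not the paper's actual architecture; the corpus, tokeniser, and bag-of-words cosine retriever below are assumptions chosen purely to show that a passage can be selected without any labelled training data.

```python
import math
from collections import Counter

# Hypothetical toy corpus standing in for 'raw' external knowledge.
passages = [
    "Paris is the capital of France.",
    "The Great Wall of China is visible in many photographs.",
    "Mount Everest is the highest mountain on Earth.",
]

def tokens(text):
    # Crude whitespace tokeniser with punctuation stripping.
    return [w.strip(".,?!").lower() for w in text.split()]

def cosine(a, b):
    # Cosine similarity between two bags of words.
    ca, cb = Counter(a), Counter(b)
    dot = sum(ca[t] * cb[t] for t in ca)
    na = math.sqrt(sum(v * v for v in ca.values()))
    nb = math.sqrt(sum(v * v for v in cb.values()))
    return dot / (na * nb) if na and nb else 0.0

def retrieve(question, corpus):
    # Pick the passage most similar to the question; no supervision needed.
    return max(corpus, key=lambda p: cosine(tokens(question), tokens(p)))

print(retrieve("What is the capital of France?", passages))
# → Paris is the capital of France.
```

In the paper's setting, a small language model would then read the retrieved passage to produce the answer; the point of the sketch is only that the knowledge lives in the external text, not in model weights.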
