BERT is Not All You Need for Commonsense Inference

Abstract

This paper studies the task of commonsense inference, specifically natural language inference (NLI) and causal inference (CI), which requires knowledge beyond what is stated in the input sentences. State-of-the-art approaches have been neural models powered by knowledge or contextual embeddings, for example BERT, as a source of commonsense knowledge. Our research questions are thus: Is BERT all we need for NLI and CI? If not, what information is missing, and where can we find it? While much work has studied what is captured in BERT, its limitations remain under-studied. Our contribution is to observe the limitations of BERT in commonsense inference, and then to leverage complementary resources that contain the missing information. Specifically, we model BERT and a complementary resource as two heterogeneous modalities and explore the pros and cons of multimodal integration approaches. We demonstrate that our proposed integration models achieve state-of-the-art performance on both NLI and CI tasks.
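The abstract does not specify the integration architecture. As a minimal sketch of the core idea of treating BERT and a complementary resource as two heterogeneous modalities, the following PyTorch snippet fuses a BERT sentence embedding with a knowledge embedding through a learned gate. The GatedFusion module, all dimensions, and the three-way label space are illustrative assumptions, not the paper's actual model.

import torch
import torch.nn as nn

class GatedFusion(nn.Module):
    """Fuse a BERT sentence embedding with a complementary knowledge
    embedding, treating the two as separate modalities.

    Note: the gating scheme and all dimensions here are illustrative
    assumptions, not the architecture proposed in the paper.
    """
    def __init__(self, bert_dim=768, knowledge_dim=300,
                 hidden_dim=256, num_labels=3):
        super().__init__()
        self.proj_k = nn.Linear(knowledge_dim, bert_dim)  # align the knowledge modality to BERT's space
        self.gate = nn.Linear(2 * bert_dim, bert_dim)     # element-wise gate over the two modalities
        self.classifier = nn.Sequential(
            nn.Linear(bert_dim, hidden_dim),
            nn.ReLU(),
            nn.Linear(hidden_dim, num_labels),
        )

    def forward(self, bert_emb, knowledge_emb):
        k = self.proj_k(knowledge_emb)
        g = torch.sigmoid(self.gate(torch.cat([bert_emb, k], dim=-1)))
        fused = g * bert_emb + (1 - g) * k  # gated mixture of the two modalities
        return self.classifier(fused)

# Toy usage with random stand-ins for the two embeddings.
model = GatedFusion()
bert_emb = torch.randn(4, 768)       # e.g. BERT [CLS] vectors
knowledge_emb = torch.randn(4, 300)  # e.g. knowledge-graph-style vectors
logits = model(bert_emb, knowledge_emb)
print(logits.shape)  # torch.Size([4, 3])

The gate lets the classifier lean on BERT where its contextual representation suffices and on the complementary resource where BERT's commonsense coverage falls short; concatenation or attention-based fusion are equally plausible alternatives under the same two-modality framing.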
