Workshop on Knowledge Extraction and Integration for Deep Learning Architectures

Incorporating Commonsense Knowledge Graph in Pretrained Models for Social Commonsense Tasks



Abstract

Pretrained language models have recently excelled at many NLP tasks; however, their social intelligence remains unsatisfactory. To achieve it, machines need a more general understanding of our complicated world and the ability to perform commonsense reasoning beyond fitting specific downstream tasks. External commonsense knowledge graphs (KGs), such as ConceptNet, provide rich information about words and their relationships. Thus, toward general commonsense learning, we propose two approaches to infuse such KGs into pretrained language models, one implicit and one explicit. We demonstrate that our proposed methods perform well on SocialIQA, a social commonsense reasoning task, in both limited and full training data regimes.
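The "explicit" infusion the abstract mentions is often realized by retrieving KG triples for words in the input and verbalizing them as extra context before the pretrained model sees the text. The sketch below illustrates that general pattern only; the tiny in-memory dictionary, template strings, and function names are hypothetical stand-ins, not the paper's actual method or a real ConceptNet API.

```python
# Hypothetical sketch of explicit KG infusion: look up ConceptNet-style
# triples for content words in a question and prepend them, verbalized as
# plain text, to the model input. TOY_KG stands in for a real KG lookup.

TOY_KG = {
    "gift": [("gift", "UsedFor", "showing appreciation")],
    "party": [("party", "IsA", "social event")],
}

# Relation-to-text templates (assumed, for illustration only).
TEMPLATES = {"UsedFor": "{} is used for {}", "IsA": "{} is a {}"}

def verbalize(triple):
    """Render a (head, relation, tail) triple as a natural-language fact."""
    head, rel, tail = triple
    return TEMPLATES[rel].format(head, tail)

def augment_input(question, kg=TOY_KG):
    """Prepend retrieved commonsense facts to the question text."""
    facts = [
        verbalize(t)
        for word in question.lower().split()
        for t in kg.get(word.strip("?.,"), [])
    ]
    return " ".join(facts + [question])

print(augment_input("Why did Alex bring a gift to the party?"))
```

The augmented string would then be tokenized and fed to the pretrained model as usual; the implicit variant would instead inject KG signal during pretraining or via auxiliary objectives rather than at the input level.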
