Event Causality Recognition Exploiting Multiple Annotators' Judgments and Background Knowledge

International Joint Conference on Natural Language Processing; Conference on Empirical Methods in Natural Language Processing



Abstract

We propose new BERT-based methods for recognizing event causality, such as "smoke cigarettes" → "die of lung cancer", written in web texts. In our methods, we capture each annotator's labeling policy by training multiple classifiers, each of which predicts the labels given by a single annotator, and we combine the resulting classifiers' outputs to predict the final labels determined by majority vote. Furthermore, we investigate the effect of supplying background knowledge to our classifiers. Since BERT models are pre-trained on a large corpus, some background knowledge relevant to event causality may already be learned during pre-training. Our experiments with a Japanese dataset suggest that this is indeed the case: performance improved when we pre-trained the BERT models on web texts containing a large number of event causalities instead of Wikipedia articles or randomly sampled web texts. However, this effect was limited. We therefore further improved performance by simply adding texts related to an input causality candidate to the input of the BERT models as background knowledge. We believe these findings indicate a promising direction for future research.
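Below is a minimal sketch of the two ideas summarized in the abstract: (1) combining per-annotator classifiers' predictions by majority vote, and (2) feeding a causality candidate together with retrieved background text to a BERT model as a sentence pair. The function names, the classifier interface, and the BERT checkpoint are all assumptions for illustration; this is not the authors' implementation.

```python
# Illustrative sketch only (not the authors' released code). Names such as
# majority_vote_label and encode_with_background are hypothetical.
from collections import Counter
from typing import Callable, List

def majority_vote_label(
    annotator_classifiers: List[Callable[[str], int]],
    causality_candidate: str,
) -> int:
    """Each classifier mimics one annotator's labeling policy (1 = causal, 0 = not).
    The final label follows the majority vote over the per-annotator predictions,
    mirroring how the gold labels were determined by the annotators."""
    votes = [clf(causality_candidate) for clf in annotator_classifiers]
    return Counter(votes).most_common(1)[0][0]

# Background knowledge can be supplied by encoding the candidate and a related
# text as a BERT sentence pair (segment A = candidate, segment B = background).
from transformers import AutoTokenizer  # assumed dependency

def encode_with_background(
    candidate: str,
    background_text: str,
    model_name: str = "bert-base-multilingual-cased",  # placeholder; the paper used Japanese BERT models
):
    tokenizer = AutoTokenizer.from_pretrained(model_name)
    return tokenizer(candidate, background_text,
                     truncation=True, max_length=512, return_tensors="pt")

if __name__ == "__main__":
    candidate = "smoke cigarettes -> die of lung cancer"
    # Three toy per-annotator classifiers that disagree on the candidate.
    classifiers = [lambda x: 1, lambda x: 1, lambda x: 0]
    print(majority_vote_label(classifiers, candidate))  # -> 1
```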
