European Conference on Information Retrieval

Answer Sentence Selection Using Local and Global Context in Transformer Models



Abstract

An essential task in the design of Question Answering systems is selecting, from documents relevant to the asked question, the sentence containing (or constituting) the answer. Previous neural models have experimented with using additional text together with the target sentence to learn a selection function, but these methods were not powerful enough to effectively encode contextual information. In this paper, we analyze the role of contextual information for the sentence selection task in Transformer-based architectures, leveraging two types of context: local and global. The former describes the paragraph containing the sentence, aiming at resolving implicit references, whereas the latter describes the entire document containing the candidate sentence, providing content-based information. The results on three different benchmarks show that combining local and global context in a Transformer model significantly improves the accuracy of Answer Sentence Selection.
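To make the setup concrete, the following is a minimal, hypothetical sketch of how a candidate answer sentence could be packed together with the question, its local context (the containing paragraph), and its global context (the whole document) into a single Transformer-style input for scoring. All names, the `[SEP]` packing scheme, and the truncation policy are assumptions for illustration, not the authors' exact implementation; `score_fn` stands in for a fine-tuned Transformer ranker.

```python
def build_input(question, candidate, paragraph, document,
                sep="[SEP]", max_words=384):
    """Pack question, candidate sentence, local context (the paragraph
    minus the candidate itself), and global context (the document) into
    one separator-delimited sequence, truncated to a word budget."""
    local_ctx = paragraph.replace(candidate, "").strip()
    text = f" {sep} ".join([question, candidate, local_ctx, document])
    return " ".join(text.split()[:max_words])

def select_answer(question, paragraphs, score_fn):
    """Score every sentence of every paragraph with its local and global
    context attached; return the highest-scoring candidate sentence."""
    document = " ".join(paragraphs)
    best, best_score = None, float("-inf")
    for para in paragraphs:
        for sent in para.split(". "):
            sent = sent if sent.endswith(".") else sent + "."
            score = score_fn(build_input(question, sent, para, document))
            if score > best_score:
                best, best_score = sent, score
    return best
```

In this framing, the local context helps the scorer resolve pronouns and implicit references in the candidate sentence, while the global context supplies document-level topical signal.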
