Neurocomputing

Domain adaptive multi-task transformer for low-resource machine reading comprehension



Abstract

In recent years, low-resource Machine Reading Comprehension (MRC) has attracted increasing attention. Due to the difficulty of data collection, current low-resource MRC approaches often suffer from poor generalization: the model only learns limited task-aware and domain-aware knowledge from a small-scale training dataset. Previous works generally address this deficiency by learning the required knowledge from out-of-domain MRC datasets and in-domain self-supervised datasets. However, such approaches also introduce domain noise and task noise. This paper proposes a Domain Adaptive Multi-Task Transformer (DAMT2) to tackle both kinds of noise. For task noise, DAMT2 utilizes a well-designed Multi-Task Transformer (MT2) as the backbone to model high-level features separately for different tasks. For domain noise, two kinds of domain adaptation approaches are incorporated into MT2 to learn domain-invariant representations. The experimental results show that our method outperforms several baselines on multiple datasets, and in particular achieves a new SOTA on the RRC dataset. Moreover, using only 40%-60% of the training data, our work achieves performance comparable to the classic BERT model. (c) 2022 Elsevier B.V. All rights reserved.
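The abstract does not spell out the MT2 architecture or which two domain adaptation approaches are used, but the two ideas it names, task-specific modeling of high-level features over a shared backbone and learning domain-invariant representations, can be illustrated with a minimal PyTorch sketch. Everything below is an assumption for illustration, not the paper's implementation: the class names, layer counts, and the choice of DANN-style gradient reversal as one common adversarial domain adaptation mechanism are all hypothetical.

```python
import torch
import torch.nn as nn


class GradientReversal(torch.autograd.Function):
    """Identity in the forward pass; reverses (and scales) gradients in the
    backward pass, the standard DANN trick for adversarial domain adaptation."""

    @staticmethod
    def forward(ctx, x, lambd):
        ctx.lambd = lambd
        return x.clone()

    @staticmethod
    def backward(ctx, grad_output):
        return -ctx.lambd * grad_output, None


class MultiTaskDomainAdaptiveSketch(nn.Module):
    """Hypothetical sketch of a DAMT2-style model: a shared transformer
    encoder, per-task transformer blocks that model high-level features
    separately for each task, and a domain discriminator trained through
    gradient reversal so shared features become domain-invariant."""

    def __init__(self, hidden=768, n_shared_layers=4, n_heads=12,
                 task_names=("mrc", "aux"), n_domains=2):
        super().__init__()
        layer = nn.TransformerEncoderLayer(hidden, n_heads, batch_first=True)
        self.shared_encoder = nn.TransformerEncoder(layer, n_shared_layers)
        # One extra transformer block per task: high-level features are
        # modeled separately per task to reduce task noise.
        self.task_blocks = nn.ModuleDict({
            t: nn.TransformerEncoderLayer(hidden, n_heads, batch_first=True)
            for t in task_names
        })
        # MRC-style span head: start/end logits for each token.
        self.span_heads = nn.ModuleDict({
            t: nn.Linear(hidden, 2) for t in task_names
        })
        # Domain classifier sees gradient-reversed pooled features, pushing
        # the shared encoder toward domain-invariant representations.
        self.domain_clf = nn.Linear(hidden, n_domains)

    def forward(self, embeddings, task, lambd=0.1):
        # embeddings: (batch, seq_len, hidden) token representations.
        shared = self.shared_encoder(embeddings)
        task_feat = self.task_blocks[task](shared)
        span_logits = self.span_heads[task](task_feat)   # (B, T, 2)
        pooled = shared.mean(dim=1)                      # (B, hidden)
        domain_logits = self.domain_clf(
            GradientReversal.apply(pooled, lambd))
        return span_logits, domain_logits


# Hypothetical usage: minimize the span loss and the domain-classification
# loss jointly; gradient reversal makes the encoder fool the discriminator.
model = MultiTaskDomainAdaptiveSketch()
x = torch.randn(2, 16, 768)
span_logits, domain_logits = model(x, task="mrc")
```

Under this reading, the domain discriminator is trained to tell source from target domains while the reversed gradient drives the shared encoder to erase domain cues, which is one standard way to obtain the domain-invariant representations the abstract describes.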
