International Journal of Artificial Intelligence & Applications (IJAIA)

Extractive Summarization with Very Deep Pretrained Language Model

Abstract

Recent development of generative pretrained language models has proven very successful on a wide range of NLP tasks, such as text classification, question answering, and textual entailment. In this work, we present a two-phase encoder-decoder architecture based on Bidirectional Encoder Representations from Transformers (BERT) for the extractive summarization task. We evaluated our model with both automatic metrics and human annotators, and demonstrated that the architecture achieves results comparable to the state of the art on a large-scale corpus, CNN/Daily Mail. To the best of our knowledge, this is the first work that applies a BERT-based architecture to a text summarization task and achieves results comparable to the state of the art.
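The abstract does not spell out the two-phase architecture, but the general pattern of BERT-based extractive summarization can be sketched: encode each sentence with a pretrained BERT encoder, score the sentence representations, and select the top-scoring sentences as the summary. The sketch below is a minimal illustration of that pattern, assuming the Hugging Face transformers library; the `scorer` head and the top-k selection are hypothetical stand-ins, not the paper's actual model.

```python
# Minimal sketch of BERT-based extractive summarization (illustrative only;
# not a reconstruction of the paper's two-phase encoder-decoder model).
import torch
from transformers import BertTokenizer, BertModel

tokenizer = BertTokenizer.from_pretrained("bert-base-uncased")
encoder = BertModel.from_pretrained("bert-base-uncased")
# Hypothetical, untrained scoring head; in practice it would be trained on
# sentence-level extraction labels.
scorer = torch.nn.Linear(encoder.config.hidden_size, 1)

def score_sentences(sentences):
    """Return one salience score per sentence."""
    inputs = tokenizer(sentences, padding=True, truncation=True,
                       return_tensors="pt")
    with torch.no_grad():
        outputs = encoder(**inputs)
    # Use each sentence's [CLS] vector as its representation.
    cls_vectors = outputs.last_hidden_state[:, 0, :]
    return scorer(cls_vectors).squeeze(-1)

doc = [
    "BERT-based encoders yield strong sentence representations.",
    "The weather was pleasant that day.",
    "Scoring and selecting top sentences gives an extractive summary.",
]
scores = score_sentences(doc)
summary = [doc[i] for i in scores.topk(2).indices.tolist()]
print(summary)
```

With a trained scoring head, the two highest-scoring sentences would form the extract; the paper's reported human and automatic evaluations concern its own architecture, not this sketch.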