
Text Summarization Method Based on Double Attention Pointer Network


Abstract

A good document summary should capture the core content of the text, and research on automatic text summarization attempts to solve this problem. The encoder-decoder model is widely used in text summarization research, with soft attention used to obtain the required contextual semantic information during decoding. However, because it lacks access to the key features of the source, the generated summary can deviate from the core content. In this paper, we propose an encoder-decoder model based on a double attention pointer network (DAPT). In DAPT, a self-attention mechanism collects key information from the encoder, while soft attention and a pointer network generate more coherent core content; the fusion of both produces accurate and coherent summaries. In addition, an improved coverage mechanism is used to address the repetition problem and improve the quality of the generated summaries. Scheduled sampling and reinforcement learning (RL) are also combined into a new training method to optimize the model. Experiments on the CNN/Daily Mail and LCSTS datasets show that our model performs as well as many state-of-the-art models, and experimental analysis shows that it achieves higher summarization performance and reduces the occurrence of repetition.
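To make the decoding step the abstract describes more concrete, below is a minimal NumPy sketch of that kind of pipeline: self-attention over the encoder states to collect key features, a coverage-penalized soft attention, and a pointer-style mix of generating and copying. All dimensions, weights, and the exact fusion rule are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

# Toy dimensions (assumptions): source length, hidden size, vocab size
T, d, V = 6, 8, 20
rng = np.random.default_rng(0)

H = rng.normal(size=(T, d))      # encoder hidden states
s = rng.normal(size=(d,))        # current decoder state
coverage = np.zeros(T)           # running sum of past attention weights

# (1) Self-attention over the encoder states to collect key features
A = softmax(H @ H.T / np.sqrt(d))            # T x T attention weights
K = A @ H                                    # key-feature representation of the source

# (2) Soft attention from the decoder state, penalized by coverage
#     (fusing the two attention signals by summing scores is illustrative)
scores = H @ s + K @ s - coverage
alpha = softmax(scores)                      # attention distribution over source tokens
context = alpha @ H                          # context vector
coverage += alpha                            # update coverage to discourage repetition

# (3) Pointer mechanism: mix a generation distribution with a copy distribution
W_v = rng.normal(size=(V, 2 * d))                        # hypothetical output weights
P_vocab = softmax(W_v @ np.concatenate([s, context]))    # generation distribution
p_gen = 1 / (1 + np.exp(-(s @ context)))                 # scalar switch in (0, 1)

src_ids = rng.integers(0, V, size=T)         # source token ids (toy example)
P_copy = np.zeros(V)
np.add.at(P_copy, src_ids, alpha)            # scatter attention mass onto the vocab

P_final = p_gen * P_vocab + (1 - p_gen) * P_copy
print("next-token distribution sums to", P_final.sum())  # ~1.0
```

The coverage vector accumulates past attention weights, and subtracting it from the attention scores discourages the decoder from repeatedly attending to the same source positions, which is the intuition behind the coverage mechanism mentioned in the abstract.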

Bibliographic Information

  • Source
    《Quality Control, Transactions》 | 2020, Issue 2020 | pp. 11279-11288 | 10 pages
  • Author Affiliations

    Guangxi Normal Univ, Guangxi Key Lab Multi Source Informat Min & Secur, Guilin 541004, Peoples R China;

    Guangxi Normal Univ, Guangxi Key Lab Multi Source Informat Min & Secur, Guilin 541004, Peoples R China;

    Guangxi Normal Univ, Guangxi Key Lab Multi Source Informat Min & Secur, Guilin 541004, Peoples R China;

    Guangxi Normal Univ, Guangxi Key Lab Multi Source Informat Min & Secur, Guilin 541004, Peoples R China;

    Northwest Normal Univ, Coll Comp Sci & Engn, Lanzhou 730070, Peoples R China;

  • Indexing Information
  • Original Format: PDF
  • Language: eng
  • CLC Classification
  • Keywords

    Attention mechanism; neural networks; pointer network; text summarization;


