Journal: Neurocomputing

Improving abstractive summarization based on dynamic residual network with reinforce dependency


Abstract

The Seq2Seq abstractive summarization model based on long short-term memory (LSTM) is very effective for short-text summarization. However, LSTM is limited by long-term dependencies, which can result in the loss of salient information when long text is processed by an LSTM-based Seq2Seq model. To overcome this limitation, an encoder-decoder model based on a dynamic residual network is proposed in this work. According to the current decoding environment, the model dynamically selects an optimal state from the state history and establishes a connection between it and the current state, improving the LSTM's handling of long-sequence dependencies. Because the dynamic residual connections produce long-range connection-dependent words, a new method based on reinforcement learning is proposed to model the dependence between words, and it is incorporated into the model's training process. The model is evaluated on the CNN/Daily Mail and New York Times datasets, and the experimental results show that the proposed model achieves significant improvements in capturing long-term dependencies compared with the traditional LSTM-based Seq2Seq abstractive summarization model. (c) 2021 Elsevier B.V. All rights reserved.
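The core idea of the dynamic residual connection described above can be illustrated with a minimal sketch: at each decoding step, every past hidden state is scored against the current state, and the best-matching one is added back as a residual. This is a hypothetical, simplified illustration in pure Python (the paper's actual scoring function, network dimensions, and training details are not given in the abstract, so the dot-product score and the `dynamic_residual` helper below are assumptions for exposition only).

```python
import math

def softmax(xs):
    """Numerically stable softmax over a list of scores."""
    m = max(xs)
    exps = [math.exp(x - m) for x in xs]
    total = sum(exps)
    return [e / total for e in exps]

def dot(a, b):
    """Dot product of two equal-length vectors (lists of floats)."""
    return sum(x * y for x, y in zip(a, b))

def dynamic_residual(history, current):
    """Pick the past hidden state most relevant to the current decoding
    state (here scored by dot product), and add it to the current state
    as a residual connection. Returns the new state and the index of
    the selected past state."""
    scores = softmax([dot(h, current) for h in history])
    best = max(range(len(history)), key=lambda i: scores[i])
    new_state = [c + h for c, h in zip(current, history[best])]
    return new_state, best
```

For example, with a history of two 2-d states `[[1.0, 0.0], [0.0, 1.0]]` and a current state `[1.0, 0.0]`, the first history state scores highest, so it is selected and added as the residual, yielding `[2.0, 0.0]`. In the actual model, this selection would be made per decoding step and conditioned on the full decoding environment rather than a raw dot product.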

