
Selective Encoding for Abstractive Sentence Summarization


Abstract

We propose a selective encoding model to extend the sequence-to-sequence framework for abstractive sentence summarization. It consists of a sentence encoder, a selective gate network, and an attention-equipped decoder. The sentence encoder and decoder are built with recurrent neural networks. The selective gate network constructs a second-level sentence representation by controlling the information flow from encoder to decoder. This second-level representation is tailored to the sentence summarization task, which leads to better performance. We evaluate our model on the English Gigaword, DUC 2004, and MSR abstractive sentence summarization datasets. The experimental results show that the proposed selective encoding model outperforms the state-of-the-art baseline models.
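A minimal sketch of the selective gate idea described above, written in PyTorch. The class name, layer shapes, and the use of a bidirectional-RNN-style sentence vector are illustrative assumptions for exposition, not the authors' released implementation.

```python
# Illustrative sketch (assumed names/shapes), not the paper's official code.
import torch
import torch.nn as nn


class SelectiveGate(nn.Module):
    """Builds a second-level sentence representation by gating each
    first-level encoder hidden state with a sentence-level vector."""

    def __init__(self, hidden_size: int):
        super().__init__()
        self.linear_h = nn.Linear(hidden_size, hidden_size, bias=False)
        self.linear_s = nn.Linear(hidden_size, hidden_size, bias=True)

    def forward(self, enc_states: torch.Tensor, sent_repr: torch.Tensor) -> torch.Tensor:
        # enc_states: (batch, seq_len, hidden)  -- first-level encoder outputs
        # sent_repr:  (batch, hidden)           -- e.g. final states of a BiRNN encoder
        gate = torch.sigmoid(
            self.linear_h(enc_states) + self.linear_s(sent_repr).unsqueeze(1)
        )
        # Element-wise gating controls how much of each word's representation
        # flows from the encoder to the attention-equipped decoder.
        return enc_states * gate


# Usage sketch with random tensors standing in for encoder outputs.
if __name__ == "__main__":
    batch, seq_len, hidden = 2, 7, 8
    enc_states = torch.randn(batch, seq_len, hidden)
    sent_repr = torch.randn(batch, hidden)
    gated = SelectiveGate(hidden)(enc_states, sent_repr)
    print(gated.shape)  # torch.Size([2, 7, 8])
```

The gated states would then replace the raw encoder outputs as the memory that the attention-equipped decoder attends over.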
