Conference on Computational Natural Language Learning

In Conclusion Not Repetition: Comprehensive Abstractive Summarization With Diversified Attention Based On Determinantal Point Processes



Abstract

Various Seq2Seq learning models designed for machine translation have recently been applied to the abstractive summarization task. Although these models achieve high ROUGE scores, their degenerate attention distributions prevent them from generating comprehensive summaries with a high level of abstraction. We introduce the Diverse Convolutional Seq2Seq Model (DivCNN Seq2Seq), which uses Determinantal Point Process methods (Micro DPPs and Macro DPPs) to produce attention distributions that account for both quality and diversity. Without breaking the end-to-end architecture, DivCNN Seq2Seq achieves a higher level of comprehensiveness than vanilla models and strong baselines. All reproducible code and datasets are available online.
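The abstract's core idea is scoring a set of attention positions by both quality (how relevant each position is) and diversity (how different the selected positions are from each other), which is exactly what a DPP kernel encodes. As a minimal sketch, and not the paper's actual Micro/Macro DPP algorithms, the toy example below builds a quality-times-similarity kernel over hypothetical encoder states and selects a diverse subset with greedy MAP inference; the quality scores and state vectors are invented for illustration.

```python
import numpy as np

def greedy_dpp_map(L, k):
    """Greedy MAP inference for a DPP: at each step, add the item that
    maximizes the log-determinant of the selected kernel submatrix."""
    n = L.shape[0]
    selected = []
    for _ in range(k):
        best_i, best_gain = None, -np.inf
        for i in range(n):
            if i in selected:
                continue
            idx = selected + [i]
            sign, logdet = np.linalg.slogdet(L[np.ix_(idx, idx)])
            if sign > 0 and logdet > best_gain:
                best_gain, best_i = logdet, i
        if best_i is None:  # no positive-determinant extension left
            break
        selected.append(best_i)
    return selected

# Toy setup: 5 attention positions with hypothetical 8-dim encoder states.
rng = np.random.default_rng(0)
states = rng.normal(size=(5, 8))
norms = np.linalg.norm(states, axis=1)
S = (states @ states.T) / np.outer(norms, norms)  # cosine similarity
q = np.array([0.9, 0.85, 0.3, 0.8, 0.2])          # quality, e.g. raw attention
L = np.outer(q, q) * S                            # L_ij = q_i * S_ij * q_j
chosen = greedy_dpp_map(L, k=3)                   # diverse high-quality subset
```

The determinant of the submatrix grows with high quality scores but shrinks when two selected positions are similar, so the greedy step trades the two off, as opposed to plain top-k attention, which would happily pick near-duplicate positions.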
