In Conclusion Not Repetition: Comprehensive Abstractive Summarization With Diversified Attention Based On Determinantal Point Processes

Conference on Computational Natural Language Learning

Abstract

Various Seq2Seq learning models designed for machine translation have recently been applied to the abstractive summarization task. Although these models achieve high ROUGE scores, they are limited in generating comprehensive summaries with a high level of abstraction because their attention distributions degenerate. We introduce the Diverse Convolutional Seq2Seq Model (DivCNN Seq2Seq), which uses Determinantal Point Process methods (Micro DPPs and Macro DPPs) to produce attention distributions that account for both quality and diversity. Without breaking the end-to-end architecture, DivCNN Seq2Seq achieves a higher level of comprehensiveness than vanilla models and strong baselines. All reproducible code and datasets are available online.
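The abstract's core idea is a Determinantal Point Process that trades off item quality against mutual diversity, here applied to attention over source positions. The sketch below is a minimal illustration of that trade-off using an L-ensemble kernel with greedy MAP selection; it is not the paper's actual Micro/Macro DPP formulation, and the function name, kernel construction, and greedy procedure are all assumptions made for illustration.

```python
import numpy as np

def dpp_map_greedy(quality, features, k):
    """Greedily select up to k indices under an L-ensemble DPP.

    The kernel L = diag(q) @ S @ diag(q) combines per-item quality q
    with a cosine-similarity matrix S of item features, so subsets of
    high-quality but mutually dissimilar items get high log-determinant.
    """
    n = len(quality)
    # Cosine similarity between feature vectors (rows assumed nonzero).
    feats = features / np.linalg.norm(features, axis=1, keepdims=True)
    S = feats @ feats.T
    # Quality-modulated DPP kernel.
    L = np.outer(quality, quality) * S
    selected = []
    for _ in range(k):
        best, best_gain = None, -np.inf
        for i in range(n):
            if i in selected:
                continue
            idx = selected + [i]
            # log det of the candidate subset's kernel submatrix;
            # near-duplicate items make it singular (log det -> -inf).
            gain = np.linalg.slogdet(L[np.ix_(idx, idx)])[1]
            if gain > best_gain:
                best, best_gain = i, gain
        selected.append(best)
    return selected
```

An attention layer could then concentrate its mass on the selected positions (e.g. by masking the attention logits elsewhere and renormalizing), which discourages the degenerate case where nearly all probability collapses onto a few redundant source tokens.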
