Annual Meeting of the Association for Computational Linguistics

The Summary Loop: Learning to Write Abstractive Summaries Without Examples


Abstract

This work presents a new approach to unsupervised abstractive summarization based on maximizing a combination of coverage and fluency for a given length constraint. It introduces a novel method that encourages the inclusion of key terms from the original document in the summary: key terms are masked out of the original document and must be filled in by a coverage model using the current generated summary. A novel unsupervised training procedure leverages this coverage model along with a fluency model to generate and score summaries. When tested on popular news summarization datasets, the method outperforms previous unsupervised methods by more than 2 R-1 points, and approaches the results of competitive supervised methods. Our model attains higher levels of abstraction, with copied passages roughly half as long as in prior work, and learns to compress and merge sentences without supervision.
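The coverage idea described in the abstract can be sketched with a toy model: mask the document's key terms, then check how many of them can be recovered from the candidate summary alone. Everything below is an illustrative assumption, not the paper's implementation — the paper uses a learned fill-in-the-blank coverage model, whereas this sketch substitutes a frequency-based keyword heuristic and a simple word-overlap check.

```python
import re
from collections import Counter

# Minimal stopword list for the toy keyword heuristic (assumption,
# not from the paper).
STOPWORDS = {"the", "a", "an", "of", "to", "in", "and", "is", "by", "for", "on"}

def key_terms(document, k=5):
    """Toy keyword extractor: the k most frequent non-stopword tokens."""
    words = [w for w in re.findall(r"[a-z]+", document.lower())
             if w not in STOPWORDS]
    return [w for w, _ in Counter(words).most_common(k)]

def coverage_score(document, summary):
    """Fraction of the document's masked key terms that can be 'filled in'
    from the summary alone -- a crude stand-in for the paper's learned
    coverage model."""
    terms = key_terms(document)
    summary_words = set(re.findall(r"[a-z]+", summary.lower()))
    recovered = sum(1 for t in terms if t in summary_words)
    return recovered / max(len(terms), 1)

doc = "The cat sat on the mat. The cat chased the mouse across the mat."
print(coverage_score(doc, "A cat chased a mouse."))    # partial coverage
print(coverage_score(doc, "Unrelated text entirely."))  # zero coverage
```

In the paper's training loop, a score of this kind (combined with a fluency score) is what drives the summarizer to keep salient terms while staying under the length constraint; here it merely illustrates why a summary that retains key terms scores higher than one that does not.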
