Venue: Annual Meeting of the Association for Computational Linguistics

Searching for Effective Neural Extractive Summarization: What Works and What's Next



Abstract

Recent years have seen remarkable success in the use of deep neural networks for text summarization. However, there is no clear understanding of why they perform so well, or how they might be improved. In this paper, we seek to better understand how neural extractive summarization systems could benefit from different types of model architectures, transferable knowledge, and learning schemas. Additionally, we find an effective way to improve current frameworks and achieve the state-of-the-art result on CNN/DailyMail by a large margin based on our observations and analyses. Hopefully, our work could provide more clues for future research on extractive summarization. Source code will be available on Github~1.
