Knowledge-Based Systems

Joint knowledge-powered topic level attention for a convolutional text summarization model

Abstract

Abstractive text summarization (ATS) often fails to capture salient information and preserve the original meaning of the content in the generated summaries due to a lack of background knowledge. We present a method that provides topic information, derived from the background knowledge of documents, to a deep learning-based summarization model. The method comprises a topic knowledge base (TKB) and a convolutional sequence network-based text summarization model with knowledge-powered topic level attention (KTOPAS). TKB employs conceptualization to retrieve the semantically salient knowledge of documents and the knowledge-powered topic model (KPTopicM) to generate coherent and meaningful topic information by utilizing the knowledge that best represents the documents. KTOPAS obtains knowledge-powered topic information (also called topic knowledge) from TKB and incorporates the topic knowledge into the convolutional sequence network through high-level topic attention to resolve the existing issues in ATS. KTOPAS introduces a tri-attention channel that jointly learns the attention of the source elements over the summary elements, the source elements over the topic knowledge, and the topic knowledge over the summary elements; it presents the contextual alignment information from these three aspects and combines them using the softmax function to generate the final probability distribution, enabling the model to produce coherent, concise, and human-like summaries with word diversity. Experiments on the CNN/Daily Mail and Gigaword datasets show that the proposed method consistently outperforms competing baselines. Moreover, TKB improves the effectiveness of the resulting summaries by providing topic knowledge to KTOPAS, demonstrating the quality of the proposed method. (C) 2021 Elsevier B.V. All rights reserved.
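
To make the tri-attention idea concrete, the sketch below shows one minimal way three pairwise alignments (summary-source, source-topic, topic-summary) could be scored and merged through a single softmax into an attention distribution over source positions. This is an illustrative assumption, not the authors' implementation: the function name, the dot-product scoring, the mean pooling over topic entries, and the tensor shapes are all hypothetical choices made only to show the combination step described in the abstract.

```python
import torch
import torch.nn.functional as F


def tri_attention(summary_states, source_states, topic_knowledge):
    """Illustrative tri-attention combination (hypothetical, not the paper's code).

    summary_states:  (batch, T_sum, d)  summary-side representations
    source_states:   (batch, T_src, d)  source-side representations
    topic_knowledge: (batch, T_top, d)  topic-knowledge embeddings from a TKB-like store

    Returns, for each summary step, an attention distribution over source
    positions obtained by adding the three channel scores and applying softmax.
    """
    # summary-over-source alignment scores: (batch, T_sum, T_src)
    sum_src = torch.bmm(summary_states, source_states.transpose(1, 2))

    # source-over-topic alignment, pooled over topic entries: (batch, T_src)
    src_top = torch.bmm(source_states, topic_knowledge.transpose(1, 2)).mean(dim=2)

    # topic-over-summary alignment, pooled over topic entries: (batch, T_sum)
    top_sum = torch.bmm(topic_knowledge, summary_states.transpose(1, 2)).mean(dim=1)

    # combine the three channels before a single softmax over source positions
    combined = sum_src + src_top.unsqueeze(1) + top_sum.unsqueeze(2)
    return F.softmax(combined, dim=-1)


# Tiny usage example with random tensors (shapes are arbitrary).
B, T_sum, T_src, T_top, d = 2, 10, 40, 5, 256
attn = tri_attention(torch.randn(B, T_sum, d),
                     torch.randn(B, T_src, d),
                     torch.randn(B, T_top, d))
print(attn.shape)  # torch.Size([2, 10, 40]); rows sum to 1 over the source axis
```

In the paper this distribution would feed a convolutional sequence-to-sequence decoder; here it is only meant to illustrate how three alignment signals can be fused into one probability distribution with a softmax, as the abstract describes.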
