Attention-Based Neural Text Segmentation

European Conference on IR Research

Abstract

Text segmentation plays an important role in various Natural Language Processing (NLP) tasks such as summarization, context understanding, document indexing, and document noise removal. Previous methods for this task require manual feature engineering, large amounts of memory, and long execution times. To the best of our knowledge, this paper is the first to present a novel supervised neural approach for text segmentation. Specifically, we propose an attention-based bidirectional LSTM model in which sentence embeddings are learned using CNNs and segments are predicted based on contextual information. This model can automatically handle variable-sized context information. Compared to the existing competitive baselines, the proposed model shows a performance improvement of ~7% in WinDiff score on three benchmark datasets.

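The abstract outlines the architecture at a high level: CNNs produce sentence embeddings, an attention-equipped bidirectional LSTM reads the sequence of sentence embeddings, and each sentence is classified as starting a new segment or not. The PyTorch sketch below is a minimal illustration of one way such a pipeline could be wired together; it is not the authors' implementation, and the class names (SentenceCNN, AttnBiLSTMSegmenter), the hyperparameters, and the single global attention layer are illustrative assumptions only.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class SentenceCNN(nn.Module):
    """CNN sentence encoder: word embeddings -> 1D convolutions -> max-pooling."""
    def __init__(self, vocab_size, emb_dim=100, num_filters=64, kernel_sizes=(2, 3, 4)):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, emb_dim, padding_idx=0)
        self.convs = nn.ModuleList(
            nn.Conv1d(emb_dim, num_filters, k) for k in kernel_sizes
        )

    def forward(self, word_ids):
        # word_ids: (batch, num_sentences, sentence_len) token indices
        b, s, l = word_ids.shape
        x = self.embed(word_ids.view(b * s, l)).transpose(1, 2)    # (b*s, emb_dim, len)
        pooled = [F.relu(conv(x)).max(dim=2).values for conv in self.convs]
        return torch.cat(pooled, dim=1).view(b, s, -1)             # (b, s, sent_dim)

class AttnBiLSTMSegmenter(nn.Module):
    """BiLSTM over sentence embeddings plus an attention-weighted context vector;
    each sentence receives a binary prediction: does it start a new segment?"""
    def __init__(self, vocab_size, hidden=128, num_filters=64, kernel_sizes=(2, 3, 4)):
        super().__init__()
        self.encoder = SentenceCNN(vocab_size, num_filters=num_filters,
                                   kernel_sizes=kernel_sizes)
        sent_dim = num_filters * len(kernel_sizes)
        self.bilstm = nn.LSTM(sent_dim, hidden, batch_first=True, bidirectional=True)
        self.attn = nn.Linear(2 * hidden, 1)   # scores each position for the context vector
        self.out = nn.Linear(4 * hidden, 2)    # [h_t ; context] -> boundary / no boundary

    def forward(self, word_ids):
        sents = self.encoder(word_ids)                    # (b, s, sent_dim)
        h, _ = self.bilstm(sents)                         # (b, s, 2*hidden)
        weights = torch.softmax(self.attn(h), dim=1)      # attention over sentences
        context = (weights * h).sum(dim=1, keepdim=True).expand_as(h)
        return self.out(torch.cat([h, context], dim=-1))  # (b, s, 2) boundary logits
```

Training such a model would typically minimize per-sentence cross-entropy against gold boundary labels, with WinDiff computed over the predicted boundaries at evaluation time; the paper's actual model additionally handles variable-sized context around each candidate sentence, which this single global-attention simplification does not attempt to reproduce.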