International Conference on Asian Language Processing

Attentive Siamese LSTM Network for Semantic Textual Similarity Measure



Abstract

Semantic Textual Similarity (STS) is important for many applications such as Plagiarism Detection (PD), Text Paraphrasing, and Information Retrieval (IR). Current methods for STS rely on statistical machine learning, and recent studies have shown that neural networks yield promising experimental results for STS. In this paper, we propose an Attentive Siamese Long Short-Term Memory (LSTM) network for measuring semantic textual similarity. Instead of external resources and handcrafted features, only raw sentence pairs and pre-trained word embeddings are needed as input. An attention mechanism is incorporated into the LSTM network to capture high-level semantic information. We demonstrate the effectiveness of our model by applying the architecture to different tasks spanning three corpora and three languages. Experimental results on all tasks and languages show that our method with the attention mechanism outperforms the baseline model, achieving a higher correlation with human annotation.
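To illustrate the kind of architecture the abstract describes, the following is a minimal PyTorch sketch of a Siamese LSTM whose two branches share weights, with a simple attention layer pooling the LSTM outputs into a sentence representation. The class name, layer sizes, attention formulation, and the exponential negative Manhattan distance used as the similarity score are assumptions made for illustration, not the authors' exact model.

    # Minimal sketch of an attentive Siamese LSTM for sentence similarity.
    # Hypothetical illustration only; hyperparameters and the similarity
    # function are assumptions, not the architecture from the paper.
    import torch
    import torch.nn as nn
    import torch.nn.functional as F

    class AttentiveSiameseLSTM(nn.Module):
        def __init__(self, vocab_size, embed_dim=300, hidden_dim=128):
            super().__init__()
            # In practice the embedding matrix would be initialized from
            # pre-trained word vectors; randomly initialized here.
            self.embedding = nn.Embedding(vocab_size, embed_dim, padding_idx=0)
            self.lstm = nn.LSTM(embed_dim, hidden_dim, batch_first=True)
            self.attn = nn.Linear(hidden_dim, 1)  # simple attention scorer

        def encode(self, token_ids):
            # token_ids: (batch, seq_len) integer word indices
            emb = self.embedding(token_ids)          # (batch, seq_len, embed_dim)
            outputs, _ = self.lstm(emb)              # (batch, seq_len, hidden_dim)
            scores = self.attn(torch.tanh(outputs))  # (batch, seq_len, 1)
            weights = F.softmax(scores, dim=1)       # attention over time steps
            return (weights * outputs).sum(dim=1)    # (batch, hidden_dim)

        def forward(self, sent_a, sent_b):
            # Both branches call the same encoder, so parameters are shared
            # (the Siamese weight tying).
            repr_a = self.encode(sent_a)
            repr_b = self.encode(sent_b)
            # Exponential negative Manhattan distance gives a score in (0, 1];
            # a common choice in Siamese LSTM work, assumed here.
            return torch.exp(-torch.norm(repr_a - repr_b, p=1, dim=1))

    # Usage: two batches of padded token-id sequences.
    model = AttentiveSiameseLSTM(vocab_size=10000)
    a = torch.randint(1, 10000, (4, 12))
    b = torch.randint(1, 10000, (4, 9))
    print(model(a, b))  # tensor of 4 similarity scores in (0, 1]

The weight sharing across the two encode calls is what makes the network Siamese, and the attention layer replaces the usual practice of keeping only the final LSTM hidden state as the sentence representation.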
