Neural Paraphrase Generation with Stacked Residual LSTM Networks

International Conference on Computational Linguistics


Abstract

In this paper, we propose a novel neural approach for paraphrase generation. Conventional paraphrase generation methods either leverage hand-written rules and thesaurus-based alignments, or use statistical machine learning principles. To the best of our knowledge, this work is the first to explore deep learning models for paraphrase generation. Our primary contribution is a stacked residual LSTM network, where we add residual connections between LSTM layers. This allows for efficient training of deep LSTMs. We evaluate our model and other state-of-the-art deep learning models on three different datasets: PPDB, WikiAnswers, and MSCOCO. Evaluation results demonstrate that our model outperforms sequence-to-sequence, attention-based, and bidirectional LSTM models on BLEU, METEOR, TER, and an embedding-based sentence similarity metric.
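For illustration, here is a minimal PyTorch sketch of the stacking scheme the abstract describes: LSTM layers with residual (identity) connections added between them. All names, layer counts, and dimensions below are assumptions made for the sketch, not the authors' implementation; the paper's full model is a sequence-to-sequence encoder-decoder, and this shows only the residual stacking idea.

```python
import torch
import torch.nn as nn

class StackedResidualLSTM(nn.Module):
    """Minimal sketch (not the authors' code): a stack of LSTM layers
    with residual connections between layers, as described in the abstract."""

    def __init__(self, input_size: int, hidden_size: int, num_layers: int = 4):
        super().__init__()
        # The first layer maps input_size -> hidden_size; deeper layers keep
        # hidden_size so the element-wise residual addition is well defined.
        self.layers = nn.ModuleList(
            nn.LSTM(input_size if i == 0 else hidden_size,
                    hidden_size, batch_first=True)
            for i in range(num_layers)
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        out = x
        for i, lstm in enumerate(self.layers):
            h, _ = lstm(out)
            # Residual connection: add the layer's input to its output.
            # Skipped on the first layer, where the shapes can differ.
            out = h + out if i > 0 else h
        return out

# Toy usage: a batch of 8 sequences, length 20, 128-dim word embeddings.
model = StackedResidualLSTM(input_size=128, hidden_size=256, num_layers=4)
x = torch.randn(8, 20, 128)
y = model(x)  # (8, 20, 256): hidden states from the top layer
```

The identity shortcuts let gradients flow directly to lower layers, which is what makes deeper LSTM stacks trainable and underlies the abstract's claim of "efficient training of deep LSTMs".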
