
Sequence to Backward and Forward Sequences: A Content-Introducing Approach to Generative Short-Text Conversation



Abstract

Using neural networks to generate replies in human-computer dialogue systems has attracted increasing attention over the past few years. However, the performance is not satisfactory: the neural network tends to generate safe, universally relevant replies that carry little meaning. In this paper, we propose a content-introducing approach to neural network-based generative dialogue systems. We first use pointwise mutual information (PMI) to predict a noun as a keyword, reflecting the main gist of the reply. We then propose seq2BF, a "sequence to backward and forward sequences" model, which generates a reply containing the given keyword. Experimental results show that our approach significantly outperforms traditional sequence-to-sequence models in terms of human evaluation and the entropy measure, and that the predicted keyword can appear at an appropriate position in the reply.
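As a rough illustration of the keyword-prediction step, the sketch below (not taken from the paper; the toy corpus, noun list, and whitespace tokenization are placeholder assumptions) scores each candidate reply noun by its summed PMI with the query words and returns the highest-scoring one.

```python
from collections import Counter
from itertools import product
import math

# Toy query-reply corpus and a hypothetical noun vocabulary; the real system
# would use a large conversation corpus and a POS tagger to find nouns.
corpus = [
    ("do you like music", "i play the guitar every day"),
    ("what do you eat", "i had noodles for lunch"),
    ("do you like music", "rock music is my favourite"),
]
nouns = {"guitar", "noodles", "lunch", "music"}

# Co-occurrence counts between query words and reply words.
q_count, r_count, joint_count, n_pairs = Counter(), Counter(), Counter(), 0
for query, reply in corpus:
    q_words, r_words = set(query.split()), set(reply.split())
    n_pairs += 1
    for wq in q_words:
        q_count[wq] += 1
    for wr in r_words:
        r_count[wr] += 1
    for wq, wr in product(q_words, r_words):
        joint_count[(wq, wr)] += 1

def pmi(wq, wr):
    """PMI(wq, wr) = log p(wq, wr) / (p(wq) p(wr)); 0 if never co-observed."""
    if joint_count[(wq, wr)] == 0:
        return 0.0
    p_joint = joint_count[(wq, wr)] / n_pairs
    return math.log(p_joint / ((q_count[wq] / n_pairs) * (r_count[wr] / n_pairs)))

def predict_keyword(query):
    """Pick the noun with the highest total PMI against the query words."""
    q_words = query.split()
    return max(nouns, key=lambda wr: sum(pmi(wq, wr) for wq in q_words))

print(predict_keyword("do you like music"))  # e.g. "guitar" or "music"
```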
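The backward-and-forward decoding can be pictured schematically as follows. This is only a sketch of how the reply is assembled around the keyword: `backward_step` and `forward_step` are hypothetical stubs standing in for the paper's neural decoders, which condition on the query and the partially generated sequence.

```python
from typing import List

def backward_step(context: List[str]) -> str:
    # Hypothetical stub for the backward decoder; returns the next word
    # of the reply half *before* the keyword, generated in reverse order.
    canned = {"guitar": "the", "the": "play", "play": "i"}
    return canned.get(context[-1], "<eos>")

def forward_step(context: List[str]) -> str:
    # Hypothetical stub for the forward decoder; returns the next word
    # of the reply half *after* the keyword.
    canned = {"guitar": "every", "every": "day"}
    return canned.get(context[-1], "<eos>")

def seq2bf_decode(keyword: str, max_len: int = 10) -> List[str]:
    """Generate the words before the keyword (backwards), reverse them,
    then generate the words after the keyword."""
    backward = [keyword]
    while len(backward) < max_len:
        w = backward_step(backward)
        if w == "<eos>":
            break
        backward.append(w)
    reply = backward[::-1]          # e.g. ["i", "play", "the", "guitar"]

    forward = [keyword]
    while len(forward) < max_len:
        w = forward_step(forward)
        if w == "<eos>":
            break
        forward.append(w)
    reply += forward[1:]            # append the words after the keyword
    return reply

print(" ".join(seq2bf_decode("guitar")))  # "i play the guitar every day"
```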
