International Conference on Recent Advances in Natural Language Processing

Self-Attentional Models Application in Task-Oriented Dialogue Generation Systems


Abstract

Self-attentional models are a new paradigm for sequence modelling tasks that differs from common sequence modelling methods, such as recurrence-based and convolution-based sequence learning, in that the architecture is based solely on the attention mechanism. Self-attentional models have been used to build state-of-the-art models for many NLP tasks, such as neural machine translation, but their use has not yet been explored for training end-to-end task-oriented dialogue generation systems. In this study, we apply these models to three different datasets for training task-oriented chatbots. Our findings show that self-attentional models can be exploited to create end-to-end task-oriented chatbots that not only achieve higher evaluation scores than recurrence-based models, but also do so more efficiently.
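For reference, the core operation the abstract describes, attention as the sole sequence-modelling primitive, reduces to scaled dot-product self-attention. Below is a minimal single-head NumPy sketch of that operation; all names, dimensions, and weights are illustrative and not taken from the paper.

```python
import numpy as np

def self_attention(x, w_q, w_k, w_v):
    """Single-head scaled dot-product self-attention.

    x:             (seq_len, d_model) input token representations
    w_q, w_k, w_v: (d_model, d_k) learned projection matrices
    returns:       (seq_len, d_k) context vectors
    """
    q, k, v = x @ w_q, x @ w_k, x @ w_v
    scores = q @ k.T / np.sqrt(q.shape[-1])   # pairwise token compatibility
    # Row-wise softmax: each token attends over every position in the sequence.
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)
    return weights @ v                        # attention-weighted mixture of values

# Toy usage: 5 tokens, model width 8, head width 4 (all sizes illustrative).
rng = np.random.default_rng(0)
x = rng.normal(size=(5, 8))
w_q, w_k, w_v = (rng.normal(size=(8, 4)) for _ in range(3))
print(self_attention(x, w_q, w_k, w_v).shape)  # -> (5, 4)
```

Because every position attends to every other position in a single step, such models avoid the sequential dependency of recurrence-based encoders, which is the efficiency advantage the abstract reports for task-oriented chatbot training.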
