IEEE International Conference on Acoustics, Speech and Signal Processing

HKA: A Hierarchical Knowledge Attention Mechanism for Multi-Turn Dialogue System



Abstract

Generating informative responses by incorporating external knowledge into dialogue systems has attracted more and more attention. Most previous work equips single-turn dialogue systems to generate such responses. However, few works address incorporating knowledge into multi-turn systems, since the hierarchy of knowledge carried by the words and utterances in the context is ignored. Motivated by this, we propose a novel hierarchical knowledge attention (HKA) mechanism for open-domain multi-turn dialogue systems in this paper, which jointly exploits word-level and utterance-level attention. Experiments demonstrate that the proposed HKA incorporates more appropriate knowledge and enables state-of-the-art models to generate more informative responses. Further analysis shows that HKA improves the model's ability to manage dialogue state, especially when the number of dialogue turns is large.
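This page carries only the abstract, so the exact HKA formulation is not available here. As a rough, hypothetical sketch of what jointly applying word-level and utterance-level attention over external knowledge can look like, the PyTorch module below lets each encoded knowledge candidate attend over context words and over whole utterances, fuses the two views, and returns a single attended knowledge vector. All module names, tensor shapes, and the fusion step are illustrative assumptions, not the authors' implementation.

```python
# Hypothetical sketch of hierarchical (word- + utterance-level) knowledge attention.
# NOT the paper's HKA implementation; shapes and fusion are assumptions.
import torch
import torch.nn as nn
import torch.nn.functional as F


class HierarchicalKnowledgeAttention(nn.Module):
    def __init__(self, hidden_size: int):
        super().__init__()
        # Separate bilinear scorers for word-level and utterance-level attention.
        self.word_scorer = nn.Linear(hidden_size, hidden_size, bias=False)
        self.utt_scorer = nn.Linear(hidden_size, hidden_size, bias=False)
        self.fuse = nn.Linear(2 * hidden_size, hidden_size)

    def forward(self, word_states, utt_states, knowledge):
        """
        word_states: (batch, num_words, hidden) word-level context encodings
        utt_states:  (batch, num_utts,  hidden) utterance-level context encodings
        knowledge:   (batch, num_facts, hidden) encoded knowledge candidates
        Returns one attended knowledge vector per batch element.
        """
        # Word-level attention: each knowledge fact attends over context words.
        word_scores = torch.bmm(self.word_scorer(knowledge), word_states.transpose(1, 2))
        word_ctx = torch.bmm(F.softmax(word_scores, dim=-1), word_states)      # (B, F, H)

        # Utterance-level attention: each fact attends over whole utterances.
        utt_scores = torch.bmm(self.utt_scorer(knowledge), utt_states.transpose(1, 2))
        utt_ctx = torch.bmm(F.softmax(utt_scores, dim=-1), utt_states)          # (B, F, H)

        # Fuse both views, then score each fact against its fused context.
        fused = torch.tanh(self.fuse(torch.cat([word_ctx, utt_ctx], dim=-1)))   # (B, F, H)
        fact_scores = (knowledge * fused).sum(dim=-1)                           # (B, F)
        weights = F.softmax(fact_scores, dim=-1).unsqueeze(-1)                  # (B, F, 1)
        return (weights * knowledge).sum(dim=1)                                 # (B, H)


if __name__ == "__main__":
    B, W, U, K, H = 2, 20, 4, 6, 64
    hka = HierarchicalKnowledgeAttention(H)
    out = hka(torch.randn(B, W, H), torch.randn(B, U, H), torch.randn(B, K, H))
    print(out.shape)  # torch.Size([2, 64])
```

In this sketch the resulting knowledge vector would be fed to a response decoder; how the paper actually conditions generation on the attended knowledge is not described in the abstract.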

