International Conference on Semantics, Knowledge and Grids

KB-Transformer: Incorporating Knowledge into End-to-End Task-Oriented Dialog Systems


Abstract

Existing end-to-end task-oriented dialogue systems lack proper support for knowledge bases (KBs), and training RNN-based seq2seq models is time-consuming. In this paper, we propose a novel parallel computing framework, KB-Transformer, that incorporates a KB and can be trained faster than RNN seq2seq models. A key contribution is a multi-head key-value memory network, proposed here for the first time, which comprehensively encodes the semantic information of the KB and incorporates it into task-oriented dialog systems from multiple dimensions and sub-spaces. KB-Transformer combines a memory network with the Transformer, implementing a parallel dialogue framework based entirely on attention mechanisms. We show that KB-Transformer can be trained faster and attains better performance than RNN seq2seq models on two different task-oriented dialog datasets.
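The abstract does not spell out the architecture. As a minimal sketch, assuming the multi-head key-value memory network behaves like multi-head scaled dot-product attention over embedded KB (key, value) pairs, one memory read could be written in PyTorch as below. All class, parameter, and shape choices here are illustrative assumptions, not the authors' code.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class MultiHeadKVMemory(nn.Module):
    """Hypothetical multi-head key-value memory read: a dialogue-state
    query attends over KB entries encoded as (key, value) vectors,
    independently in several attention sub-spaces (heads)."""

    def __init__(self, d_model: int, n_heads: int):
        super().__init__()
        assert d_model % n_heads == 0
        self.n_heads = n_heads
        self.d_head = d_model // n_heads
        self.q_proj = nn.Linear(d_model, d_model)  # query from dialogue state
        self.k_proj = nn.Linear(d_model, d_model)  # KB keys (e.g. subject + relation)
        self.v_proj = nn.Linear(d_model, d_model)  # KB values (e.g. object entity)
        self.out = nn.Linear(d_model, d_model)

    def forward(self, query, kb_keys, kb_values):
        # query: (B, d_model); kb_keys, kb_values: (B, N, d_model)
        B, N, D = kb_keys.shape
        q = self.q_proj(query).view(B, self.n_heads, 1, self.d_head)
        k = self.k_proj(kb_keys).view(B, N, self.n_heads, self.d_head).transpose(1, 2)
        v = self.v_proj(kb_values).view(B, N, self.n_heads, self.d_head).transpose(1, 2)
        # scaled dot-product attention per head over the N KB entries
        scores = (q @ k.transpose(-2, -1)) / (self.d_head ** 0.5)  # (B, H, 1, N)
        attn = F.softmax(scores, dim=-1)
        read = (attn @ v).transpose(1, 2).reshape(B, D)  # concatenate head reads
        return self.out(read)

# Usage with toy shapes: 4 dialogues, 20 KB entries each, d_model = 128.
mem = MultiHeadKVMemory(d_model=128, n_heads=8)
q = torch.randn(4, 128)            # encoded dialogue context
keys = torch.randn(4, 20, 128)     # embedded KB keys
vals = torch.randn(4, 20, 128)     # embedded KB values
out = mem(q, keys, vals)           # (4, 128) KB-informed representation
```

Under these assumptions, the query is split across `n_heads` sub-spaces, each head attends over the KB entries independently, and the per-head reads are concatenated, which matches the abstract's claim of incorporating the KB "from multiple dimensions and sub-spaces".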
