Attention-Based Recurrent Neural Network Models for Joint Intent Detection and Slot Filling

Abstract

Attention-based encoder-decoder neural network models have recently shown promising results in machine translation and speech recognition. In this work, we propose an attention-based neural network model for joint intent detection and slot filling, both of which are critical steps for many speech understanding and dialog systems. Unlike in machine translation and speech recognition, the alignment is explicit in slot filling. We explore different strategies for incorporating this alignment information into the encoder-decoder framework. Learning from the attention mechanism in the encoder-decoder model, we further propose introducing attention to alignment-based RNN models. Such attention provides additional information for intent classification and slot label prediction. Our independent task models achieve state-of-the-art intent detection error rate and slot filling F1 score on the benchmark ATIS task. Our joint training model further obtains a 0.56% absolute (23.8% relative) error reduction on intent detection and a 0.23% absolute gain on slot filling over the independent task models.
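The page carries no code, but as a rough illustration of the kind of architecture the abstract describes, the following is a minimal PyTorch sketch of an alignment-based bidirectional RNN with attention that jointly predicts per-word slot labels and a sentence-level intent. The class name, layer sizes, the bilinear attention form, and the mean-pooled intent readout are illustrative assumptions, not the authors' exact design.

# A minimal sketch (not the authors' implementation) of an alignment-based
# bidirectional RNN with attention for joint slot filling and intent detection.
import torch
import torch.nn as nn

class JointIntentSlotModel(nn.Module):
    def __init__(self, vocab_size, embed_dim, hidden_dim, num_slots, num_intents):
        super().__init__()
        self.embedding = nn.Embedding(vocab_size, embed_dim)
        # Bidirectional LSTM encoder: the hidden state at step i is aligned
        # with input word i, reflecting the explicit alignment in slot filling.
        self.encoder = nn.LSTM(embed_dim, hidden_dim, batch_first=True,
                               bidirectional=True)
        enc_dim = 2 * hidden_dim
        self.attn = nn.Linear(enc_dim, enc_dim)            # bilinear attention scores
        self.slot_out = nn.Linear(2 * enc_dim, num_slots)  # per-step slot labels
        self.intent_out = nn.Linear(enc_dim, num_intents)  # sentence-level intent

    def forward(self, tokens):
        # tokens: (batch, seq_len) integer word ids
        h, _ = self.encoder(self.embedding(tokens))           # (batch, seq, enc_dim)
        # Each step attends over all encoder states.
        scores = torch.bmm(self.attn(h), h.transpose(1, 2))   # (batch, seq, seq)
        context = torch.bmm(torch.softmax(scores, dim=-1), h)
        # Slot label from the aligned encoder state plus its attention context.
        slot_logits = self.slot_out(torch.cat([h, context], dim=-1))
        # Intent from a mean-pooled summary of the per-step contexts.
        intent_logits = self.intent_out(context.mean(dim=1))
        return slot_logits, intent_logits

# Toy usage:
model = JointIntentSlotModel(vocab_size=1000, embed_dim=64, hidden_dim=64,
                             num_slots=20, num_intents=5)
slots, intent = model(torch.randint(0, 1000, (2, 12)))
print(slots.shape, intent.shape)  # torch.Size([2, 12, 20]) torch.Size([2, 5])

The key property this sketch tries to show is the one the abstract emphasizes: each encoder state stays aligned with its input word, so slot labels are emitted per step, while the attention-weighted contexts supply the additional sentence-level signal used for intent classification.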
