Annual Conference of the International Speech Communication Association

A step beyond local observations with a dialog aware bidirectional GRU network for Spoken Language Understanding



Abstract

Recurrent Neural Network (RNN) architectures have recently become a very popular choice for Spoken Language Understanding (SLU) problems; however, they form a large family of different architectures that can furthermore be combined into more complex neural networks. In this work, we compare different recurrent networks, such as simple Recurrent Neural Networks (RNN), Long Short-Term Memory (LSTM) networks, Gated Recurrent Units (GRU) and their bidirectional versions, on the popular ATIS dataset and on MEDIA, a more complex French dataset. Additionally, we propose a novel method in which information about the presence of relevant word classes in the dialog history is combined with a bidirectional GRU, and we show that incorporating relevant word classes from the dialog history improves performance over recurrent networks that analyze only the current sentence.
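The dialog-aware combination described in the abstract can be pictured with a short sketch: a presence vector over relevant word classes from the dialog history is attached to each word representation of the current sentence before it is fed to a bidirectional GRU slot tagger. The code below is a minimal, hypothetical PyTorch illustration of that idea, assuming binary presence features and a linear per-token slot classifier; all class names, dimensions, and the exact feature encoding are assumptions for illustration, not the paper's implementation.

```python
# Hypothetical sketch (not the authors' code): a bidirectional GRU slot tagger
# whose word embeddings are augmented with a fixed-size dialog-history vector
# marking which relevant word classes were already observed earlier in the dialog.
import torch
import torch.nn as nn


class DialogAwareBiGRUTagger(nn.Module):
    def __init__(self, vocab_size, num_word_classes, num_slot_labels,
                 emb_dim=100, hidden_dim=128):
        super().__init__()
        self.embedding = nn.Embedding(vocab_size, emb_dim, padding_idx=0)
        # The history feature is a binary presence vector over word classes,
        # concatenated to every word embedding of the current sentence.
        self.gru = nn.GRU(emb_dim + num_word_classes, hidden_dim,
                          batch_first=True, bidirectional=True)
        self.classifier = nn.Linear(2 * hidden_dim, num_slot_labels)

    def forward(self, word_ids, history_classes):
        # word_ids:        (batch, seq_len)            token indices of the current sentence
        # history_classes: (batch, num_word_classes)   0/1 flags for classes seen in the history
        emb = self.embedding(word_ids)                        # (B, T, E)
        hist = history_classes.unsqueeze(1).expand(-1, emb.size(1), -1)
        rnn_in = torch.cat([emb, hist.float()], dim=-1)       # (B, T, E + C)
        out, _ = self.gru(rnn_in)                             # (B, T, 2H)
        return self.classifier(out)                           # per-token slot-label logits


if __name__ == "__main__":
    # Toy usage with made-up sizes: 1000-word vocabulary, 10 word classes,
    # 20 slot labels, a batch of 2 sentences of length 7.
    model = DialogAwareBiGRUTagger(vocab_size=1000, num_word_classes=10,
                                   num_slot_labels=20)
    words = torch.randint(1, 1000, (2, 7))
    history = torch.randint(0, 2, (2, 10))
    logits = model(words, history)
    print(logits.shape)  # torch.Size([2, 7, 20])
```

The design choice mirrored here is that the history features are constant across the sentence, so they act as a global conditioning signal that lets the tagger disambiguate labels the local sentence alone cannot resolve.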
