Conference proceedings: Natural Language Processing and Chinese Computing

Recurrent Neural Networks with External Memory for Spoken Language Understanding



Abstract

Recurrent Neural Networks (RNNs) have become increasingly popular for the task of language understanding. In this task, a semantic tagger is deployed to associate a semantic label with each word in an input sequence. The success of RNNs may be attributed to their ability to memorise long-term dependencies that relate the current semantic label prediction to observations many time steps away. However, the memory capacity of simple RNNs is limited by the vanishing and exploding gradient problem. We propose to use an external memory to improve the memorisation capability of RNNs. Experiments on the ATIS dataset demonstrate that the proposed model achieves state-of-the-art results. Detailed analysis may provide insights for future research.
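The abstract describes augmenting an RNN semantic tagger with an external memory that the network reads at each time step. A minimal sketch of that idea, using content-based attention over memory slots — this is a hypothetical illustration, not the paper's exact architecture, and all class, parameter, and weight names here are invented:

```python
import numpy as np

rng = np.random.default_rng(0)

def softmax(x):
    e = np.exp(x - x.max())
    return e / e.sum()

class MemoryRNNTagger:
    """Elman-style RNN whose hidden state is augmented at each step with a
    read from an external memory (illustrative sketch only)."""

    def __init__(self, vocab, n_labels, hidden=16, mem_slots=8):
        self.E = rng.normal(0, 0.1, (vocab, hidden))      # word embeddings
        self.Wx = rng.normal(0, 0.1, (hidden, hidden))    # input weights
        self.Wh = rng.normal(0, 0.1, (hidden, hidden))    # recurrent weights
        self.Wm = rng.normal(0, 0.1, (hidden, hidden))    # memory-read weights
        self.Wo = rng.normal(0, 0.1, (hidden, n_labels))  # output weights
        self.M = np.zeros((mem_slots, hidden))            # external memory
        self.hidden = hidden
        self.n_labels = n_labels

    def forward(self, word_ids):
        """Tag a sequence of word ids; returns one label id per word."""
        h = np.zeros(self.hidden)
        labels = []
        for t, w in enumerate(word_ids):
            x = self.E[w]
            # Content-based read: attend over memory slots with the state.
            att = softmax(self.M @ h)
            read = att @ self.M
            # Recurrent update conditioned on input, state, and memory read.
            h = np.tanh(self.Wx @ x + self.Wh @ h + self.Wm @ read)
            # Write the new state into a slot (simple FIFO overwrite).
            self.M[t % self.M.shape[0]] = h
            labels.append(int(np.argmax(self.Wo.T @ h)))
        return labels
```

With random, untrained weights this only demonstrates the data flow: the memory lets the label prediction at step `t` depend on hidden states written many steps earlier, rather than only on the single recurrent state.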

Bibliographic details

  • Venue: Nanchang (CN)
  • Author affiliations

    Department of Systems Engineering and Engineering Management, The Chinese University of Hong Kong, Hong Kong, Hong Kong;

    Microsoft Research, Hong Kong, Hong Kong;

    Department of Systems Engineering and Engineering Management, The Chinese University of Hong Kong, Hong Kong, Hong Kong;

    Department of Systems Engineering and Engineering Management, The Chinese University of Hong Kong, Hong Kong, Hong Kong;

  • Format: PDF
  • Language: English

