Pacific-Rim Conference on Multimedia

Attention Window Aware Encoder-Decoder Model for Spoken Language Understanding


Abstract

The slot filling task, which aims to predict the semantic slot label for each word in a word sequence, is one of the main tasks in Spoken Language Understanding (SLU). In this paper, we propose a variation of the encoder-decoder model for sequence labelling. To better exploit label dependencies and prevent overfitting, we use a Long Short-Term Memory (LSTM) network as the encoder and a Gated Recurrent Unit (GRU) as the decoder. We further enhance the model with an attention mechanism constrained by an attention window, a novel feature that exploits a particularity of slot filling: each target label corresponds to specific words, and hence specific hidden units, in the encoder. We evaluate the proposed model on the standard ATIS corpus with different attention window sizes. An analysis of how the results vary with window size demonstrates the application potential of the attention window feature.
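The abstract describes the architecture only at a high level, so the following is a minimal PyTorch sketch of one plausible reading, not the authors' implementation. The class name WindowedAttentionTagger, the dot-product attention scoring, and the one-hot feedback of the previous predicted label are all illustrative assumptions; the point the sketch makes concrete is the attention window itself, i.e. restricting attention at output step t to encoder hidden states within a fixed window around input position t.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class WindowedAttentionTagger(nn.Module):
    """Hypothetical sketch: LSTM encoder, GRU decoder, and attention
    restricted to a window around the aligned input position."""

    def __init__(self, vocab_size, num_labels, emb_dim=100, hidden_dim=128, window=2):
        super().__init__()
        self.window = window
        self.num_labels = num_labels
        self.embed = nn.Embedding(vocab_size, emb_dim)
        self.encoder = nn.LSTM(emb_dim, hidden_dim, batch_first=True)
        # Decoder input: attention context plus one-hot of the previous label
        # (a simple way to model label dependencies, assumed here).
        self.decoder = nn.GRUCell(hidden_dim + num_labels, hidden_dim)
        self.out = nn.Linear(hidden_dim, num_labels)

    def forward(self, tokens):
        # tokens: (batch, seq_len) of word ids
        enc_out, _ = self.encoder(self.embed(tokens))        # (B, T, H)
        B, T, H = enc_out.shape
        h = enc_out[:, -1, :]                                # init decoder state
        prev_label = enc_out.new_zeros(B, self.num_labels)   # no previous label yet
        logits = []
        for t in range(T):
            # Attention window: only encoder states near position t are visible.
            lo, hi = max(0, t - self.window), min(T, t + self.window + 1)
            keys = enc_out[:, lo:hi, :]                              # (B, W, H)
            scores = torch.bmm(keys, h.unsqueeze(2)).squeeze(2)      # dot-product scores
            weights = F.softmax(scores, dim=1).unsqueeze(1)          # (B, 1, W)
            context = torch.bmm(weights, keys).squeeze(1)            # (B, H)
            h = self.decoder(torch.cat([context, prev_label], dim=1), h)
            step_logits = self.out(h)
            prev_label = F.one_hot(step_logits.argmax(dim=1),
                                   self.num_labels).float()
            logits.append(step_logits)
        return torch.stack(logits, dim=1)                    # (B, T, num_labels)

# Usage with made-up sizes (ATIS-scale vocabulary and label set):
model = WindowedAttentionTagger(vocab_size=1000, num_labels=127, window=2)
out = model(torch.randint(0, 1000, (4, 12)))  # -> shape (4, 12, 127)
```

Setting window to cover the whole sequence would recover ordinary full attention, so the window size is the knob whose effect the paper's analysis varies.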
