IEEE Conference on Computer Vision and Pattern Recognition

Temporal Attention-Gated Model for Robust Sequence Classification

Abstract

Typical techniques for sequence classification are designed for well-segmented sequences that have been edited to remove noisy or irrelevant parts. Therefore, such methods cannot be easily applied to the noisy sequences expected in real-world applications. In this paper, we present the Temporal Attention-Gated Model (TAGM), which integrates ideas from attention models and gated recurrent networks to better deal with noisy or unsegmented sequences. Specifically, we extend the concept of attention models to measure the relevance of each observation (time step) of a sequence. We then use a novel gated recurrent network to learn the hidden representation for the final prediction. An important advantage of our approach is interpretability, since the temporal attention weights provide a meaningful value for the salience of each time step in the sequence. We demonstrate the merits of our TAGM approach, both for prediction accuracy and interpretability, on three different tasks: spoken digit recognition, text-based sentiment analysis and visual event recognition.
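The abstract only sketches the mechanism at a high level, so the following is a minimal, hypothetical PyTorch sketch of one way an attention-gated recurrent layer matching that description could look: a per-time-step attention weight, produced by a contextual scoring network, interpolates between keeping the previous hidden state and applying a candidate recurrent update, and the weights are returned for inspection. All names here (TAGMSketch, score_rnn, candidate, etc.) are illustrative assumptions, not the authors' released code.

```python
# Illustrative sketch of a temporal attention-gated recurrent layer, loosely
# following the abstract: an attention weight per time step measures the
# relevance of each observation and gates how much the hidden state changes.
# Names and architecture details are assumptions, not the paper's implementation.
import torch
import torch.nn as nn


class TAGMSketch(nn.Module):
    def __init__(self, input_size: int, hidden_size: int, num_classes: int):
        super().__init__()
        # Scores each observation; a bidirectional GRU lets the relevance of a
        # time step depend on its temporal context.
        self.score_rnn = nn.GRU(input_size, hidden_size, batch_first=True,
                                bidirectional=True)
        self.score_out = nn.Linear(2 * hidden_size, 1)
        # Candidate update for the gated recurrent hidden state.
        self.candidate = nn.Linear(input_size + hidden_size, hidden_size)
        self.classifier = nn.Linear(hidden_size, num_classes)

    def forward(self, x: torch.Tensor):
        # x: (batch, time, input_size)
        batch, time, _ = x.shape
        # One attention weight in [0, 1] per time step.
        scores, _ = self.score_rnn(x)
        attn = torch.sigmoid(self.score_out(scores)).squeeze(-1)  # (batch, time)

        h = x.new_zeros(batch, self.candidate.out_features)
        for t in range(time):
            a_t = attn[:, t].unsqueeze(-1)                        # (batch, 1)
            cand = torch.tanh(self.candidate(torch.cat([x[:, t], h], dim=-1)))
            # Irrelevant steps (a_t near 0) leave the hidden state untouched.
            h = (1.0 - a_t) * h + a_t * cand
        return self.classifier(h), attn  # logits and per-step attention weights


# Example usage with random data (shapes are arbitrary):
model = TAGMSketch(input_size=40, hidden_size=64, num_classes=10)
logits, attn = model(torch.randn(8, 120, 40))
```

Because irrelevant observations receive weights near zero, they barely move the hidden state, and the returned weights can be inspected per time step to see which parts of the sequence drove the prediction, which is the interpretability advantage the abstract emphasizes.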
