Annual Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies

How Time Matters: Learning Time-Decay Attention for Contextual Spoken Language Understanding in Dialogues



Abstract

Spoken language understanding (SLU) is an essential component in conversational systems. Most SLU components treat each utterance independently, leaving downstream components to aggregate multi-turn information in separate phases. To avoid error propagation and utilize contexts effectively, prior work leveraged dialogue history for contextual SLU. However, most previous models attended only to the related content in history utterances and ignored their temporal information. In dialogues, it is intuitive that the most recent utterances are more important than the least recent ones; in other words, time-aware attention should decay over time. Therefore, this paper designs and investigates various types of time-decay attention at the sentence level and the speaker level, and further proposes a flexible universal time-decay attention mechanism. Experiments on the benchmark Dialogue State Tracking Challenge (DSTC4) dataset show that the proposed time-decay attention mechanisms significantly improve the state-of-the-art model's contextual understanding performance.
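To make the idea of decaying time-aware attention concrete, the following is a minimal sketch of attention weights that shrink with dialogue distance. The function names, the specific convex/linear/concave parameterizations, and the parameters `a` and `b` are illustrative assumptions, not the paper's exact formulations:

```python
import numpy as np

def time_decay_weights(distances, kind="convex", a=1.0, b=1.0):
    """Attention weights that decay with dialogue distance.

    distances: number of turns back in history (1 = most recent turn).
    Hypothetical decay shapes (not the paper's exact forms):
      convex:  w ∝ a / d**b        (sharp drop near the present, long tail)
      linear:  w ∝ max(1 - d/a, ε) (uniform decline, then a small floor)
      concave: w ∝ exp(-(d/a)**b)  (gentle drop near the present)
    """
    d = np.asarray(distances, dtype=float)
    if kind == "convex":
        w = a / np.power(d, b)
    elif kind == "linear":
        w = np.maximum(1.0 - d / a, 1e-8)
    elif kind == "concave":
        w = np.exp(-np.power(d / a, b))
    else:
        raise ValueError(f"unknown decay kind: {kind}")
    return w / w.sum()  # normalize to an attention distribution

def attend(history_vecs, distances, kind="convex"):
    """Time-decay-weighted sum of history utterance vectors."""
    w = time_decay_weights(distances, kind)
    return w @ np.asarray(history_vecs, dtype=float)
```

With `kind="convex"`, an utterance one turn back receives a larger weight than one three turns back, matching the intuition that recent context matters more; the paper's universal mechanism can be thought of as learning a flexible combination of such decay shapes.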


