
CASA-NLU: Context-Aware Self-Attentive Natural Language Understanding for Task-Oriented Chatbots



Abstract

Natural Language Understanding (NLU) is a core component of dialog systems. It typically involves two tasks: intent classification (IC) and slot labeling (SL), which are then followed by a dialog management (DM) component. Such NLU systems process utterances in isolation, thus pushing the problem of context management to DM. However, contextual information is critical to the correct prediction of intents and slots in a conversation. Prior work on contextual NLU has been limited in terms of the types of contextual signals used and the understanding of their impact on the model. In this work, we propose a context-aware self-attentive NLU (CASA-NLU) model that uses multiple signals, such as previous intents, slots, dialog acts, and utterances over a variable context window, in addition to the current user utterance. CASA-NLU outperforms a recurrent contextual NLU baseline on two conversational datasets, yielding a gain of up to 7% on the IC task for one of the datasets. Moreover, a non-contextual variant of CASA-NLU achieves state-of-the-art performance for the IC task on the standard public datasets Snips and ATIS.
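The core idea of fusing multiple dialog-context signals via self-attention can be sketched with a toy example. The sketch below is illustrative only, not the paper's actual model: the embeddings and intent prototypes are random placeholders, and the single unparameterized attention layer stands in for the full architecture.

```python
import numpy as np

def softmax(x, axis=-1):
    """Numerically stable softmax."""
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def self_attention(X):
    """Scaled dot-product self-attention over rows of X: (n_signals, d)."""
    d = X.shape[-1]
    scores = X @ X.T / np.sqrt(d)          # pairwise signal affinities
    return softmax(scores, axis=-1) @ X    # context-mixed representations

rng = np.random.default_rng(0)
d = 8

# Hypothetical embeddings for the context signals named in the abstract:
# previous intent, previous dialog act, previous utterance, current utterance.
context = np.stack([rng.normal(size=d) for _ in range(4)])

attended = self_attention(context)         # shape (4, d)

# Pool the attended signals and score against hypothetical intent prototypes
# to produce a contextual intent distribution.
pooled = attended.mean(axis=0)
intent_protos = rng.normal(size=(3, d))    # 3 placeholder intent classes
intent_probs = softmax(intent_protos @ pooled)
print(intent_probs.shape)                  # (3,)
```

Each context signal attends to every other one, so the intent prediction for the current utterance is conditioned on the whole (variable-length) context window rather than the utterance alone.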
