International Conference on Computational Linguistics

Knowledge Aware Emotion Recognition in Textual Conversations via Multi-Task Incremental Transformer


Abstract

Emotion recognition in textual conversations (ERTC) plays an important role in a wide range of applications, such as opinion mining and recommender systems. ERTC, however, is a challenging task. For one thing, speakers often rely on context and commonsense knowledge to express emotions; for another, most utterances in conversations carry neutral emotion, so the confusion between the few non-neutral utterances and the far more numerous neutral ones limits emotion recognition performance. In this paper, we propose a novel Knowledge Aware Incremental Transformer with Multi-task Learning (KAITML) to address these challenges. First, we devise a dual-level graph attention mechanism to leverage commonsense knowledge, which augments the semantic information of each utterance. We then apply the Incremental Transformer to encode multi-turn contextual utterances. Moreover, we are the first to introduce multi-task learning to alleviate the aforementioned confusion and thus further improve emotion recognition performance. Extensive experimental results show that our KAITML model outperforms state-of-the-art models across five benchmark datasets.
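The abstract does not spell out the multi-task objective, but the described idea (a fine-grained emotion classifier plus an auxiliary task separating neutral from non-neutral utterances) is commonly realized as a weighted sum of two cross-entropy losses. The sketch below illustrates that pattern; the function names and the `aux_weight` value are illustrative assumptions, not taken from the paper.

```python
import math

def cross_entropy(probs, gold):
    """Negative log-likelihood of the gold class."""
    return -math.log(probs[gold])

def multi_task_loss(emotion_probs, gold_emotion,
                    coarse_probs, gold_coarse, aux_weight=0.5):
    """Illustrative joint objective: fine-grained emotion classification
    plus an auxiliary neutral-vs-non-neutral task, weighted by aux_weight
    (a hypothetical hyperparameter, not from the paper)."""
    main_loss = cross_entropy(emotion_probs, gold_emotion)
    aux_loss = cross_entropy(coarse_probs, gold_coarse)
    return main_loss + aux_weight * aux_loss

# Example: 4 emotion classes, gold emotion = index 2;
# coarse gold = 1 (non-neutral).
loss = multi_task_loss([0.1, 0.2, 0.6, 0.1], 2, [0.3, 0.7], 1)
```

Training both heads jointly gives the shared encoder a signal that explicitly pushes neutral and non-neutral utterances apart, which is the confusion the abstract says the auxiliary task is meant to reduce.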
