International Joint Conference on Artificial Intelligence

Meta-Learning for Low-resource Natural Language Generation in Task-oriented Dialogue Systems

Abstract

Natural language generation (NLG) is an essential component of task-oriented dialogue systems. Despite the recent success of neural approaches for NLG, they are typically developed for particular domains with rich annotated training examples. In this paper, we study NLG in a low-resource setting to generate sentences in new scenarios with only a handful of training examples. We formulate the problem from a meta-learning perspective, and propose a generalized optimization-based approach (Meta-NLG) based on the well-recognized model-agnostic meta-learning (MAML) algorithm. Meta-NLG defines a set of meta tasks, and directly incorporates the objective of adapting to new low-resource NLG tasks into the meta-learning optimization process. Extensive experiments are conducted on a large multi-domain dataset (MultiWoz) with diverse linguistic variations. We show that Meta-NLG significantly outperforms other training procedures in various low-resource configurations. We analyze the results, and demonstrate that Meta-NLG adapts extremely fast and well to low-resource situations.
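The core mechanism described in the abstract is an optimization-based meta-learner in the MAML family: each meta task pairs a small "support" set of annotated utterances with a held-out "query" set from the same low-resource NLG scenario, and the shared model initialization is updated so that a few gradient steps on the support set already reduce the query loss. The following is a minimal first-order sketch of such a meta-training step, not the authors' Meta-NLG implementation; it assumes a PyTorch model with a Hugging Face-style forward pass returning an object with a .loss attribute, and a hypothetical meta_tasks iterable of (support_batch, query_batch) dictionary pairs.

```python
import copy

import torch


def meta_train_step(model, meta_optimizer, meta_tasks, inner_lr=1e-3, inner_steps=1):
    """One meta-update over a batch of low-resource NLG tasks.

    First-order MAML sketch: adapt a copy of the shared initialization on each
    task's support set, score the adapted copy on the task's query set, and
    accumulate the resulting gradients back into the shared parameters.
    """
    meta_optimizer.zero_grad()

    for support_batch, query_batch in meta_tasks:
        # Inner loop: task-specific adaptation on a throwaway copy of the model.
        learner = copy.deepcopy(model)
        inner_optimizer = torch.optim.SGD(learner.parameters(), lr=inner_lr)
        for _ in range(inner_steps):
            inner_optimizer.zero_grad()
            # Assumed interface: forward returns an object with a .loss
            # (e.g. token-level NLL of the target utterances).
            support_loss = learner(**support_batch).loss
            support_loss.backward()
            inner_optimizer.step()

        # Outer objective: how well the adapted copy generalizes within the task.
        query_loss = learner(**query_batch).loss
        query_loss.backward()

        # First-order approximation: treat the adapted copy's gradients as the
        # meta-gradient of the shared initialization.
        for shared, adapted in zip(model.parameters(), learner.parameters()):
            if adapted.grad is None:
                continue
            shared.grad = adapted.grad if shared.grad is None else shared.grad + adapted.grad

    meta_optimizer.step()
```

In this sketch, meta_optimizer would be constructed once outside the training loop (e.g. torch.optim.Adam(model.parameters(), lr=1e-4)), and meta_tasks would hold one (support, query) pair per low-resource domain or dialogue-act combination. The first-order variant above drops MAML's second-order term for simplicity; the paper's Meta-NLG objective is defined on the full MAML-style update.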
