
Natural language generation through character-based recurrent neural networks with finite-state prior knowledge


Abstract

A method and a system are provided for generating a target character sequence from a semantic representation comprising a sequence of characters. The method includes adapting a target background model, built from a vocabulary of words, to form an adapted background model. The adapted background model accepts subsequences of the input semantic representation as well as words from the vocabulary. The input semantic representation is represented as a sequence of character embeddings, which are input to an encoder. The encoder encodes each of the character embeddings to generate a respective character representation. A decoder then generates a target sequence of characters based on the set of character representations. At each of a plurality of time steps, the next character in the target sequence is selected as a function of the previously generated character(s) of the target sequence and the adapted background model.
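The abstract describes a character-level encoder-decoder whose per-step character distribution is combined with an adapted background model (a finite-state prior built from a word vocabulary and subsequences of the input semantic representation). The sketch below is a minimal, hypothetical illustration of that decoding loop in PyTorch; the module names, dimensions, the uniform `background_log_prior` stand-in, and the example input string are assumptions for illustration, not the patent's actual implementation.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

# Hypothetical character inventory; the first three entries are special symbols.
VOCAB = ["<pad>", "<bos>", "<eos>"] + list("abcdefghijklmnopqrstuvwxyz _")
CHAR2ID = {c: i for i, c in enumerate(VOCAB)}

class CharEncoder(nn.Module):
    """Embeds each input character and encodes the sequence with a GRU."""
    def __init__(self, vocab_size, emb_dim=32, hid_dim=64):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, emb_dim)
        self.rnn = nn.GRU(emb_dim, hid_dim, batch_first=True)

    def forward(self, char_ids):                      # (B, T)
        emb = self.embed(char_ids)                    # (B, T, E) character embeddings
        outputs, hidden = self.rnn(emb)               # per-character representations
        return outputs, hidden

class CharDecoder(nn.Module):
    """Generates the target one character at a time, conditioned on the
    previously generated character and the encoder state."""
    def __init__(self, vocab_size, emb_dim=32, hid_dim=64):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, emb_dim)
        self.rnn = nn.GRU(emb_dim, hid_dim, batch_first=True)
        self.proj = nn.Linear(hid_dim, vocab_size)

    def step(self, prev_char_id, hidden):             # (B, 1), (1, B, H)
        emb = self.embed(prev_char_id)
        out, hidden = self.rnn(emb, hidden)
        logits = self.proj(out.squeeze(1))            # (B, V) next-character scores
        return logits, hidden

def background_log_prior(prefix, vocab):
    """Toy stand-in for the adapted background model: a log-prior over the next
    character given the prefix generated so far.  In the patent this role is
    played by a background model built from a word vocabulary and adapted to
    accept subsequences of the input; here it is simply uniform."""
    return torch.zeros(len(vocab))

def generate(encoder, decoder, src_ids, max_len=50, alpha=1.0):
    """Greedy decoding: at each time step the RNN's character distribution is
    combined with the background-model prior before the next character is chosen."""
    _, hidden = encoder(src_ids)
    prev = torch.tensor([[CHAR2ID["<bos>"]]])
    out_chars = []
    for _ in range(max_len):
        logits, hidden = decoder.step(prev, hidden)
        scores = F.log_softmax(logits, dim=-1) + alpha * background_log_prior(out_chars, VOCAB)
        next_id = scores.argmax(dim=-1, keepdim=True)
        if next_id.item() == CHAR2ID["<eos>"]:
            break
        out_chars.append(VOCAB[next_id.item()])
        prev = next_id
    return "".join(out_chars)

if __name__ == "__main__":
    enc = CharEncoder(len(VOCAB))
    dec = CharDecoder(len(VOCAB))
    # Made-up meaning-representation string, encoded character by character.
    src = torch.tensor([[CHAR2ID[c] for c in "name the phoenix eattype pub"]])
    print(generate(enc, dec, src))                    # untrained, so output is arbitrary
```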

Bibliographic Data

  • Publication No.: US10049106B2

    Patent Type:

  • Publication Date: 2018-08-14

    Original Format: PDF

  • Applicant/Assignee: XEROX CORPORATION

    Application No.: US201715408526

  • Inventors: RAGHAV GOYAL; MARC DYMETMAN

    Filing Date: 2017-01-18

  • Classification: G06F17/27; G06F17/28; G10L25/30

  • Country: US

  • Date Added: 2022-08-21 13:05:23

