
STRUCTURE-PRESERVING ATTENTION MECHANISM IN SEQUENCE-TO-SEQUENCE NEURAL MODELS


Abstract

In a trained attentive decoder of a trained sequence-to-sequence (seq2seq) artificial neural network (ANN): obtaining an encoded input vector sequence; generating, using a trained primary attention mechanism of the trained attentive decoder, a sequence of primary attention vectors; for each primary attention vector in the sequence: (a) generating a set of attention vector candidates corresponding to the respective primary attention vector, (b) evaluating, for each attention vector candidate in the set, a structure fit measure that quantifies the similarity of the respective candidate to a desired attention vector structure, and (c) generating, using a trained soft-selection ANN, a secondary attention vector based on said evaluation and on state variables of the trained attentive decoder; and generating, using the trained attentive decoder, an output sequence based on the encoded input vector sequence and the secondary attention vectors.
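The per-step pipeline in the abstract (primary attention → candidate set → structure-fit evaluation → soft selection → secondary attention) can be sketched in plain NumPy. Every concrete choice below is an illustrative assumption, not the patented implementation: the temperature-based candidate generator, the negative-entropy structure-fit measure (rewarding candidates close to a one-hot structure), and the single linear layer standing in for the trained soft-selection ANN are all hypothetical.

```python
import numpy as np

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def candidate_set(alpha, temps=(0.5, 1.0, 2.0)):
    """Hypothetical candidate generator: temperature-sharpened and
    temperature-flattened variants of the primary attention vector."""
    return np.stack([softmax(np.log(alpha + 1e-9) / t) for t in temps])

def structure_fit(candidates):
    """Hypothetical structure-fit measure: negative entropy, so candidates
    closer to a one-hot ("peaked") attention structure score higher."""
    return (candidates * np.log(candidates + 1e-9)).sum(axis=-1)

def secondary_attention(alpha, state, W):
    """One decoder step: build candidates from the primary attention vector
    `alpha`, score them, soft-select using the decoder state, and return
    the resulting secondary attention vector."""
    cands = candidate_set(alpha)          # (K, T) candidate attention vectors
    fit = structure_fit(cands)            # (K,) structure-fit scores
    # Linear stand-in for the trained soft-selection ANN: mix the fit
    # scores with a projection of the decoder state, then normalize.
    weights = softmax(fit + state @ W)    # (K,) soft-selection weights
    return weights @ cands                # (T,) secondary attention vector

# Usage: a 6-position input, a 4-dimensional decoder state, 3 candidates.
alpha = softmax(np.random.randn(6))       # primary attention vector
state = np.random.randn(4)                # decoder state variables
W = np.random.randn(4, 3)                 # soft-selection projection
beta = secondary_attention(alpha, state, W)
```

Because the selection weights and every candidate each sum to one, the secondary attention vector is itself a valid attention distribution, which is what lets it replace the primary vector when the decoder produces the output sequence.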
