STRUCTURE-PRESERVING ATTENTION MECHANISM IN SEQUENCE-TO-SEQUENCE NEURAL MODELS
Abstract
In a trained attentive decoder of a trained sequence-to-sequence (seq2seq) artificial neural network (ANN): an encoded input vector sequence is obtained, and a sequence of primary attention vectors is generated using the decoder's trained primary attention mechanism. For each primary attention vector in that sequence: (a) a set of attention vector candidates corresponding to the respective primary attention vector is generated; (b) for each attention vector candidate in the set, a structure fit measure is evaluated that quantifies the similarity of the respective candidate to a desired attention vector structure; and (c) a secondary attention vector is generated, using a trained soft-selection ANN, based on that evaluation and on state variables of the trained attentive decoder. The trained attentive decoder then generates an output sequence based on the encoded input vector sequence and the secondary attention vectors.
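The candidate-generation, structure-fit, and soft-selection steps (a)-(c) above can be sketched in NumPy. This is a minimal illustration, not the patented implementation: the choice of candidates (temperature-sharpened/smoothed variants of the primary vector), the cosine-similarity fit measure, the single linear layer standing in for the trained soft-selection ANN, and all names (`secondary_attention`, `structure_fit`, `W`) are assumptions for illustration.

```python
import numpy as np

def softmax(x):
    e = np.exp(x - x.max())
    return e / e.sum()

def structure_fit(candidate, desired):
    # Structure fit measure: cosine similarity between a candidate attention
    # vector and the desired attention structure (an illustrative choice).
    return float(candidate @ desired /
                 (np.linalg.norm(candidate) * np.linalg.norm(desired) + 1e-9))

def secondary_attention(primary, desired, decoder_state, W):
    # (a) Candidate set: the primary attention vector plus sharpened and
    #     smoothed variants (hypothetical candidate-generation scheme).
    candidates = [primary,
                  softmax(np.log(primary + 1e-9) * 2.0),   # sharpened
                  softmax(np.log(primary + 1e-9) * 0.5)]   # smoothed
    # (b) Evaluate the structure fit measure for each candidate.
    fits = np.array([structure_fit(c, desired) for c in candidates])
    # (c) Soft selection: a linear layer over the fit scores and the decoder
    #     state variables produces one mixing weight per candidate; the
    #     secondary attention vector is the resulting convex combination.
    logits = W @ np.concatenate([fits, decoder_state])
    weights = softmax(logits)
    return sum(w * c for w, c in zip(weights, candidates))

# Example usage (all shapes and values are illustrative):
rng = np.random.default_rng(0)
primary = softmax(rng.normal(size=6))      # primary attention over 6 positions
desired = np.eye(6)[2]                     # desired structure: peak at position 2
state = rng.normal(size=4)                 # decoder state variables
W = rng.normal(size=(3, 3 + 4))            # 3 candidates, 3 fits + 4 state dims
sec = secondary_attention(primary, desired, state, W)
```

Because each candidate is itself a probability distribution and the soft-selection weights sum to one, the resulting secondary attention vector is again a valid attention distribution.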