
Multi-task learning as a question answering


Abstract

The multi-task learning approach framed as question answering includes: an input layer for encoding a context and a question; a self-attention-based transformer that includes an encoder and a decoder; a first bidirectional long short-term memory (biLSTM) for further encoding the encoder output; a long short-term memory (LSTM) for generating a context-adjusted hidden state from the encoder output and a hidden state; an attention network for generating attention weights based on the first biLSTM output and the LSTM output; a vocabulary layer that generates a vocabulary distribution; a context layer that generates a context distribution; and a switch that generates weights between the distributions, generates a composite distribution based on those weights, and uses the composite distribution to select answer words.
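The final clause of the abstract describes a pointer-generator-style output head: the switch weighs a vocabulary distribution against a distribution over context tokens and selects answer words from the composite. The sketch below is a minimal PyTorch illustration of that mixing step only, not the patented implementation; the class name `OutputSwitch`, the sigmoid gate, and all tensor shapes are assumptions for illustration.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F


class OutputSwitch(nn.Module):
    """Mixes a vocabulary distribution with a copy distribution over context tokens.

    A scalar gate gamma in [0, 1], predicted from the decoder state, forms the
    composite distribution  p(w) = gamma * p_vocab(w) + (1 - gamma) * p_copy(w).
    """

    def __init__(self, hidden_size: int, vocab_size: int):
        super().__init__()
        self.vocab_proj = nn.Linear(hidden_size, vocab_size)  # "vocabulary layer"
        self.gate = nn.Linear(hidden_size, 1)                  # "switch"

    def forward(self, decoder_state, attn_weights, context_token_ids):
        # decoder_state:     (batch, hidden)   context-adjusted hidden state
        # attn_weights:      (batch, ctx_len)  attention weights over context tokens
        # context_token_ids: (batch, ctx_len)  vocabulary ids of the context tokens
        p_vocab = F.softmax(self.vocab_proj(decoder_state), dim=-1)

        # Scatter the attention weights onto their vocabulary ids to obtain the
        # context (copy) distribution over the same vocabulary axis.
        p_copy = torch.zeros_like(p_vocab)
        p_copy.scatter_add_(1, context_token_ids, attn_weights)

        gamma = torch.sigmoid(self.gate(decoder_state))        # (batch, 1)
        p_final = gamma * p_vocab + (1.0 - gamma) * p_copy

        # The answer word at this step is the highest-probability entry.
        return p_final.argmax(dim=-1), p_final


# Toy usage: batch of 2, hidden size 8, vocabulary of 50, context length 5.
switch = OutputSwitch(hidden_size=8, vocab_size=50)
state = torch.randn(2, 8)
attn = F.softmax(torch.randn(2, 5), dim=-1)
ctx_ids = torch.randint(0, 50, (2, 5))
answer_ids, dist = switch(state, attn, ctx_ids)
```

In this sketch the gate plays the role of the claimed switch: it produces the weight between the two distribution states, and the weighted sum is the composite distribution from which the answer word is chosen.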

