Workshop on NLP for Conversational AI

CopyBERT: A Unified Approach to Question Generation with Self-Attention


Abstract

Contextualized word embeddings provide better initialization for neural networks that deal with various natural language understanding (NLU) tasks, including question answering (QA) and, more recently, question generation (QG). Apart from providing meaningful word representations, pre-trained transformer models such as BERT also provide self-attentions, which encode syntactic information that can be probed for dependency parsing and POS tagging. In this paper, we show that the information from the self-attentions of BERT is useful for language modeling of questions conditioned on paragraph and answer phrases. To control the attention span, we use a semi-diagonal mask and utilize a shared model for encoding and decoding, unlike sequence-to-sequence models. We further employ a copy mechanism over self-attentions to achieve state-of-the-art results for question generation on the SQuAD dataset.
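As a rough illustration of the semi-diagonal mask mentioned in the abstract, the sketch below builds a UniLM-style attention mask in which paragraph and answer tokens attend to each other bidirectionally, while question tokens attend to the full context plus a causal prefix of the question. The helper name and tensor layout are assumptions for illustration, not the authors' released implementation.

```python
# Minimal sketch of a "semi-diagonal" attention mask for unified
# encode/decode with a single BERT model (assumed layout: context tokens
# first, then question tokens). 1 = attention allowed, 0 = blocked.
import torch

def build_semi_diagonal_mask(context_len: int, question_len: int) -> torch.Tensor:
    total = context_len + question_len
    mask = torch.zeros(total, total)
    # Context block: full bidirectional attention among paragraph + answer tokens.
    mask[:context_len, :context_len] = 1
    # Question rows: each question token can see the whole context...
    mask[context_len:, :context_len] = 1
    # ...and only preceding question tokens (lower-triangular block),
    # which gives the mask its semi-diagonal shape.
    mask[context_len:, context_len:] = torch.tril(torch.ones(question_len, question_len))
    return mask

if __name__ == "__main__":
    print(build_semi_diagonal_mask(context_len=4, question_len=3).int())
```

The same mask is what lets one shared transformer act as both encoder (over the paragraph and answer) and autoregressive decoder (over the question) without a separate sequence-to-sequence architecture.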
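The copy mechanism over self-attentions can be pictured as a pointer-generator-style mixture, sketched below under the assumption that attention weights over the source tokens serve as the copy distribution; the gate `p_gen`, the function name, and the signature are illustrative rather than the paper's exact formulation.

```python
# Hedged sketch: mix a vocabulary softmax with a copy distribution obtained
# by scattering attention mass onto the vocabulary ids of the source tokens.
import torch
import torch.nn.functional as F

def copy_distribution(vocab_logits, attn_weights, src_token_ids, p_gen):
    """
    vocab_logits:  (batch, vocab_size)  decoder logits over the vocabulary
    attn_weights:  (batch, src_len)     attention over source (context) tokens
    src_token_ids: (batch, src_len)     vocabulary ids of the source tokens
    p_gen:         (batch, 1)           probability of generating vs. copying
    """
    p_vocab = F.softmax(vocab_logits, dim=-1) * p_gen
    # Copying a source word adds its attention mass to that word's vocab slot.
    p_copy = torch.zeros_like(p_vocab).scatter_add_(
        1, src_token_ids, attn_weights * (1.0 - p_gen)
    )
    return p_vocab + p_copy

if __name__ == "__main__":
    B, S, V = 2, 5, 30522  # batch, source length, BERT vocab size
    dist = copy_distribution(
        vocab_logits=torch.randn(B, V),
        attn_weights=F.softmax(torch.randn(B, S), dim=-1),
        src_token_ids=torch.randint(0, V, (B, S)),
        p_gen=torch.sigmoid(torch.randn(B, 1)),
    )
    print(dist.sum(dim=-1))  # each row sums to ~1.0
```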
