From Balustrades to Pierre Vinken: Looking for Syntax in Transformer Self-Attentions

Abstract

We inspect the multi-head self-attention in Transformer NMT encoders for three source languages, looking for patterns that could have a syntactic interpretation. In many of the attention heads, we frequently find sequences of consecutive states attending to the same position, which resemble syntactic phrases. We propose a transparent deterministic method of quantifying the amount of syntactic information present in the self-attentions, based on automatically building and evaluating phrase-structure trees from the phrase-like sequences. We compare the resulting trees to existing constituency treebanks, both manually and by computing precision and recall.
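The following is a minimal sketch, not the authors' code, of the idea described in the abstract: find maximal runs of consecutive tokens whose strongest attention targets the same position (the "phrase-like sequences"), and score the resulting spans against gold constituent brackets with precision and recall. It assumes `attn` is a single head's encoder self-attention matrix of shape (n, n), where row i is token i's distribution over attended positions; the names `phrase_candidates` and `bracket_prf` are illustrative, not from the paper.

```python
import numpy as np

def phrase_candidates(attn, min_len=2):
    """Return maximal runs of consecutive tokens whose argmax attention
    points to the same position -- the phrase-like sequences."""
    heads = attn.argmax(axis=1)  # position each token attends to most strongly
    spans, start = [], 0
    for i in range(1, len(heads) + 1):
        if i == len(heads) or heads[i] != heads[start]:
            if i - start >= min_len:
                spans.append((start, i - 1))  # inclusive token span
            start = i
    return spans

def bracket_prf(predicted, gold):
    """Unlabelled bracketing precision, recall, and F1 against gold spans."""
    pred, gold = set(predicted), set(gold)
    if not pred or not gold:
        return 0.0, 0.0, 0.0
    tp = len(pred & gold)
    p, r = tp / len(pred), tp / len(gold)
    f = 2 * p * r / (p + r) if p + r else 0.0
    return p, r, f

# Toy example: tokens 1-2 and 3-4 each attend mostly to a shared position.
attn = np.array([
    [0.60, 0.10, 0.10, 0.10, 0.10],
    [0.10, 0.10, 0.70, 0.05, 0.05],
    [0.10, 0.10, 0.70, 0.05, 0.05],
    [0.05, 0.05, 0.10, 0.10, 0.70],
    [0.05, 0.05, 0.10, 0.10, 0.70],
])
spans = phrase_candidates(attn)
print(spans)                                 # [(1, 2), (3, 4)]
print(bracket_prf(spans, [(1, 2), (3, 4)]))  # (1.0, 1.0, 1.0)
```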