Conference on Empirical Methods in Natural Language Processing

Learning What's Easy: Fully Differentiable Neural Easy-First Taggers


Abstract

We introduce a novel neural easy-first decoder that learns to solve sequence tagging tasks in a flexible order. In contrast to previous easy-first decoders, our models are end-to-end differentiable. The decoder iteratively updates a "sketch" of the predictions over the sequence. At its core is an attention mechanism that controls which parts of the input are strategically the best to process next. We present a new constrained softmax transformation that ensures the same cumulative attention to every word, and show how to efficiently evaluate and backpropagate over it. Our models compare favourably to BILSTM taggers on three sequence tagging tasks.
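The constrained softmax mentioned in the abstract maps a score vector to a probability distribution while respecting per-word upper bounds (each word's remaining attention budget), so that cumulative attention over decoding steps can be equalized. The sketch below is an illustrative iterative-clipping implementation under that interpretation, not the paper's exact evaluation or backpropagation scheme; the function name and the feasibility assumption (the bounds sum to at least 1) are ours.

```python
import numpy as np

def constrained_softmax(z, u):
    """Softmax of scores z subject to p_i <= u_i with sum(p) == 1.

    Illustrative sketch: assumes sum(u) >= 1 so a feasible distribution
    exists. Repeatedly clips coordinates that exceed their bound and
    renormalizes the remaining probability mass over the free set.
    """
    z = np.asarray(z, dtype=float)
    u = np.asarray(u, dtype=float)
    p = np.zeros_like(z)
    free = np.ones_like(z, dtype=bool)   # coordinates not yet clipped
    mass = 1.0                           # probability mass left to assign
    while True:
        # Softmax over the free coordinates only (shifted for stability).
        e = np.where(free, np.exp(z - z[free].max()), 0.0)
        p_free = mass * e / e.sum()
        over = free & (p_free > u)       # coordinates violating their bound
        if not over.any():
            p[free] = p_free[free]
            return p
        p[over] = u[over]                # clip violators at their bounds
        mass -= u[over].sum()            # budget left for the rest
        free &= ~over
```

Each iteration clips at least one coordinate, so the loop terminates in at most n passes. For example, `constrained_softmax([2.0, 1.0, 0.5], [0.4, 0.5, 0.5])` caps the first word's attention at its 0.4 budget and redistributes the remainder over the other two words.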
