International Joint Conference on Natural Language Processing

Capturing Long-range Contextual Dependencies with Memory-enhanced Conditional Random Fields

Abstract

Despite successful applications across a broad range of NLP tasks, conditional random fields ("CRFs"), in particular the linear-chain variant, are only able to model local features. While this has important benefits in terms of inference tractability, it limits the ability of the model to capture long-range dependencies between items. Attempts to extend CRFs to capture long-range dependencies have largely come at the cost of computational complexity and approximate inference. In this work, we propose an extension to CRFs by integrating external memory, taking inspiration from memory networks, thereby allowing CRFs to incorporate information far beyond neighbouring steps. Experiments across two tasks show substantial improvements over strong CRF and LSTM baselines.
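
The abstract gives only the high-level idea, so as a rough illustration (not the authors' implementation), the PyTorch sketch below shows one way a memory read can feed a linear-chain CRF: each step performs a soft dot-product attention over all encoder states, and the attended read is concatenated with the local state before emission scores are computed, leaving exact linear-chain inference untouched. The class name, the dot-product attention, and the single-sentence tensor shapes are assumptions made for this example.

import torch
import torch.nn as nn

class MemoryEnhancedCRF(nn.Module):
    # Hypothetical sketch: a linear-chain CRF whose emission scores
    # are augmented with a memory-network-style attention read.
    def __init__(self, hidden_dim, num_tags):
        super().__init__()
        # Emission scorer sees the local state plus the memory read.
        self.emissions = nn.Linear(2 * hidden_dim, num_tags)
        # transitions[i, j]: score of moving from tag i to tag j.
        self.transitions = nn.Parameter(torch.zeros(num_tags, num_tags))

    def scores(self, h):
        # h: (seq_len, hidden_dim) states from any sentence encoder.
        # Memory read: every step attends over all steps, so emissions
        # can depend on context far beyond the neighbouring positions.
        attn = torch.softmax(h @ h.t() / h.size(-1) ** 0.5, dim=-1)
        read = attn @ h                          # (seq_len, hidden_dim)
        return self.emissions(torch.cat([h, read], dim=-1))

    def log_partition(self, scores):
        # Standard linear-chain forward algorithm in log space;
        # exact inference is unchanged by the memory read.
        alpha = scores[0]
        for t in range(1, scores.size(0)):
            alpha = scores[t] + torch.logsumexp(
                alpha.unsqueeze(1) + self.transitions, dim=0)
        return torch.logsumexp(alpha, dim=0)

    def nll(self, h, tags):
        # Negative log-likelihood of a gold tag sequence.
        s = self.scores(h)
        gold = s[torch.arange(len(tags)), tags].sum()
        gold = gold + self.transitions[tags[:-1], tags[1:]].sum()
        return self.log_partition(s) - gold

# Toy usage: 7 tokens, 16-dim encoder states, 3 tags.
h = torch.randn(7, 16)
tags = torch.tensor([0, 1, 1, 2, 0, 1, 2])
loss = MemoryEnhancedCRF(16, num_tags=3).nll(h, tags)
loss.backward()

Decoding would replace the log-partition recursion with a Viterbi max over the same memory-augmented scores; the point is that the emission at each step can draw on arbitrarily distant positions while the chain structure, and hence exact inference, stays intact.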