Annual Meeting of the Association for Computational Linguistics

Minimized models and grammar-informed initialization for supertagging with highly ambiguous lexicons



Abstract

We combine two complementary ideas for learning supertaggers from highly ambiguous lexicons: grammar-informed tag transitions and models minimized via integer programming. Each strategy on its own greatly improves performance over basic expectation-maximization training with a bitag Hidden Markov Model, which we show on the CCGbank and CCG-TUT corpora. The strategies provide further error reductions when combined. We describe a new two-stage integer programming strategy that efficiently deals with the high degree of ambiguity on these datasets while obtaining the full effect of model minimization.
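To make the baseline concrete, the following is a minimal sketch (not the authors' code) of expectation-maximization training for a bitag HMM supertagger in which each word may only take tags licensed by a highly ambiguous lexicon, so forward-backward sums run over the licensed tags only. The function name em_hmm, the toy corpus, and the toy supertags are illustrative assumptions; grammar-informed transition initialization and the integer-programming minimization stages described in the abstract are not implemented here.

```python
# Minimal sketch: lexicon-constrained EM for a bitag HMM supertagger (illustrative only).
import numpy as np

def em_hmm(corpus, lexicon, tags, iters=10, seed=0):
    rng = np.random.default_rng(seed)
    T = {t: i for i, t in enumerate(tags)}
    words = sorted({w for sent in corpus for w in sent})
    W = {w: i for i, w in enumerate(words)}
    nT, nW = len(tags), len(words)

    # Random initialization; grammar-informed initialization would bias these instead.
    trans = rng.random((nT, nT)); trans /= trans.sum(axis=1, keepdims=True)
    emit = rng.random((nT, nW));  emit /= emit.sum(axis=1, keepdims=True)
    init = np.full(nT, 1.0 / nT)

    for _ in range(iters):
        tc = np.zeros((nT, nT)); ec = np.zeros((nT, nW)); ic = np.zeros(nT)
        for sent in corpus:
            allowed = [[T[t] for t in lexicon[w]] for w in sent]
            n = len(sent)
            # Forward pass restricted to lexicon-licensed tags.
            alpha = np.zeros((n, nT))
            for t in allowed[0]:
                alpha[0, t] = init[t] * emit[t, W[sent[0]]]
            for i in range(1, n):
                for t in allowed[i]:
                    alpha[i, t] = emit[t, W[sent[i]]] * sum(
                        alpha[i - 1, s] * trans[s, t] for s in allowed[i - 1])
            # Backward pass.
            beta = np.zeros((n, nT))
            for t in allowed[-1]:
                beta[-1, t] = 1.0
            for i in range(n - 2, -1, -1):
                for s in allowed[i]:
                    beta[i, s] = sum(trans[s, t] * emit[t, W[sent[i + 1]]] * beta[i + 1, t]
                                     for t in allowed[i + 1])
            Z = alpha[-1].sum()
            if Z == 0:
                continue
            # E-step: accumulate expected tag and tag-bigram counts.
            for i in range(n):
                for t in allowed[i]:
                    g = alpha[i, t] * beta[i, t] / Z
                    ec[t, W[sent[i]]] += g
                    if i == 0:
                        ic[t] += g
            for i in range(n - 1):
                for s in allowed[i]:
                    for t in allowed[i + 1]:
                        tc[s, t] += (alpha[i, s] * trans[s, t] *
                                     emit[t, W[sent[i + 1]]] * beta[i + 1, t] / Z)
        # M-step: renormalize with light smoothing.
        trans = tc + 1e-6; trans /= trans.sum(axis=1, keepdims=True)
        emit = ec + 1e-6;  emit /= emit.sum(axis=1, keepdims=True)
        init = ic + 1e-6;  init /= init.sum()
    return init, trans, emit

# Toy usage (illustrative CCG-style supertags, not drawn from CCGbank or CCG-TUT).
lexicon = {"the": ["NP/N"], "dog": ["N", "NP"], "runs": ["S\\NP", "N"]}
corpus = [["the", "dog", "runs"], ["the", "dog"]]
init, trans, emit = em_hmm(corpus, lexicon, ["NP/N", "N", "NP", "S\\NP"], iters=5)
```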
