Conference on Empirical Methods in Natural Language Processing (EMNLP 2011)

Evaluating Dependency Parsing: Robust and Heuristics-Free Cross-Annotation Evaluation



Abstract

Methods for evaluating dependency parsing using attachment scores are highly sensitive to representational variation between dependency treebanks, making cross-experimental evaluation opaque. This paper develops a robust procedure for cross-experimental evaluation, based on deterministic unification-based operations for harmonizing different representations and a refined notion of tree edit distance for evaluating parse hypotheses relative to multiple gold standards. We demonstrate that, for different conversions of the Penn Treebank into dependencies, performance trends that are observed for parsing results in isolation change or dissolve completely when parse hypotheses are normalized and brought into the same common ground.
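The sensitivity of attachment scores to representational variation can be made concrete with a toy example (not taken from the paper; the sentence, head indices, and function name are illustrative). Two common dependency conversions encode coordination differently, so the same parse hypothesis receives very different scores depending on which gold standard it is compared against:

```python
def attachment_score(gold_heads, pred_heads):
    """Unlabeled attachment score: the fraction of tokens whose
    predicted head index matches the gold head index."""
    assert len(gold_heads) == len(pred_heads)
    correct = sum(g == p for g, p in zip(gold_heads, pred_heads))
    return correct / len(gold_heads)

# "eat apples and oranges": heads are 1-based token indices, 0 = root.
# Scheme A attaches "and" and the second conjunct to the first conjunct;
# Scheme B (Prague-style) makes the conjunction "and" the head of both conjuncts.
gold_a = [0, 1, 2, 2]  # eat<-root, apples<-eat, and<-apples, oranges<-apples
gold_b = [0, 3, 1, 3]  # eat<-root, apples<-and, and<-eat, oranges<-and

parse = [0, 1, 2, 2]   # a hypothesis that happens to follow Scheme A

print(attachment_score(parse, gold_a))  # 1.0
print(attachment_score(parse, gold_b))  # 0.25
```

The parse is "perfect" under one conversion and mostly "wrong" under the other, even though both gold trees describe the same sentence. Harmonizing such representations before scoring, as the paper proposes, removes this artifact.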


