Conference of the European Chapter of the Association for Computational Linguistics

A Little Pretraining Goes a Long Way: A Case Study on Dependency Parsing Task for Low-resource Morphologically Rich Languages



Abstract

Neural dependency parsing has achieved remarkable performance for many domains and languages. However, the need for massive amounts of labeled data limits the effectiveness of these approaches for low-resource languages. In this work, we focus on dependency parsing for morphologically rich languages (MRLs) in a low-resource setting. Although morphological information is essential for the dependency parsing task, morphological disambiguation and the lack of powerful analyzers make this information difficult to obtain for MRLs. To address these challenges, we propose simple auxiliary tasks for pretraining. We perform experiments on 10 MRLs in low-resource settings to measure the efficacy of our proposed pretraining method and observe an average absolute gain of 2 points (UAS) and 3.6 points (LAS).


