
Jabberwocky Parsing: Dependency Parsing with Lexical Noise



[Abstract] Parsing models have long benefited from the use of lexical information, and indeed current state-of-the-art neural network models for dependency parsing achieve substantial improvements by benefiting from distributed representations of lexical information. At the same time, humans can easily parse sentences with unknown or even novel words, as in Lewis Carroll's poem Jabberwocky. In this paper, we carry out jabberwocky parsing experiments, exploring how robust a state-of-the-art neural network parser is to the absence of lexical information. We find that current parsing models, at least under usual training regimens, are in fact overly dependent on lexical information, and perform badly in the jabberwocky context. We also demonstrate that the technique of word dropout drastically improves parsing robustness in this setting and leads to significant improvements in out-of-domain parsing.
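The word dropout mentioned in the abstract can be illustrated with a minimal sketch. This assumes a common frequency-dependent variant, where a token is replaced by an unknown-word symbol with probability alpha / (alpha + count(w)), so rare words are dropped more often; the function and parameter names here are illustrative, and the paper's exact formulation may differ.

```python
import random
from collections import Counter

def word_dropout(tokens, counts, alpha=0.25, unk="<unk>", rng=None):
    """Replace tokens with an <unk> symbol at training time.

    Frequency-dependent drop probability (a common variant):
    p(drop w) = alpha / (alpha + count(w)), so rare words are
    masked more aggressively, forcing the parser to rely less
    on lexical identity.
    """
    rng = rng or random.Random()
    return [unk if rng.random() < alpha / (alpha + counts[t]) else t
            for t in tokens]

# Toy usage: counts would normally come from the training corpus.
corpus = "the vorpal blade went snicker snack the blade".split()
counts = Counter(corpus)
print(word_dropout(corpus, counts, alpha=1.0, rng=random.Random(0)))
```

With alpha=0 no token is ever dropped; larger alpha masks more of the input, pushing the model toward context-based (jabberwocky-style) parsing.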

[Authors] Jungo Kasai; Robert Frank

[Affiliations] University of Washington; Yale University

[Year (Volume), Issue] 2019

[Pages] 113-123

[Total pages] 11

[Language] English (eng)

[CLC Classification]

[Keywords]