Annual Meeting of the Association for Computational Linguistics

Is 'Universal Syntax' Universally Useful for Learning Distributed Word Representations?



Abstract

Recent comparative studies have demonstrated the usefulness of dependency-based contexts (DEPS) for learning distributed word representations for similarity tasks. In English, DEPS tend to perform better than the more common, less informed bag-of-words contexts (BOW). In this paper, we present the first cross-linguistic comparison of different context types for three different languages. DEPS are extracted from "universal parses" without any language-specific optimization. Our results suggest that the universal DEPS (UDEPS) are useful for detecting functional similarity (e.g., verb similarity, solving syntactic analogies) among languages, but their advantage over BOW is not as prominent as previously reported on English. We also show that simple "post-parsing" filtering of useful UDEPS contexts leads to consistent improvements across languages.
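Below is a minimal sketch of how dependency-based contexts (DEPS) can be read off a universal dependency parse, together with a simple "post-parsing" relation filter of the kind the abstract mentions. The example sentence, the Token structure, and the KEEP_RELATIONS whitelist are illustrative assumptions, not the authors' exact extraction pipeline.

```python
# Sketch: extracting dependency-based contexts (DEPS) from a Universal
# Dependencies parse. Each dependent contributes its head (marked with the
# relation) as a context, and each head gets the dependent back with the
# inverse relation. The relation whitelist is an assumed example of
# "post-parsing" filtering.
from dataclasses import dataclass

@dataclass
class Token:
    idx: int      # 1-based position in the sentence
    form: str     # surface word form
    head: int     # index of the syntactic head (0 = root)
    deprel: str   # universal dependency relation label

# Hypothetical UD parse of "australian scientist discovers star"
sentence = [
    Token(1, "australian", 2, "amod"),
    Token(2, "scientist", 3, "nsubj"),
    Token(3, "discovers", 0, "root"),
    Token(4, "star", 3, "obj"),
]

# Assumed whitelist of content-bearing relations used for filtering.
KEEP_RELATIONS = {"nsubj", "obj", "amod", "nmod"}

def extract_deps_contexts(tokens, keep=None):
    """Yield (word, context) pairs from a parsed sentence."""
    by_idx = {t.idx: t for t in tokens}
    for tok in tokens:
        if tok.head == 0:
            continue  # skip the root attachment
        if keep is not None and tok.deprel not in keep:
            continue  # post-parsing filtering of contexts
        head = by_idx[tok.head]
        yield tok.form, f"{head.form}/{tok.deprel}"    # dependent -> head
        yield head.form, f"{tok.form}/{tok.deprel}-1"  # head -> dependent (inverse)

for word, ctx in extract_deps_contexts(sentence, keep=KEEP_RELATIONS):
    print(word, ctx)
```

The resulting (word, context) pairs would feed a skip-gram-style model in place of the positional bag-of-words (BOW) contexts the abstract compares against.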
