Annual Meeting of the Association for Computational Linguistics

Is 'Universal Syntax' Universally Useful for Learning Distributed Word Representations?


Abstract

Recent comparative studies have demonstrated the usefulness of dependency-based contexts (DEPS) for learning distributed word representations for similarity tasks. In English, DEPS tend to perform better than the more common, less informed bag-of-words contexts (BOW). In this paper, we present the first cross-linguistic comparison of different context types for three different languages. DEPS are extracted from "universal parses" without any language-specific optimization. Our results suggest that the universal DEPS (UDEPS) are useful for detecting functional similarity (e.g., verb similarity, solving syntactic analogies) among languages, but their advantage over BOW is not as prominent as previously reported on English. We also show that simple "post-parsing" filtering of useful UDEPS contexts leads to consistent improvements across languages.
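To make the comparison in the abstract concrete, the sketch below contrasts the two context types being evaluated: bag-of-words (BOW) window contexts versus dependency-based contexts (DEPS/UDEPS) read off a universal parse, plus a simple relation-based "post-parsing" filter. This is a minimal illustration, not the authors' code: the token format, function names, and the particular relation set used as a filter are assumptions for the example only.

```python
# Minimal sketch (illustrative, not from the paper): extracting BOW vs. UDEPS
# (word, context) pairs from a sentence with a universal dependency parse.

from typing import Iterator, List, NamedTuple, Optional, Set, Tuple


class Token(NamedTuple):
    idx: int        # 1-based position in the sentence
    form: str       # surface word
    head: int       # index of the syntactic head (0 = root)
    deprel: str     # universal dependency relation label


def bow_contexts(sent: List[Token], window: int = 2) -> Iterator[Tuple[str, str]]:
    """(word, context) pairs from a symmetric word window (BOW)."""
    for i, tok in enumerate(sent):
        lo, hi = max(0, i - window), min(len(sent), i + window + 1)
        for j in range(lo, hi):
            if j != i:
                yield tok.form, sent[j].form


def udeps_contexts(sent: List[Token],
                   keep: Optional[Set[str]] = None) -> Iterator[Tuple[str, str]]:
    """(word, context) pairs from head/dependent arcs of a universal parse.

    Contexts are typed with the relation label, in the usual dependency-context
    style. `keep` stands in for a "post-parsing" filter: if given, only arcs
    with these relations are used (the actual filter in the paper may differ).
    """
    by_idx = {t.idx: t for t in sent}
    for tok in sent:
        if tok.head == 0:
            continue
        if keep is not None and tok.deprel not in keep:
            continue
        head = by_idx[tok.head]
        # the dependent sees its head; the head sees its dependent (inverse-marked)
        yield tok.form, f"{head.form}/{tok.deprel}"
        yield head.form, f"{tok.form}/{tok.deprel}-1"


if __name__ == "__main__":
    # Toy sentence with a hand-written universal parse.
    sent = [
        Token(1, "australian", 2, "amod"),
        Token(2, "scientist", 3, "nsubj"),
        Token(3, "discovers", 0, "root"),
        Token(4, "star", 3, "obj"),
    ]
    print(list(bow_contexts(sent)))
    print(list(udeps_contexts(sent)))
    # Hypothetical filter keeping only a few content-bearing relations.
    print(list(udeps_contexts(sent, keep={"nsubj", "obj", "amod"})))
```

The (word, context) pairs produced this way would then be fed to a standard embedding trainer; the point of the sketch is only to show how UDEPS contexts carry relation-typed syntactic information that plain BOW windows do not.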
