In: International Conference on Asian Language Processing

Learning word embeddings from dependency relations



Abstract

Continuous-space word representation has demonstrated its effectiveness in many natural language processing (NLP) tasks. The basic idea of embedding training is to update the embedding matrix based on each word's context. However, such context has conventionally been constrained to a fixed window of surrounding words, which we believe is not sufficient to represent the actual relations of a given center word. In this work we extend the previous approach by learning distributed representations from the dependency structure of a sentence, which can capture long-distance relations. Such context learns better semantics for words, as demonstrated on the Semantic-Syntactic Word Relationship task. In addition, a competitive result is also achieved by the dependency embeddings on the WordSim-353 task.
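The idea of replacing a fixed surrounding-word window with dependency-based contexts can be sketched as follows. This is a minimal illustration, not the authors' exact scheme: the parse triples are hand-written for demonstration (a real pipeline would obtain them from a dependency parser), and the function name and inverse-relation marking are assumptions.

```python
# Sketch: build (center word, context) training pairs from a dependency
# parse instead of a fixed window. Each word's contexts are its syntactic
# neighbors labeled with the relation, so words linked by long-distance
# dependencies become direct contexts of one another.

def dependency_contexts(triples):
    """Given (head, relation, dependent) triples, yield (word, context)
    pairs in both directions; the inverse direction is marked with ^-1."""
    pairs = []
    for head, rel, dep in triples:
        pairs.append((head, f"{rel}_{dep}"))     # head sees its dependent
        pairs.append((dep, f"{rel}^-1_{head}"))  # dependent sees its head
    return pairs

# Hand-written parse of "Australian scientist discovers star" (illustrative).
parse = [
    ("discovers", "nsubj", "scientist"),
    ("discovers", "dobj", "star"),
    ("scientist", "amod", "Australian"),
]

pairs = dependency_contexts(parse)
```

These (word, context) pairs can then feed a standard skip-gram-style objective, with the embedding matrix updated against the relation-labeled contexts rather than positional neighbors.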