Published in: International Conference on Asian Language Processing

Learning word embeddings from dependency relations



Abstract

Continuous-space word representations have demonstrated their effectiveness in many natural language processing (NLP) tasks. The basic idea of embedding training is to update the embedding matrix based on each word's context. However, such context has conventionally been constrained to a fixed window of surrounding words, which we believe is insufficient to represent the actual relations of a given center word. In this work we extend previous approaches by learning distributed representations from the dependency structure of a sentence, which can capture long-distance relations. Such contexts learn better word semantics, as demonstrated on the Semantic-Syntactic Word Relationship task. In addition, the dependency embeddings achieve a competitive result on the WordSim-353 task.
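To make the idea concrete, here is a minimal sketch of how dependency-based contexts can replace a fixed surface window when generating (word, context) training pairs. The arcs below are hand-written for illustration (the paper does not specify this exact pairing scheme or parser); in practice a dependency parser would supply them, and the pairs would feed a standard embedding trainer.

```python
# Sketch: turning a sentence's dependency arcs into (word, context)
# training pairs, so each word is trained against its syntactic
# neighbors rather than a fixed window of surrounding words.
# The pairing scheme and relation labels here are illustrative
# assumptions, not the paper's exact formulation.

def dependency_contexts(arcs):
    """Given dependency arcs as (head, relation, dependent) triples,
    yield (word, context) pairs in both directions: the head sees the
    dependent through the relation, and the dependent sees the head
    through the inverse relation."""
    pairs = []
    for head, rel, dep in arcs:
        pairs.append((head, f"{rel}_{dep}"))      # head -> dependent context
        pairs.append((dep, f"{rel}^-1_{head}"))   # dependent -> head context
    return pairs

# Simplified parse of "scientist discovers star with telescope".
# Note that "telescope" becomes a direct context of "discovers"
# even though it is several tokens away on the surface.
arcs = [
    ("discovers", "nsubj", "scientist"),
    ("discovers", "dobj", "star"),
    ("discovers", "prep_with", "telescope"),
]
print(dependency_contexts(arcs))
```

Each arc yields two pairs, so long-distance syntactic neighbors enter the context set directly, which is the advantage over a fixed surrounding-word window described above.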
