
A Knowledge Resources Based Neural Network for Learning Word and Relation Representations


Abstract

Using neural networks to train high-quality distributed representations of words and multi-relational data has attracted great attention in recent years. Mapping words and their relations into low-dimensional continuous vector spaces has proved useful in natural language processing and information extraction tasks. In this paper, we present a neural network based model that trains word embeddings and relation embeddings jointly, taking into account both unlabeled text data and knowledge resources. In particular, we use both contexts and definitions of words as neural network inputs to train word embeddings. Based on the word embeddings, we train relation embeddings by defining a suitable projection operation between words. Experiments on tasks such as word similarity and link prediction show that the proposed method achieves high-quality word and relation representations.
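The abstract does not specify the paper's exact projection operation between word embeddings. As an illustration only, the sketch below assumes a translation-style projection (as in TransE-like models), where a relation embedding r maps a head word embedding h near its tail word embedding t, so that a well-fit triple scores lower (smaller distance) than a random one; all vectors and names here are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)
dim = 50  # embedding dimensionality (illustrative choice)

# Toy pre-trained word embeddings for a (head, tail) word pair.
head = rng.normal(size=dim)
relation = rng.normal(size=dim)
# Construct a tail that nearly satisfies the assumed projection h + r ≈ t.
tail = head + relation + rng.normal(scale=0.01, size=dim)

def score(h, r, t):
    """Distance-based score: lower means the relation embedding
    better projects the head word onto the tail word."""
    return float(np.linalg.norm(h + r - t))

good = score(head, relation, tail)            # well-fit triple
bad = score(head, relation, rng.normal(size=dim))  # random tail word
```

In such models, training pushes `good`-style scores down and `bad`-style scores up via a margin loss over corrupted triples; the link-prediction experiments mentioned in the abstract typically rank candidate tails by exactly this kind of score.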

