IEEE/WIC/ACM International Conference on Web Intelligence

Neural Word Representations from Large-Scale Commonsense Knowledge



Abstract

There has recently been a surge of research on neural network-inspired algorithms to produce numerical vector representations of words, based on contextual information. In this paper, we present an approach to improve such word embeddings by first mining cognitively salient word relationships from text and then using stochastic gradient descent to jointly optimize the embeddings to reflect this information, in addition to the regular contextual information captured by the word2vec CBOW objective. Our findings show that this new training regime leads to vectors that better reflect commonsense information about words.
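The abstract only sketches the joint training regime. Below is a minimal toy sketch of what such a joint objective could look like, assuming a CBOW-style negative-sampling update interleaved with a simple attraction term over mined word pairs. The vocabulary, relation pairs, learning rates, and the exact form of the relational update are illustrative assumptions, not the paper's actual formulation.

```python
import numpy as np

# Hypothetical toy setup; the real system mines relation pairs from text.
vocab = ["dog", "animal", "bark", "cat", "meow", "the"]
word2id = {w: i for i, w in enumerate(vocab)}
dim = 16
rng = np.random.default_rng(0)
W = rng.normal(scale=0.1, size=(len(vocab), dim))   # input (context) vectors
C = rng.normal(scale=0.1, size=(len(vocab), dim))   # output (target) vectors

# Mined "commonsense" pairs whose embeddings should end up close together
# (an assumed form of the relational term).
related_pairs = [("dog", "animal"), ("dog", "bark"), ("cat", "meow")]

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def cbow_sgd_step(context_words, target_word, lr=0.05):
    """One CBOW-style step with negative sampling: predict target from averaged context."""
    ctx_ids = [word2id[w] for w in context_words]
    tgt = word2id[target_word]
    h = W[ctx_ids].mean(axis=0)                      # averaged context vector
    neg = rng.integers(0, len(vocab), size=2)        # random negative targets
    for j, label in [(tgt, 1.0)] + [(int(n), 0.0) for n in neg]:
        grad = sigmoid(h @ C[j]) - label             # gradient of logistic loss
        c_old = C[j].copy()
        C[j] -= lr * grad * h
        for i in ctx_ids:
            W[i] -= lr * grad * c_old / len(ctx_ids)

def relation_sgd_step(w1, w2, lr=0.05):
    """One step of the (assumed) relational term: pull related vectors together."""
    i, j = word2id[w1], word2id[w2]
    diff = W[i] - W[j]
    W[i] -= lr * diff
    W[j] += lr * diff

# Joint training: interleave contextual (CBOW) updates with updates from the
# mined relationships, so both signals shape the same embedding matrix.
corpus_windows = [(["the", "dog"], "bark"), (["the", "cat"], "meow")]
for epoch in range(200):
    for ctx, tgt in corpus_windows:
        cbow_sgd_step(ctx, tgt)
    for w1, w2 in related_pairs:
        relation_sgd_step(w1, w2)

d, a = W[word2id["dog"]], W[word2id["animal"]]
print("dog~animal cosine:", d @ a / (np.linalg.norm(d) * np.linalg.norm(a)))
```

In this sketch the two loss terms share the same parameters and are simply alternated within each epoch; weighting or scheduling the relational updates against the contextual ones would be a natural design choice, but how the paper balances them is not stated in the abstract.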
