International Congress on Sound and Vibration

RECURRENT NEURAL NETWORK LANGUAGE MODEL WITH VECTOR-SPACE WORD REPRESENTATIONS



Abstract

The recurrent neural network language model (RNNLM) has been shown to be more competitive than other neural network language models. However, the input layer of most current RNNLMs uses only a single feature: the index of the word, encoded as a unit (one-hot) vector. Previous studies showed that language models incorporating additional linguistic information achieve better performance. In this study, vector-space word representations (word vectors), which capture syntactic and semantic regularities of language, are used as additional input features to enhance the RNNLM. Experimental results show that the word-vector features are effective in improving RNNLM performance: evaluated on a Mandarin test set, a 10% relative reduction in perplexity and a 0.5-point absolute reduction in character error rate were obtained compared to the conventional RNNLM.
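To illustrate the idea in the abstract, the sketch below shows one plausible way to augment an RNNLM input layer: the conventional one-hot word index is concatenated with a pretrained word vector before the recurrent update. This is a minimal NumPy sketch under assumed dimensions and randomly initialized weights (the paper does not specify sizes, and the pretrained vectors here are stand-ins for, e.g., word2vec embeddings), not the authors' implementation.

```python
import numpy as np

# Hypothetical dimensions (assumptions, not from the paper).
V, D_WV, H = 10, 5, 8          # vocabulary size, word-vector dim, hidden size
rng = np.random.default_rng(0)

# Stand-in for pretrained vector-space word representations.
word_vectors = rng.normal(size=(V, D_WV))

# RNNLM weights: the input-to-hidden matrix now covers the
# one-hot index PLUS the word-vector feature block.
W_xh = rng.normal(scale=0.1, size=(V + D_WV, H))  # input -> hidden
W_hh = rng.normal(scale=0.1, size=(H, H))         # hidden -> hidden (recurrence)
W_hy = rng.normal(scale=0.1, size=(H, V))         # hidden -> output

def step(word_idx, h_prev):
    """One RNNLM step with the augmented input layer."""
    one_hot = np.zeros(V)
    one_hot[word_idx] = 1.0
    # Augmented input: one-hot index concatenated with its word vector.
    x = np.concatenate([one_hot, word_vectors[word_idx]])
    h = np.tanh(x @ W_xh + h_prev @ W_hh)          # recurrent hidden state
    logits = h @ W_hy
    p = np.exp(logits - logits.max())              # stable softmax
    return h, p / p.sum()                          # next-word distribution

h = np.zeros(H)
for w in [1, 4, 2]:                                # toy word-index sequence
    h, probs = step(w, h)
```

A conventional RNNLM would use only the one-hot block as `x`; the only structural change here is the wider input and the corresponding extra rows of `W_xh`.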
